Compare commits

..

160 Commits

Author SHA1 Message Date
Scott Wilson
b5801af9a8 improvements: address feedback 2025-06-30 18:32:36 -07:00
Scott Wilson
20366a8c07 improvement: address feedback 2025-06-30 18:09:50 -07:00
Scott Wilson
447e28511c improvement: update stale/conflict text 2025-06-30 16:44:29 -07:00
Scott Wilson
650ed656e3 improvement: color/layout styling adjustments to change request page 2025-06-30 16:30:37 -07:00
Maidul Islam
3871fa552c Merge pull request #3888 from Infisical/revert-3885-misc/add-indices-for-referencing-columns-in-identity-access-token
Revert "misc: add indices for referencing columns in identity access token"
2025-06-30 17:27:31 -04:00
Sheen
9c72ee7f10 Revert "misc: add indices for referencing columns in identity access token" 2025-07-01 05:23:51 +08:00
Maidul Islam
22e8617661 Merge pull request #3885 from Infisical/misc/add-indices-for-referencing-columns-in-identity-access-token
misc: add indices for referencing columns in identity access token
2025-06-30 17:01:20 -04:00
Sheen Capadngan
2f29a513cc misc: make index creation concurrently 2025-07-01 03:36:55 +08:00
x032205
d3833c33b3 Merge pull request #3878 from Infisical/fix-approval-policy-bypassing
Fix bypassing approval policies
2025-06-30 13:37:28 -04:00
Sheen Capadngan
978a3e5828 misc: add indices for referencing columns in identity access token 2025-07-01 01:25:11 +08:00
Scott Wilson
27bf91e58f Merge pull request #3873 from Infisical/org-access-control-improvements
improvement(org-access-control): Standardize and improve org access control UI
2025-06-30 09:54:42 -07:00
Scott Wilson
f2c3c76c60 improvement: address feedback on remove rule policy edit 2025-06-30 09:21:00 -07:00
Scott Wilson
85023916e4 improvement: address feedback 2025-06-30 09:12:47 -07:00
Akhil Mohan
02afd6a8e7 Merge pull request #3882 from Infisical/feat/fix-access-token-ips
feat: resolved inefficient join for ip restriction in access token
2025-06-30 21:22:28 +05:30
=
929eac4350 feat: resolved inefficient join for ip restriction in access token 2025-06-30 20:13:26 +05:30
Vlad Matsiiako
c6074dd69a Merge pull request #3881 from Infisical/docs-update
update spend policy
2025-06-29 18:10:54 -07:00
Vladyslav Matsiiako
a9b26755ba update spend policy 2025-06-29 17:43:05 -07:00
Vlad Matsiiako
033e5d3f81 Merge pull request #3880 from Infisical/docs-update
update logos in docs
2025-06-28 16:38:05 -07:00
Vladyslav Matsiiako
90634e1913 update logos in docs 2025-06-28 16:26:58 -07:00
x032205
58b61a861a Fix bypassing approval policies 2025-06-28 04:17:09 -04:00
x032205
3c8ec7d7fb Merge pull request #3869 from Infisical/sequence-approval-policy-ui-additions
improvement(access-policies): Revamp approval sequence table display and access request modal
2025-06-28 04:07:41 -04:00
x032205
26a59286c5 Merge pull request #3877 from Infisical/remove-datadog-logs
Remove debug logs for DataDog stream
2025-06-28 03:45:14 -04:00
x032205
392792bb1e Remove debug logs for DataDog stream 2025-06-28 03:37:32 -04:00
Scott Wilson
48f40ff938 improvement: address feedback 2025-06-27 21:00:48 -07:00
Maidul Islam
969896e431 Merge pull request #3874 from Infisical/remove-certauth-join
Remove cert auth left join
2025-06-27 20:41:58 -04:00
Maidul Islam
fd85da5739 set trusted ip to empty 2025-06-27 20:36:32 -04:00
Maidul Islam
2caf6ff94b remove cert auth left join 2025-06-27 20:21:28 -04:00
Scott Wilson
ed7d709a70 improvement: standardize and improve org access control 2025-06-27 15:15:12 -07:00
Sheen
aff97374a9 Merge pull request #3868 from Infisical/misc/add-mention-of-service-usage-api-for-gcp
misc: add mention of service usage API for GCP
2025-06-28 04:26:21 +08:00
Scott Wilson
e8e90585ca Merge pull request #3871 from Infisical/project-role-type-col
improvement(project-roles): Add type col to project roles table and default sort
2025-06-27 11:42:47 -07:00
Scott Wilson
abd9dbf714 improvement: add type col to project roles table and default sort 2025-06-27 11:34:54 -07:00
Sheen
89aed3640b Merge pull request #3852 from akhilmhdh/feat/tls-identity-auth
feat: TLS cert identity auth
2025-06-28 02:29:25 +08:00
carlosmonastyrski
5513ff7631 Merge pull request #3866 from Infisical/feat/posthogEventBatch
feat(telemetry): Add aggregated events and groups to posthog
2025-06-27 14:42:55 -03:00
Sheen Capadngan
9fb7676739 misc: reordered doc for mi auth 2025-06-28 01:35:46 +08:00
Sheen Capadngan
6ac734d6c4 removed unnecessary changes 2025-06-28 01:32:53 +08:00
carlosmonastyrski
8044999785 feat(telemetry): increase even redis key exp to 15 mins 2025-06-27 14:31:54 -03:00
carlosmonastyrski
be51e4372d feat(telemetry): addressed PR suggestions 2025-06-27 14:30:31 -03:00
Sheen Capadngan
460b545925 Merge branch 'feat/tls-identity-auth' of https://github.com/akhilmhdh/infisical into HEAD 2025-06-28 01:29:49 +08:00
Sheen Capadngan
2f26c1930b misc: doc updates 2025-06-28 01:26:24 +08:00
Scott Wilson
953cc3a850 improvements: revise approval sequence table display and access request modal 2025-06-27 09:30:11 -07:00
Sheen Capadngan
fc9ae05f89 misc: updated TLS acronym 2025-06-28 00:21:08 +08:00
Sheen Capadngan
de22a3c56b misc: updated casing of acronym 2025-06-28 00:17:42 +08:00
Sheen
7c4baa6fd4 misc: added image for service usage API 2025-06-27 13:19:14 +00:00
Sheen Capadngan
f285648c95 misc: add mention of service usage API for GCP 2025-06-27 21:10:02 +08:00
carlosmonastyrski
0f04890d8f feat(telemetry): addressed PR suggestions 2025-06-26 21:18:07 -03:00
carlosmonastyrski
61274243e2 feat(telemetry): add batch events and groups logic 2025-06-26 20:58:01 -03:00
Scott Wilson
9366428091 Merge pull request #3865 from Infisical/remove-manual-styled-css-on-checkboxes
fix(checkbox): Remove manual css overrides of checkbox checked state
2025-06-26 15:38:05 -07:00
Scott Wilson
62482852aa fix: remove manual css overrides of checkbox checked state 2025-06-26 15:33:27 -07:00
x032205
cc02c00b61 Merge pull request #3864 from Infisical/update-aws-param-store-docs
Clarify relationship between path and key schema for AWS parameter store
2025-06-26 18:19:06 -04:00
x032205
2e256e4282 Tooltip 2025-06-26 18:14:48 -04:00
Scott Wilson
1b4bae6a84 Merge pull request #3863 from Infisical/remove-secret-scanning-v1-backend
chore(secret-scanning-v1): remove secret scanning v1 queue and webhook endpoint
2025-06-26 14:51:23 -07:00
Scott Wilson
1f0bcae0fc Merge pull request #3860 from Infisical/secret-sync-selection-improvements
improvement(secret-sync/app-connection): Add search/pagination to secret sync and app connection selection modals
2025-06-26 14:50:44 -07:00
x032205
dcd21883d1 Clarify relationship between path and key schema for AWS parameter store
docs
2025-06-26 17:02:21 -04:00
Scott Wilson
d7913a75c2 chore: remove secret scanning v1 queue and webhook endpoint 2025-06-26 11:32:45 -07:00
Scott Wilson
205442bff5 Merge pull request #3859 from Infisical/overview-ui-improvements
improvement(secret-overview): Add collapsed environment view to secret overview page
2025-06-26 09:24:33 -07:00
Scott Wilson
8ab51aba12 improvement: add search/pagination app connection select 2025-06-26 09:21:35 -07:00
Scott Wilson
e8d19eb823 improvement: disable tooltip hover content for env name tooltip 2025-06-26 09:12:11 -07:00
Scott Wilson
3d1f054b87 improvement: add pagination/search to secret sync selection 2025-06-26 08:13:57 -07:00
Scott Wilson
5d30215ea7 improvement: increase env tooltip max width and adjust alignment 2025-06-26 07:56:47 -07:00
Scott Wilson
29fedfdde5 Merge pull request #3850 from Infisical/policy-edit-revisions
improvement(project-policies): Revamp edit role page and access tree
2025-06-26 07:46:35 -07:00
Scott Wilson
b5317d1d75 fix: add ability to remove non-conditional rules 2025-06-26 07:37:30 -07:00
Scott Wilson
86c145301e improvement: add collapsed environment view to secret overview page and minor ui adjustments 2025-06-25 16:49:34 -07:00
carlosmonastyrski
6446311b6d Merge pull request #3835 from Infisical/feat/gitlabSecretSync
feat(secret-sync): Add gitlab secret sync
2025-06-25 17:53:12 -03:00
Daniel Hougaard
3e80f1907c Merge pull request #3857 from Infisical/daniel/fix-dotnet-docs
docs: fix redirect for .NET SDK
2025-06-25 23:18:14 +04:00
Daniel Hougaard
79e62eec25 docs: fix redirect for .NET SDK 2025-06-25 23:11:11 +04:00
Daniel Hougaard
c41730c5fb Merge pull request #3856 from Infisical/daniel/fix-docs
fix(docs): sdk and changelog tab not loading
2025-06-25 22:34:09 +04:00
Daniel Hougaard
aac63d3097 fix(docs): sdk and changelog tab not working 2025-06-25 22:32:08 +04:00
carlosmonastyrski
f0b9d3c816 feat(secret-sync): improve hide secrets tooltip message 2025-06-25 14:10:28 -03:00
carlosmonastyrski
ea393d144a feat(secret-sync): minor change on docs 2025-06-25 13:57:07 -03:00
carlosmonastyrski
c4c0f86598 feat(secret-sync): improve update logic and add warning on docs for gitlab limitation on hidden variables 2025-06-25 13:51:38 -03:00
x032205
1f7617d132 Merge pull request #3851 from Infisical/ENG-3013
Allow undefined value for tags to prevent unwanted overrides
2025-06-25 12:45:43 -04:00
carlosmonastyrski
c95680b95d feat(secret-sync): type fix 2025-06-25 13:33:43 -03:00
x032205
18f1f93b5f Review fixes 2025-06-25 12:29:23 -04:00
carlosmonastyrski
70ea761375 feat(secret-sync): fix update masked_and_hidden field to not be sent unless it's true 2025-06-25 13:17:41 -03:00
Scott Wilson
5b4790ee78 improvements: truncate environment selection and only show visualize access when expanded 2025-06-25 09:09:08 -07:00
x032205
5ab2a6bb5d Feedback 2025-06-25 11:56:11 -04:00
Scott Wilson
dcac85fe6c Merge pull request #3847 from Infisical/share-your-own-secret-link-fix
fix(secret-sharing): Support self-hosted for "share your own secret" link
2025-06-25 08:31:13 -07:00
Maidul Islam
2f07471404 Merge pull request #3853 from akhilmhdh/feat/copy-token
feat: added copy token button
2025-06-25 10:55:07 -04:00
Maidul Islam
137fd5ef07 added minor text updates 2025-06-25 10:50:16 -04:00
=
883c7835a1 feat: added copy token button 2025-06-25 15:28:58 +05:30
=
e33f34ceb4 fix: corrected the doc key 2025-06-25 14:46:13 +05:30
=
af5805a5ca feat: resolved incorrect invalidation 2025-06-25 14:46:13 +05:30
Akhil Mohan
bcf1c49a1b Update docs/documentation/platform/identities/tls-cert-auth.mdx
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2025-06-25 14:45:14 +05:30
Akhil Mohan
84fedf8eda Update docs/documentation/platform/identities/tls-cert-auth.mdx
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2025-06-25 14:44:45 +05:30
Akhil Mohan
97755981eb Update docs/documentation/platform/identities/tls-cert-auth.mdx
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2025-06-25 14:43:01 +05:30
Akhil Mohan
8291663802 Update frontend/src/pages/organization/AccessManagementPage/components/OrgIdentityTab/components/IdentitySection/IdentityTlsCertAuthForm.tsx
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2025-06-25 14:42:24 +05:30
Akhil Mohan
d9aed45504 Update frontend/src/pages/organization/AccessManagementPage/components/OrgIdentityTab/components/IdentitySection/IdentityTlsCertAuthForm.tsx
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2025-06-25 14:42:11 +05:30
=
8ada11edf3 feat: docs for tls cert auth 2025-06-25 14:27:04 +05:30
=
4bd62aa462 feat: updated frontend to have the tls cert auth login 2025-06-25 14:26:55 +05:30
carlosmonastyrski
0366e58a5b Type fix 2025-06-25 00:24:24 -03:00
x032205
9f6dca23db Greptile reviews 2025-06-24 23:19:42 -04:00
carlosmonastyrski
18e733c71f feat(secret-sync): minor fixes 2025-06-25 00:16:44 -03:00
x032205
f0a95808e7 Allow undefined value for tags to prevent unwanted overrides 2025-06-24 23:13:53 -04:00
x032205
90a0d0f744 Merge pull request #3848 from Infisical/improve-audit-log-streams
improve audit log streams: add backend logs + DD source
2025-06-24 22:18:04 -04:00
x032205
7f9c9be2c8 review fix 2025-06-24 22:00:45 -04:00
carlosmonastyrski
070982081c Merge remote-tracking branch 'origin/main' into feat/gitlabSecretSync 2025-06-24 22:42:28 -03:00
carlosmonastyrski
f462c3f85d feat(secret-sync): minor fixes 2025-06-24 21:38:33 -03:00
Scott Wilson
8683693103 improvement: address greptile feedback 2025-06-24 15:35:42 -07:00
Scott Wilson
737fffcceb improvement: address greptile feedback 2025-06-24 15:35:08 -07:00
Scott Wilson
ffac24ce75 improvement: revise edit role page and access tree 2025-06-24 15:23:27 -07:00
carlosmonastyrski
c505c5877f feat(secret-sync): updated docs 2025-06-24 18:11:18 -03:00
Maidul Islam
b59fa14bb6 Merge pull request #3818 from Infisical/feat/cli-bootstrap-create-k8-secret
feat: added auto-bootstrap support to helm
2025-06-24 17:03:13 -04:00
carlosmonastyrski
d4bf8a33dc feat(secret-sync): rework GitLab secret-sync to add group variables 2025-06-24 18:01:32 -03:00
Sheen
0eb36d7e35 misc: final doc changes 2025-06-24 20:56:06 +00:00
Sheen Capadngan
ae2da0066a misc: add helm chart auto bootstrap to methods 2025-06-25 04:40:07 +08:00
x032205
6566393e21 Review fixes 2025-06-24 14:39:46 -04:00
Sheen Capadngan
1d7da56b40 misc: used kubernetes client 2025-06-25 02:38:51 +08:00
x032205
af245b1f16 Add "service: audit-logs" entry for DataDog 2025-06-24 14:22:26 -04:00
Sheen
3d2465ae41 Merge pull request #3825 from Infisical/feat/add-cloudflare-app-connection-and-sync
feat: added cloudflare app connection and secret sync
2025-06-25 00:44:58 +08:00
x032205
c17df7e951 Improve URL detection 2025-06-24 12:44:16 -04:00
x032205
4d4953e95a improve audit log streams: add backend logs + DD source 2025-06-24 12:35:49 -04:00
carlosmonastyrski
f4f34802bc Merge pull request #3816 from Infisical/fix/addProjectSlugToSecretsV3
Add projectSlug parameter on secrets v3 endpoints
2025-06-24 13:28:23 -03:00
Daniel Hougaard
59cc857aef fix: further improve inconsistencies 2025-06-24 19:37:32 +04:00
Daniel Hougaard
a6713b2f76 Merge pull request #3846 from Infisical/daniel/multiple-folders
fix(folders): duplicate folders
2025-06-24 19:04:26 +04:00
Daniel Hougaard
3c9a7c77ff chore: re-add comment 2025-06-24 18:58:03 +04:00
Daniel Hougaard
f1bfea61d0 fix: replace keystore lock with postgres lock 2025-06-24 18:54:18 +04:00
Sheen
144ad2f25f misc: added image for generated token 2025-06-24 14:51:11 +00:00
carlosmonastyrski
43e0d400f9 feat(secret-sync): add Gitlab PR comments suggestions 2025-06-24 10:05:46 -03:00
=
b80b77ec36 feat: completed backend changes for tls auth 2025-06-24 16:46:46 +05:30
Sheen Capadngan
02a2309953 misc: added note for bootstrap output flag 2025-06-24 18:26:17 +08:00
Sheen Capadngan
f1587d8375 misc: addressed comments 2025-06-24 18:18:07 +08:00
carlosmonastyrski
42aaddccd5 Lint fix 2025-06-23 23:13:29 -03:00
carlosmonastyrski
39abeaaab5 Small fix on workspaceId variable definition on secret-router 2025-06-23 23:05:12 -03:00
Scott Wilson
198e74cd88 fix: include nooppener in window.open 2025-06-23 18:05:48 -07:00
Scott Wilson
8ed0a1de84 fix: correct window open for share your own secret link to handle self-hosted 2025-06-23 18:01:38 -07:00
Daniel Hougaard
b336c0c3d6 Update secret-folder-service.ts 2025-06-24 03:33:45 +04:00
Daniel Hougaard
305f2d79de remove unused path 2025-06-24 03:32:18 +04:00
Daniel Hougaard
d4a6faa92c fix(folders): multiple folders being created 2025-06-24 03:24:47 +04:00
carlosmonastyrski
4800e9c36e Address PR comments 2025-06-23 17:45:21 -03:00
Sheen Capadngan
8024d7448f misc: updated docs json 2025-06-23 22:18:50 +08:00
Sheen Capadngan
c65b79e00d Merge remote-tracking branch 'origin/main' into feat/add-cloudflare-app-connection-and-sync 2025-06-23 22:16:09 +08:00
carlosmonastyrski
c305ddd463 feat(secret-sync): Gitlab PR suggestions 2025-06-23 10:52:59 -03:00
carlosmonastyrski
27cb686216 feat(secret-sync): Fix frontend file names 2025-06-20 21:26:12 -03:00
carlosmonastyrski
e201d77a8f feat(secret-sync): Add gitlab secret sync 2025-06-20 21:13:14 -03:00
Sheen Capadngan
a43d4fd430 addressed greptie 2025-06-20 21:02:09 +08:00
Sheen Capadngan
80b6fb677c misc: addressed url issue 2025-06-20 20:52:00 +08:00
Sheen Capadngan
5bc8acd0a7 doc: added api references 2025-06-20 20:46:31 +08:00
Sheen
2575845df7 misc: added images to secret sync doc 2025-06-20 12:36:39 +00:00
Sheen Capadngan
641d58c157 misc: addressed sync overflow issue 2025-06-20 20:23:03 +08:00
Sheen Capadngan
430f5d516c misc: text updates to secret sync 2025-06-20 20:20:10 +08:00
Sheen Capadngan
5cec194e74 misc: initial cloudflare pages sync doc 2025-06-20 20:17:02 +08:00
Sheen Capadngan
5ede4f6f4b misc: added placeholder for account ID 2025-06-20 20:08:07 +08:00
Sheen
4d3581f835 doc: added assets for app connection 2025-06-20 12:07:21 +00:00
Sheen Capadngan
665f7fa5c3 misc: updated account ID 2025-06-20 19:50:03 +08:00
Sheen Capadngan
9f4b1d2565 image path updates 2025-06-20 19:42:22 +08:00
Sheen Capadngan
59e2a20180 misc: addressed minor issues 2025-06-20 19:39:33 +08:00
Sheen Capadngan
4fee5a5839 doc: added initial app connection doc 2025-06-20 19:36:27 +08:00
Sheen Capadngan
61e245ea58 Merge remote-tracking branch 'origin/main' into feat/add-cloudflare-app-connection-and-sync 2025-06-20 19:24:45 +08:00
Sheen Capadngan
57e97a146b feat: cloudflare pages secret sync 2025-06-20 03:43:36 +08:00
Sheen Capadngan
d2c7ed62d0 feat: added cloudflare app connection 2025-06-20 01:16:56 +08:00
Sheen Capadngan
470d7cca6a misc: updated chart version 2025-06-19 20:57:42 +08:00
Sheen Capadngan
8e3918ada3 misc: addressed tag issue for CLI 2025-06-19 20:20:53 +08:00
Sheen Capadngan
bd54054bc3 misc: enabled auto bootstrap for check 2025-06-19 03:53:57 +08:00
Sheen Capadngan
cfe51d4a52 misc: improved template dcs 2025-06-19 03:50:56 +08:00
Sheen Capadngan
9cdd7380df misc: greptie 2025-06-19 02:30:26 +08:00
Sheen Capadngan
07d491acd1 misc: corrected template doc 2025-06-19 02:26:13 +08:00
Sheen Capadngan
3276853427 misc: added helm support for auto bootstrap 2025-06-19 02:12:08 +08:00
carlosmonastyrski
a8eb72a8c5 Fix type issue 2025-06-18 14:48:29 -03:00
Sheen Capadngan
2b8220a71b feat: added support for outputting bootstrap credentials to k8 secret 2025-06-19 01:43:47 +08:00
carlosmonastyrski
f76d3e2a14 Add projectSlug parameter on secrets v3 endpoints 2025-06-18 14:35:49 -03:00
322 changed files with 12570 additions and 3886 deletions

View File

@@ -107,6 +107,10 @@ INF_APP_CONNECTION_GITHUB_APP_PRIVATE_KEY=
INF_APP_CONNECTION_GITHUB_APP_SLUG=
INF_APP_CONNECTION_GITHUB_APP_ID=
#gitlab app connection
INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID=
INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET=
#github radar app connection
INF_APP_CONNECTION_GITHUB_RADAR_APP_CLIENT_ID=
INF_APP_CONNECTION_GITHUB_RADAR_APP_CLIENT_SECRET=

View File

@@ -51,11 +51,18 @@ jobs:
--from-literal=ENCRYPTION_KEY=6c1fe4e407b8911c104518103505b218 \
--from-literal=SITE_URL=http://localhost:8080
- name: Create bootstrap secret
run: |
kubectl create secret generic infisical-bootstrap-credentials \
--namespace infisical-standalone-postgres \
--from-literal=INFISICAL_ADMIN_EMAIL=admin@example.com \
--from-literal=INFISICAL_ADMIN_PASSWORD=admin-password
- name: Run chart-testing (install)
run: |
ct install \
--config ct.yaml \
--charts helm-charts/infisical-standalone-postgres \
--helm-extra-args="--timeout=300s" \
--helm-extra-set-args="--set ingress.nginx.enabled=false --set infisical.autoDatabaseSchemaMigration=false --set infisical.replicaCount=1 --set infisical.image.tag=v0.132.2-postgres" \
--helm-extra-set-args="--set ingress.nginx.enabled=false --set infisical.autoDatabaseSchemaMigration=false --set infisical.replicaCount=1 --set infisical.image.tag=v0.132.2-postgres --set infisical.autoBootstrap.enabled=true" \
--namespace infisical-standalone-postgres

View File

@@ -45,3 +45,4 @@ cli/detect/config/gitleaks.toml:gcp-api-key:582
.github/workflows/helm-release-infisical-core.yml:generic-api-key:48
.github/workflows/helm-release-infisical-core.yml:generic-api-key:47
backend/src/services/smtp/smtp-service.ts:generic-api-key:79
frontend/src/components/secret-syncs/forms/SecretSyncDestinationFields/CloudflarePagesSyncFields.tsx:cloudflare-api-key:7

View File

@@ -8,6 +8,9 @@ import { Lock } from "@app/lib/red-lock";
export const mockKeyStore = (): TKeyStoreFactory => {
const store: Record<string, string | number | Buffer> = {};
const getRegex = (pattern: string) =>
new RE2(`^${pattern.replace(/[-[\]/{}()+?.\\^$|]/g, "\\$&").replace(/\*/g, ".*")}$`);
return {
setItem: async (key, value) => {
store[key] = value;
@@ -23,7 +26,7 @@ export const mockKeyStore = (): TKeyStoreFactory => {
return 1;
},
deleteItems: async ({ pattern, batchSize = 500, delay = 1500, jitter = 200 }) => {
const regex = new RE2(`^${pattern.replace(/[-[\]/{}()+?.\\^$|]/g, "\\$&").replace(/\*/g, ".*")}$`);
const regex = getRegex(pattern);
let totalDeleted = 0;
const keys = Object.keys(store);
@@ -53,6 +56,27 @@ export const mockKeyStore = (): TKeyStoreFactory => {
incrementBy: async () => {
return 1;
},
getItems: async (keys) => {
const values = keys.map((key) => {
const value = store[key];
if (typeof value === "string") {
return value;
}
return null;
});
return values;
},
getKeysByPattern: async (pattern) => {
const regex = getRegex(pattern);
const keys = Object.keys(store);
return keys.filter((key) => regex.test(key));
},
deleteItemsByKeyIn: async (keys) => {
for (const key of keys) {
delete store[key];
}
return keys.length;
},
acquireLock: () => {
return Promise.resolve({
release: () => {}
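The getRegex helper shared by the keystore mocks turns a wildcard key pattern into an anchored RE2: every regex metacharacter except * is escaped and * becomes .*. A standalone sketch of that conversion, assuming the re2 package and made-up key names:

import RE2 from "re2";

// Same transformation as the getRegex helper above: escape regex
// metacharacters, treat "*" as a wildcard, and anchor the result.
const getRegex = (pattern: string) =>
  new RE2(`^${pattern.replace(/[-[\]/{}()+?.\\^$|]/g, "\\$&").replace(/\*/g, ".*")}$`);

// Hypothetical keys, for illustration only.
const keys = ["project:123:folder:a", "project:123:folder:b", "project:456:folder:a"];

console.log(keys.filter((key) => getRegex("project:123:*").test(key)));
// -> ["project:123:folder:a", "project:123:folder:b"]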

View File

@@ -30,6 +30,7 @@
"@fastify/static": "^7.0.4",
"@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^2.1.0",
"@gitbeaker/rest": "^42.5.0",
"@google-cloud/kms": "^4.5.0",
"@infisical/quic": "^1.0.8",
"@node-saml/passport-saml": "^5.0.1",
@@ -7807,6 +7808,48 @@
"p-limit": "^3.1.0"
}
},
"node_modules/@gitbeaker/core": {
"version": "42.5.0",
"resolved": "https://registry.npmjs.org/@gitbeaker/core/-/core-42.5.0.tgz",
"integrity": "sha512-rMWpOPaZi1iLiifnOIoVO57p2EmQQdfIwP4txqNyMvG4WjYP5Ez0U7jRD9Nra41x6K5kTPBZkuQcAdxVWRJcEQ==",
"license": "MIT",
"dependencies": {
"@gitbeaker/requester-utils": "^42.5.0",
"qs": "^6.12.2",
"xcase": "^2.0.1"
},
"engines": {
"node": ">=18.20.0"
}
},
"node_modules/@gitbeaker/requester-utils": {
"version": "42.5.0",
"resolved": "https://registry.npmjs.org/@gitbeaker/requester-utils/-/requester-utils-42.5.0.tgz",
"integrity": "sha512-HLdLS9LPBMVQumvroQg/4qkphLDtwDB+ygEsrD2u4oYCMUtXV4V1xaVqU4yTXjbTJ5sItOtdB43vYRkBcgueBw==",
"license": "MIT",
"dependencies": {
"picomatch-browser": "^2.2.6",
"qs": "^6.12.2",
"rate-limiter-flexible": "^4.0.1",
"xcase": "^2.0.1"
},
"engines": {
"node": ">=18.20.0"
}
},
"node_modules/@gitbeaker/rest": {
"version": "42.5.0",
"resolved": "https://registry.npmjs.org/@gitbeaker/rest/-/rest-42.5.0.tgz",
"integrity": "sha512-oC5cM6jS7aFOp0luTw5mWSRuMgdxwHRLZQ/aWkI+ETMfsprR/HyxsXfljlMY/XJ/fRxTbRJiodR5Axf66WjO3w==",
"license": "MIT",
"dependencies": {
"@gitbeaker/core": "^42.5.0",
"@gitbeaker/requester-utils": "^42.5.0"
},
"engines": {
"node": ">=18.20.0"
}
},
"node_modules/@google-cloud/kms": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/@google-cloud/kms/-/kms-4.5.0.tgz",
@@ -24628,6 +24671,18 @@
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/picomatch-browser": {
"version": "2.2.6",
"resolved": "https://registry.npmjs.org/picomatch-browser/-/picomatch-browser-2.2.6.tgz",
"integrity": "sha512-0ypsOQt9D4e3hziV8O4elD9uN0z/jtUEfxVRtNaAAtXIyUx9m/SzlO020i8YNL2aL/E6blOvvHQcin6HZlFy/w==",
"license": "MIT",
"engines": {
"node": ">=8.6"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/pify": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/pify/-/pify-4.0.1.tgz",
@@ -25562,6 +25617,12 @@
"node": ">= 0.6"
}
},
"node_modules/rate-limiter-flexible": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/rate-limiter-flexible/-/rate-limiter-flexible-4.0.1.tgz",
"integrity": "sha512-2/dGHpDFpeA0+755oUkW+EKyklqLS9lu0go9pDsbhqQjZcxfRyJ6LA4JI0+HAdZ2bemD/oOjUeZQB2lCZqXQfQ==",
"license": "ISC"
},
"node_modules/raw-body": {
"version": "2.5.2",
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.5.2.tgz",
@@ -31039,6 +31100,12 @@
}
}
},
"node_modules/xcase": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/xcase/-/xcase-2.0.1.tgz",
"integrity": "sha512-UmFXIPU+9Eg3E9m/728Bii0lAIuoc+6nbrNUKaRPJOFp91ih44qqGlWtxMB6kXFrRD6po+86ksHM5XHCfk6iPw==",
"license": "MIT"
},
"node_modules/xml-crypto": {
"version": "6.0.1",
"resolved": "https://registry.npmjs.org/xml-crypto/-/xml-crypto-6.0.1.tgz",

View File

@@ -149,6 +149,7 @@
"@fastify/static": "^7.0.4",
"@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^2.1.0",
"@gitbeaker/rest": "^42.5.0",
"@google-cloud/kms": "^4.5.0",
"@infisical/quic": "^1.0.8",
"@node-saml/passport-saml": "^5.0.1",

View File

@@ -74,6 +74,7 @@ import { TAllowedFields } from "@app/services/identity-ldap-auth/identity-ldap-a
import { TIdentityOciAuthServiceFactory } from "@app/services/identity-oci-auth/identity-oci-auth-service";
import { TIdentityOidcAuthServiceFactory } from "@app/services/identity-oidc-auth/identity-oidc-auth-service";
import { TIdentityProjectServiceFactory } from "@app/services/identity-project/identity-project-service";
import { TIdentityTlsCertAuthServiceFactory } from "@app/services/identity-tls-cert-auth/identity-tls-cert-auth-types";
import { TIdentityTokenAuthServiceFactory } from "@app/services/identity-token-auth/identity-token-auth-service";
import { TIdentityUaServiceFactory } from "@app/services/identity-ua/identity-ua-service";
import { TIntegrationServiceFactory } from "@app/services/integration/integration-service";
@@ -218,6 +219,7 @@ declare module "fastify" {
identityKubernetesAuth: TIdentityKubernetesAuthServiceFactory;
identityGcpAuth: TIdentityGcpAuthServiceFactory;
identityAliCloudAuth: TIdentityAliCloudAuthServiceFactory;
identityTlsCertAuth: TIdentityTlsCertAuthServiceFactory;
identityAwsAuth: TIdentityAwsAuthServiceFactory;
identityAzureAuth: TIdentityAzureAuthServiceFactory;
identityOciAuth: TIdentityOciAuthServiceFactory;

View File

@@ -164,6 +164,9 @@ import {
TIdentityProjectMemberships,
TIdentityProjectMembershipsInsert,
TIdentityProjectMembershipsUpdate,
TIdentityTlsCertAuths,
TIdentityTlsCertAuthsInsert,
TIdentityTlsCertAuthsUpdate,
TIdentityTokenAuths,
TIdentityTokenAuthsInsert,
TIdentityTokenAuthsUpdate,
@@ -496,11 +499,6 @@ import {
TMicrosoftTeamsIntegrationsInsert,
TMicrosoftTeamsIntegrationsUpdate
} from "@app/db/schemas/microsoft-teams-integrations";
import {
TPosthogAggregatedEvents,
TPosthogAggregatedEventsInsert,
TPosthogAggregatedEventsUpdate
} from "@app/db/schemas/posthog-aggregated-events";
import {
TProjectMicrosoftTeamsConfigs,
TProjectMicrosoftTeamsConfigsInsert,
@@ -799,6 +797,11 @@ declare module "knex/types/tables" {
TIdentityAlicloudAuthsInsert,
TIdentityAlicloudAuthsUpdate
>;
[TableName.IdentityTlsCertAuth]: KnexOriginal.CompositeTableType<
TIdentityTlsCertAuths,
TIdentityTlsCertAuthsInsert,
TIdentityTlsCertAuthsUpdate
>;
[TableName.IdentityAwsAuth]: KnexOriginal.CompositeTableType<
TIdentityAwsAuths,
TIdentityAwsAuthsInsert,
@@ -1208,10 +1211,5 @@ declare module "knex/types/tables" {
TSecretScanningConfigsInsert,
TSecretScanningConfigsUpdate
>;
[TableName.PosthogAggregatedEvents]: KnexOriginal.CompositeTableType<
TPosthogAggregatedEvents,
TPosthogAggregatedEventsInsert,
TPosthogAggregatedEventsUpdate
>;
}
}

View File

@@ -0,0 +1,28 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.IdentityTlsCertAuth))) {
await knex.schema.createTable(TableName.IdentityTlsCertAuth, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.bigInteger("accessTokenTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenMaxTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenNumUsesLimit").defaultTo(0).notNullable();
t.jsonb("accessTokenTrustedIps").notNullable();
t.timestamps(true, true, true);
t.uuid("identityId").notNullable().unique();
t.foreign("identityId").references("id").inTable(TableName.Identity).onDelete("CASCADE");
t.string("allowedCommonNames").nullable();
t.binary("encryptedCaCertificate").notNullable();
});
}
await createOnUpdateTrigger(knex, TableName.IdentityTlsCertAuth);
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.IdentityTlsCertAuth);
await dropOnUpdateTrigger(knex, TableName.IdentityTlsCertAuth);
}

View File

@@ -1,26 +0,0 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.PosthogAggregatedEvents))) {
await knex.schema.createTable(TableName.PosthogAggregatedEvents, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.string("distinctId").notNullable();
t.string("event").notNullable();
t.bigInteger("eventCount").defaultTo(0).notNullable();
t.jsonb("properties").notNullable();
t.string("batchId").notNullable();
t.string("organizationId").nullable();
t.timestamps(true, true, true);
});
}
await createOnUpdateTrigger(knex, TableName.PosthogAggregatedEvents);
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.PosthogAggregatedEvents);
await dropOnUpdateTrigger(knex, TableName.PosthogAggregatedEvents);
}

View File

@@ -0,0 +1,27 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const IdentityTlsCertAuthsSchema = z.object({
id: z.string().uuid(),
accessTokenTTL: z.coerce.number().default(7200),
accessTokenMaxTTL: z.coerce.number().default(7200),
accessTokenNumUsesLimit: z.coerce.number().default(0),
accessTokenTrustedIps: z.unknown(),
createdAt: z.date(),
updatedAt: z.date(),
identityId: z.string().uuid(),
allowedCommonNames: z.string().nullable().optional(),
encryptedCaCertificate: zodBuffer
});
export type TIdentityTlsCertAuths = z.infer<typeof IdentityTlsCertAuthsSchema>;
export type TIdentityTlsCertAuthsInsert = Omit<z.input<typeof IdentityTlsCertAuthsSchema>, TImmutableDBKeys>;
export type TIdentityTlsCertAuthsUpdate = Partial<Omit<z.input<typeof IdentityTlsCertAuthsSchema>, TImmutableDBKeys>>;
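As a quick illustration of how the generated schema above would be consumed, the sketch below parses a hypothetical row; every value is a placeholder, and the import path relies on the re-export added to the schemas index in a later hunk:

import { IdentityTlsCertAuthsSchema } from "@app/db/schemas";

// All values below are made up for illustration.
const row = IdentityTlsCertAuthsSchema.parse({
  id: "5f6b5f3e-9e5d-4a3e-8a3d-0c1f2e3d4a5b",
  accessTokenTTL: "7200", // z.coerce.number() also accepts numeric strings
  accessTokenMaxTTL: 7200,
  accessTokenNumUsesLimit: 0,
  accessTokenTrustedIps: [{ ipAddress: "0.0.0.0/0" }],
  createdAt: new Date(),
  updatedAt: new Date(),
  identityId: "1b2c3d4e-5f60-4711-8222-333344445555",
  allowedCommonNames: "service-a,service-b",
  encryptedCaCertificate: Buffer.from("placeholder")
});

console.log(row.accessTokenTTL); // 7200 (number)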

View File

@@ -52,6 +52,7 @@ export * from "./identity-org-memberships";
export * from "./identity-project-additional-privilege";
export * from "./identity-project-membership-role";
export * from "./identity-project-memberships";
export * from "./identity-tls-cert-auths";
export * from "./identity-token-auths";
export * from "./identity-ua-client-secrets";
export * from "./identity-universal-auths";

View File

@@ -86,6 +86,7 @@ export enum TableName {
IdentityOidcAuth = "identity_oidc_auths",
IdentityJwtAuth = "identity_jwt_auths",
IdentityLdapAuth = "identity_ldap_auths",
IdentityTlsCertAuth = "identity_tls_cert_auths",
IdentityOrgMembership = "identity_org_memberships",
IdentityProjectMembership = "identity_project_memberships",
IdentityProjectMembershipRole = "identity_project_membership_role",
@@ -171,9 +172,7 @@ export enum TableName {
SecretScanningResource = "secret_scanning_resources",
SecretScanningScan = "secret_scanning_scans",
SecretScanningFinding = "secret_scanning_findings",
SecretScanningConfig = "secret_scanning_configs",
// Telemetry
PosthogAggregatedEvents = "posthog_aggregated_events"
SecretScanningConfig = "secret_scanning_configs"
}
export type TImmutableDBKeys = "id" | "createdAt" | "updatedAt" | "commitId";
@@ -253,6 +252,7 @@ export enum IdentityAuthMethod {
ALICLOUD_AUTH = "alicloud-auth",
AWS_AUTH = "aws-auth",
AZURE_AUTH = "azure-auth",
TLS_CERT_AUTH = "tls-cert-auth",
OCI_AUTH = "oci-auth",
OIDC_AUTH = "oidc-auth",
JWT_AUTH = "jwt-auth",

View File

@@ -1,26 +0,0 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const PosthogAggregatedEventsSchema = z.object({
id: z.string().uuid(),
distinctId: z.string(),
event: z.string(),
eventCount: z.coerce.number().default(0),
properties: z.unknown(),
batchId: z.string(),
organizationId: z.string().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TPosthogAggregatedEvents = z.infer<typeof PosthogAggregatedEventsSchema>;
export type TPosthogAggregatedEventsInsert = Omit<z.input<typeof PosthogAggregatedEventsSchema>, TImmutableDBKeys>;
export type TPosthogAggregatedEventsUpdate = Partial<
Omit<z.input<typeof PosthogAggregatedEventsSchema>, TImmutableDBKeys>
>;

View File

@@ -350,6 +350,12 @@ export const accessApprovalRequestServiceFactory = ({
const canBypass = !policy.bypassers.length || policy.bypassers.some((bypasser) => bypasser.userId === actorId);
const cannotBypassUnderSoftEnforcement = !(isSoftEnforcement && canBypass);
// Calculate break glass attempt before sequence checks
const isBreakGlassApprovalAttempt =
policy.enforcementLevel === EnforcementLevel.Soft &&
actorId === accessApprovalRequest.requestedByUserId &&
status === ApprovalStatus.APPROVED;
const isApprover = policy.approvers.find((approver) => approver.userId === actorId);
// If user is (not an approver OR cant self approve) AND can't bypass policy
if ((!isApprover || (!policy.allowedSelfApprovals && isSelfApproval)) && cannotBypassUnderSoftEnforcement) {
@@ -409,15 +415,14 @@ export const accessApprovalRequestServiceFactory = ({
const isApproverOfTheSequence = policy.approvers.find(
(el) => el.sequence === presentSequence.step && el.userId === actorId
);
if (!isApproverOfTheSequence) throw new BadRequestError({ message: "You are not reviewer in this step" });
// Only throw if actor is not the approver and not bypassing
if (!isApproverOfTheSequence && !isBreakGlassApprovalAttempt) {
throw new BadRequestError({ message: "You are not a reviewer in this step" });
}
}
const reviewStatus = await accessApprovalRequestReviewerDAL.transaction(async (tx) => {
const isBreakGlassApprovalAttempt =
policy.enforcementLevel === EnforcementLevel.Soft &&
actorId === accessApprovalRequest.requestedByUserId &&
status === ApprovalStatus.APPROVED;
let reviewForThisActorProcessing: {
id: string;
requestId: string;

View File

@@ -0,0 +1,21 @@
export function providerSpecificPayload(url: string) {
const { hostname } = new URL(url);
const payload: Record<string, string> = {};
switch (hostname) {
case "http-intake.logs.datadoghq.com":
case "http-intake.logs.us3.datadoghq.com":
case "http-intake.logs.us5.datadoghq.com":
case "http-intake.logs.datadoghq.eu":
case "http-intake.logs.ap1.datadoghq.com":
case "http-intake.logs.ddog-gov.com":
payload.ddsource = "infisical";
payload.service = "audit-logs";
break;
default:
break;
}
return payload;
}
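For a DataDog intake hostname the helper above resolves to { ddsource: "infisical", service: "audit-logs" }, and for any other host it resolves to an empty object, so spreading it into the request body is a no-op for non-DataDog streams. A minimal usage sketch mirroring the call sites in the hunks that follow; the URL path is illustrative, and the import path assumes the audit-log-stream-fns module referenced below:

import { providerSpecificPayload } from "./audit-log-stream-fns";

// Illustrative DataDog stream URL; only the hostname matters to the switch above.
const url = "https://http-intake.logs.datadoghq.com/api/v2/logs";

const body = { ...providerSpecificPayload(url), ping: "ok" };
console.log(body); // { ddsource: "infisical", service: "audit-logs", ping: "ok" }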

View File

@@ -13,6 +13,7 @@ import { TLicenseServiceFactory } from "../license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "../permission/org-permission";
import { TPermissionServiceFactory } from "../permission/permission-service-types";
import { TAuditLogStreamDALFactory } from "./audit-log-stream-dal";
import { providerSpecificPayload } from "./audit-log-stream-fns";
import { LogStreamHeaders, TAuditLogStreamServiceFactory } from "./audit-log-stream-types";
type TAuditLogStreamServiceFactoryDep = {
@@ -69,10 +70,11 @@ export const auditLogStreamServiceFactory = ({
headers.forEach(({ key, value }) => {
streamHeaders[key] = value;
});
await request
.post(
url,
{ ping: "ok" },
{ ...providerSpecificPayload(url), ping: "ok" },
{
headers: streamHeaders,
// request timeout
@@ -137,7 +139,7 @@ export const auditLogStreamServiceFactory = ({
await request
.post(
url || logStream.url,
{ ping: "ok" },
{ ...providerSpecificPayload(url || logStream.url), ping: "ok" },
{
headers: streamHeaders,
// request timeout

View File

@@ -1,13 +1,15 @@
import { RawAxiosRequestHeaders } from "axios";
import { AxiosError, RawAxiosRequestHeaders } from "axios";
import { SecretKeyEncoding } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { logger } from "@app/lib/logger";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TAuditLogStreamDALFactory } from "../audit-log-stream/audit-log-stream-dal";
import { providerSpecificPayload } from "../audit-log-stream/audit-log-stream-fns";
import { LogStreamHeaders } from "../audit-log-stream/audit-log-stream-types";
import { TLicenseServiceFactory } from "../license/license-service";
import { TAuditLogDALFactory } from "./audit-log-dal";
@@ -128,13 +130,25 @@ export const auditLogQueueServiceFactory = async ({
headers[key] = value;
});
return request.post(url, auditLog, {
headers,
// request timeout
timeout: AUDIT_LOG_STREAM_TIMEOUT,
// connection timeout
signal: AbortSignal.timeout(AUDIT_LOG_STREAM_TIMEOUT)
});
try {
const response = await request.post(
url,
{ ...providerSpecificPayload(url), ...auditLog },
{
headers,
// request timeout
timeout: AUDIT_LOG_STREAM_TIMEOUT,
// connection timeout
signal: AbortSignal.timeout(AUDIT_LOG_STREAM_TIMEOUT)
}
);
return response;
} catch (error) {
logger.error(
`Failed to stream audit log [url=${url}] for org [orgId=${orgId}] [error=${(error as AxiosError).message}]`
);
return error;
}
}
)
);
@@ -218,13 +232,25 @@ export const auditLogQueueServiceFactory = async ({
headers[key] = value;
});
return request.post(url, auditLog, {
headers,
// request timeout
timeout: AUDIT_LOG_STREAM_TIMEOUT,
// connection timeout
signal: AbortSignal.timeout(AUDIT_LOG_STREAM_TIMEOUT)
});
try {
const response = await request.post(
url,
{ ...providerSpecificPayload(url), ...auditLog },
{
headers,
// request timeout
timeout: AUDIT_LOG_STREAM_TIMEOUT,
// connection timeout
signal: AbortSignal.timeout(AUDIT_LOG_STREAM_TIMEOUT)
}
);
return response;
} catch (error) {
logger.error(
`Failed to stream audit log [url=${url}] for org [orgId=${orgId}] [error=${(error as AxiosError).message}]`
);
return error;
}
}
)
);
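The try/catch added above logs a failed delivery and returns the error rather than rethrowing, presumably so a single unreachable stream endpoint cannot fail the whole batch of stream deliveries (the surrounding mapping is elided in this hunk). A small sketch of that effect with stubbed senders; the names are illustrative, not from the codebase:

// Two illustrative senders: one succeeds, one fails but catches and returns
// the error, mirroring the pattern introduced in the hunk above.
const sendToHealthyStream = async () => "ok";
const sendToBrokenStream = async () => {
  try {
    throw new Error("connection refused");
  } catch (error) {
    console.error(`Failed to stream audit log [error=${(error as Error).message}]`);
    return error; // returned, not rethrown, so the batch below still resolves
  }
};

const results = await Promise.all([sendToHealthyStream(), sendToBrokenStream()]);
console.log(results.length); // 2 — both calls settled without rejecting the batch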

View File

@@ -202,6 +202,12 @@ export enum EventType {
REVOKE_IDENTITY_ALICLOUD_AUTH = "revoke-identity-alicloud-auth",
GET_IDENTITY_ALICLOUD_AUTH = "get-identity-alicloud-auth",
LOGIN_IDENTITY_TLS_CERT_AUTH = "login-identity-tls-cert-auth",
ADD_IDENTITY_TLS_CERT_AUTH = "add-identity-tls-cert-auth",
UPDATE_IDENTITY_TLS_CERT_AUTH = "update-identity-tls-cert-auth",
REVOKE_IDENTITY_TLS_CERT_AUTH = "revoke-identity-tls-cert-auth",
GET_IDENTITY_TLS_CERT_AUTH = "get-identity-tls-cert-auth",
LOGIN_IDENTITY_AWS_AUTH = "login-identity-aws-auth",
ADD_IDENTITY_AWS_AUTH = "add-identity-aws-auth",
UPDATE_IDENTITY_AWS_AUTH = "update-identity-aws-auth",
@@ -1141,6 +1147,53 @@ interface GetIdentityAliCloudAuthEvent {
};
}
interface LoginIdentityTlsCertAuthEvent {
type: EventType.LOGIN_IDENTITY_TLS_CERT_AUTH;
metadata: {
identityId: string;
identityTlsCertAuthId: string;
identityAccessTokenId: string;
};
}
interface AddIdentityTlsCertAuthEvent {
type: EventType.ADD_IDENTITY_TLS_CERT_AUTH;
metadata: {
identityId: string;
allowedCommonNames: string | null | undefined;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: Array<TIdentityTrustedIp>;
};
}
interface DeleteIdentityTlsCertAuthEvent {
type: EventType.REVOKE_IDENTITY_TLS_CERT_AUTH;
metadata: {
identityId: string;
};
}
interface UpdateIdentityTlsCertAuthEvent {
type: EventType.UPDATE_IDENTITY_TLS_CERT_AUTH;
metadata: {
identityId: string;
allowedCommonNames: string | null | undefined;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: Array<TIdentityTrustedIp>;
};
}
interface GetIdentityTlsCertAuthEvent {
type: EventType.GET_IDENTITY_TLS_CERT_AUTH;
metadata: {
identityId: string;
};
}
interface LoginIdentityOciAuthEvent {
type: EventType.LOGIN_IDENTITY_OCI_AUTH;
metadata: {
@@ -3358,6 +3411,11 @@ export type Event =
| UpdateIdentityAliCloudAuthEvent
| GetIdentityAliCloudAuthEvent
| DeleteIdentityAliCloudAuthEvent
| LoginIdentityTlsCertAuthEvent
| AddIdentityTlsCertAuthEvent
| UpdateIdentityTlsCertAuthEvent
| GetIdentityTlsCertAuthEvent
| DeleteIdentityTlsCertAuthEvent
| LoginIdentityOciAuthEvent
| AddIdentityOciAuthEvent
| UpdateIdentityOciAuthEvent

View File

@@ -1,17 +1,11 @@
import { ProbotOctokit } from "probot";
import { OrgMembershipRole, TableName } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env";
import { logger } from "@app/lib/logger";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { InternalServerError } from "@app/lib/errors";
import { TQueueServiceFactory } from "@app/queue";
import { TOrgDALFactory } from "@app/services/org/org-dal";
import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TSmtpService } from "@app/services/smtp/smtp-service";
import { TTelemetryServiceFactory } from "@app/services/telemetry/telemetry-service";
import { PostHogEventTypes } from "@app/services/telemetry/telemetry-types";
import { TSecretScanningDALFactory } from "../secret-scanning-dal";
import { scanContentAndGetFindings, scanFullRepoContentAndGetFindings } from "./secret-scanning-fns";
import { SecretMatch, TScanFullRepoEventPayload, TScanPushEventPayload } from "./secret-scanning-queue-types";
import { TScanFullRepoEventPayload, TScanPushEventPayload } from "./secret-scanning-queue-types";
type TSecretScanningQueueFactoryDep = {
queueService: TQueueServiceFactory;
@@ -23,227 +17,21 @@ type TSecretScanningQueueFactoryDep = {
export type TSecretScanningQueueFactory = ReturnType<typeof secretScanningQueueFactory>;
export const secretScanningQueueFactory = ({
queueService,
secretScanningDAL,
smtpService,
telemetryService,
orgMembershipDAL: orgMemberDAL
}: TSecretScanningQueueFactoryDep) => {
const startFullRepoScan = async (payload: TScanFullRepoEventPayload) => {
await queueService.queue(QueueName.SecretFullRepoScan, QueueJobs.SecretScan, payload, {
attempts: 3,
backoff: {
type: "exponential",
delay: 5000
},
removeOnComplete: true,
removeOnFail: {
count: 20 // keep the most recent 20 jobs
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
export const secretScanningQueueFactory = (_props: TSecretScanningQueueFactoryDep) => {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const startFullRepoScan = async (_payload: TScanFullRepoEventPayload) => {
throw new InternalServerError({
message: "Secret Scanning V1 has been deprecated. Please migrate to Secret Scanning V2"
});
};
const startPushEventScan = async (payload: TScanPushEventPayload) => {
await queueService.queue(QueueName.SecretPushEventScan, QueueJobs.SecretScan, payload, {
attempts: 3,
backoff: {
type: "exponential",
delay: 5000
},
removeOnComplete: true,
removeOnFail: {
count: 20 // keep the most recent 20 jobs
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const startPushEventScan = async (_payload: TScanPushEventPayload) => {
throw new InternalServerError({
message: "Secret Scanning V1 has been deprecated. Please migrate to Secret Scanning V2"
});
};
const getOrgAdminEmails = async (organizationId: string) => {
// get emails of admins
const adminsOfWork = await orgMemberDAL.findMembership({
[`${TableName.Organization}.id` as string]: organizationId,
role: OrgMembershipRole.Admin
});
return adminsOfWork.filter((userObject) => userObject.email).map((userObject) => userObject.email as string);
};
queueService.start(QueueName.SecretPushEventScan, async (job) => {
const appCfg = getConfig();
const { organizationId, commits, pusher, repository, installationId } = job.data;
const [owner, repo] = repository.fullName.split("/");
const octokit = new ProbotOctokit({
auth: {
appId: appCfg.SECRET_SCANNING_GIT_APP_ID,
privateKey: appCfg.SECRET_SCANNING_PRIVATE_KEY,
installationId
}
});
const allFindingsByFingerprint: { [key: string]: SecretMatch } = {};
for (const commit of commits) {
for (const filepath of [...commit.added, ...commit.modified]) {
// eslint-disable-next-line
const fileContentsResponse = await octokit.repos.getContent({
owner,
repo,
path: filepath
});
const { data } = fileContentsResponse;
const fileContent = Buffer.from((data as { content: string }).content, "base64").toString();
// eslint-disable-next-line
const findings = await scanContentAndGetFindings(`\n${fileContent}`); // extra line to count lines correctly
for (const finding of findings) {
const fingerPrintWithCommitId = `${commit.id}:${filepath}:${finding.RuleID}:${finding.StartLine}`;
const fingerPrintWithoutCommitId = `${filepath}:${finding.RuleID}:${finding.StartLine}`;
finding.Fingerprint = fingerPrintWithCommitId;
finding.FingerPrintWithoutCommitId = fingerPrintWithoutCommitId;
finding.Commit = commit.id;
finding.File = filepath;
finding.Author = commit.author.name;
finding.Email = commit?.author?.email ? commit?.author?.email : "";
allFindingsByFingerprint[fingerPrintWithCommitId] = finding;
}
}
}
await secretScanningDAL.transaction(async (tx) => {
if (!Object.keys(allFindingsByFingerprint).length) return;
await secretScanningDAL.upsert(
Object.keys(allFindingsByFingerprint).map((key) => ({
installationId,
email: allFindingsByFingerprint[key].Email,
author: allFindingsByFingerprint[key].Author,
date: allFindingsByFingerprint[key].Date,
file: allFindingsByFingerprint[key].File,
tags: allFindingsByFingerprint[key].Tags,
commit: allFindingsByFingerprint[key].Commit,
ruleID: allFindingsByFingerprint[key].RuleID,
endLine: String(allFindingsByFingerprint[key].EndLine),
entropy: String(allFindingsByFingerprint[key].Entropy),
message: allFindingsByFingerprint[key].Message,
endColumn: String(allFindingsByFingerprint[key].EndColumn),
startLine: String(allFindingsByFingerprint[key].StartLine),
startColumn: String(allFindingsByFingerprint[key].StartColumn),
fingerPrintWithoutCommitId: allFindingsByFingerprint[key].FingerPrintWithoutCommitId,
description: allFindingsByFingerprint[key].Description,
symlinkFile: allFindingsByFingerprint[key].SymlinkFile,
orgId: organizationId,
pusherEmail: pusher.email,
pusherName: pusher.name,
repositoryFullName: repository.fullName,
repositoryId: String(repository.id),
fingerprint: allFindingsByFingerprint[key].Fingerprint
})),
tx
);
});
const adminEmails = await getOrgAdminEmails(organizationId);
if (pusher?.email) {
adminEmails.push(pusher.email);
}
if (Object.keys(allFindingsByFingerprint).length) {
await smtpService.sendMail({
template: SmtpTemplates.SecretLeakIncident,
subjectLine: `Incident alert: leaked secrets found in Github repository ${repository.fullName}`,
recipients: adminEmails.filter((email) => email).map((email) => email),
substitutions: {
numberOfSecrets: Object.keys(allFindingsByFingerprint).length,
pusher_email: pusher.email,
pusher_name: pusher.name
}
});
}
await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretScannerPush,
distinctId: repository.fullName,
properties: {
numberOfRisks: Object.keys(allFindingsByFingerprint).length
}
});
});
queueService.start(QueueName.SecretFullRepoScan, async (job) => {
const appCfg = getConfig();
const { organizationId, repository, installationId } = job.data;
const octokit = new ProbotOctokit({
auth: {
appId: appCfg.SECRET_SCANNING_GIT_APP_ID,
privateKey: appCfg.SECRET_SCANNING_PRIVATE_KEY,
installationId
}
});
const findings = await scanFullRepoContentAndGetFindings(
// this is because of collision of octokit in probot and github
// eslint-disable-next-line
octokit as any,
installationId,
repository.fullName
);
await secretScanningDAL.transaction(async (tx) => {
if (!findings.length) return;
// eslint-disable-next-line
await secretScanningDAL.upsert(
findings.map((finding) => ({
installationId,
email: finding.Email,
author: finding.Author,
date: finding.Date,
file: finding.File,
tags: finding.Tags,
commit: finding.Commit,
ruleID: finding.RuleID,
endLine: String(finding.EndLine),
entropy: String(finding.Entropy),
message: finding.Message,
endColumn: String(finding.EndColumn),
startLine: String(finding.StartLine),
startColumn: String(finding.StartColumn),
fingerPrintWithoutCommitId: finding.FingerPrintWithoutCommitId,
description: finding.Description,
symlinkFile: finding.SymlinkFile,
orgId: organizationId,
repositoryFullName: repository.fullName,
repositoryId: String(repository.id),
fingerprint: finding.Fingerprint
})),
tx
);
});
const adminEmails = await getOrgAdminEmails(organizationId);
if (findings.length) {
await smtpService.sendMail({
template: SmtpTemplates.SecretLeakIncident,
subjectLine: `Incident alert: leaked secrets found in Github repository ${repository.fullName}`,
recipients: adminEmails.filter((email) => email).map((email) => email),
substitutions: {
numberOfSecrets: findings.length
}
});
}
await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretScannerFull,
distinctId: repository.fullName,
properties: {
numberOfRisks: findings.length
}
});
});
queueService.listen(QueueName.SecretPushEventScan, "failed", (job, err) => {
logger.error(err, "Failed to secret scan on push", job?.data);
});
queueService.listen(QueueName.SecretFullRepoScan, "failed", (job, err) => {
logger.error(err, "Failed to do full repo secret scan", job?.data);
});
return { startFullRepoScan, startPushEventScan };
};

View File

@@ -98,6 +98,7 @@ export const secretScanningServiceFactory = ({
if (canUseSecretScanning(actorOrgId)) {
await Promise.all(
repositories.map(({ id, full_name }) =>
// eslint-disable-next-line @typescript-eslint/no-unsafe-call,@typescript-eslint/no-unsafe-return,@typescript-eslint/no-unsafe-member-access
secretScanningQueue.startFullRepoScan({
organizationId: session.orgId,
installationId,
@@ -180,6 +181,7 @@ export const secretScanningServiceFactory = ({
if (!installationLink) return;
if (canUseSecretScanning(installationLink.orgId)) {
// eslint-disable-next-line @typescript-eslint/no-unsafe-call,@typescript-eslint/no-unsafe-return,@typescript-eslint/no-unsafe-member-access
await secretScanningQueue.startPushEventScan({
commits,
pusher: { name: pusher.name, email: pusher.email },

View File

@@ -11,7 +11,8 @@ export const PgSqlLock = {
OrgGatewayRootCaInit: (orgId: string) => pgAdvisoryLockHashText(`org-gateway-root-ca:${orgId}`),
OrgGatewayCertExchange: (orgId: string) => pgAdvisoryLockHashText(`org-gateway-cert-exchange:${orgId}`),
SecretRotationV2Creation: (folderId: string) => pgAdvisoryLockHashText(`secret-rotation-v2-creation:${folderId}`),
CreateProject: (orgId: string) => pgAdvisoryLockHashText(`create-project:${orgId}`)
CreateProject: (orgId: string) => pgAdvisoryLockHashText(`create-project:${orgId}`),
CreateFolder: (envId: string, projectId: string) => pgAdvisoryLockHashText(`create-folder:${envId}-${projectId}`)
} as const;
// all the key prefixes used must be set here to avoid conflict
@@ -72,6 +73,7 @@ type TWaitTillReady = {
export type TKeyStoreFactory = {
setItem: (key: string, value: string | number | Buffer, prefix?: string) => Promise<"OK">;
getItem: (key: string, prefix?: string) => Promise<string | null>;
getItems: (keys: string[], prefix?: string) => Promise<(string | null)[]>;
setExpiry: (key: string, expiryInSeconds: number) => Promise<number>;
setItemWithExpiry: (
key: string,
@@ -80,6 +82,7 @@ export type TKeyStoreFactory = {
prefix?: string
) => Promise<"OK">;
deleteItem: (key: string) => Promise<number>;
deleteItemsByKeyIn: (keys: string[]) => Promise<number>;
deleteItems: (arg: TDeleteItems) => Promise<number>;
incrementBy: (key: string, value: number) => Promise<number>;
acquireLock(
@@ -88,6 +91,7 @@ export type TKeyStoreFactory = {
settings?: Partial<Settings>
): Promise<{ release: () => Promise<ExecutionResult> }>;
waitTillReady: ({ key, waitingCb, keyCheckCb, waitIteration, delay, jitter }: TWaitTillReady) => Promise<void>;
getKeysByPattern: (pattern: string, limit?: number) => Promise<string[]>;
};
export const keyStoreFactory = (redisConfigKeys: TRedisConfigKeys): TKeyStoreFactory => {
@@ -99,6 +103,9 @@ export const keyStoreFactory = (redisConfigKeys: TRedisConfigKeys): TKeyStoreFac
const getItem = async (key: string, prefix?: string) => redis.get(prefix ? `${prefix}:${key}` : key);
const getItems = async (keys: string[], prefix?: string) =>
redis.mget(keys.map((key) => (prefix ? `${prefix}:${key}` : key)));
const setItemWithExpiry = async (
key: string,
expiryInSeconds: number | string,
@@ -108,6 +115,11 @@ export const keyStoreFactory = (redisConfigKeys: TRedisConfigKeys): TKeyStoreFac
const deleteItem = async (key: string) => redis.del(key);
const deleteItemsByKeyIn = async (keys: string[]) => {
if (keys.length === 0) return 0;
return redis.del(keys);
};
const deleteItems = async ({ pattern, batchSize = 500, delay = 1500, jitter = 200 }: TDeleteItems) => {
let cursor = "0";
let totalDeleted = 0;
@@ -163,6 +175,24 @@ export const keyStoreFactory = (redisConfigKeys: TRedisConfigKeys): TKeyStoreFac
}
};
const getKeysByPattern = async (pattern: string, limit?: number) => {
let cursor = "0";
const allKeys: string[] = [];
do {
// eslint-disable-next-line no-await-in-loop
const [nextCursor, keys] = await redis.scan(cursor, "MATCH", pattern, "COUNT", 1000);
cursor = nextCursor;
allKeys.push(...keys);
if (limit && allKeys.length >= limit) {
return allKeys.slice(0, limit);
}
} while (cursor !== "0");
return allKeys;
};
return {
setItem,
getItem,
@@ -174,6 +204,9 @@ export const keyStoreFactory = (redisConfigKeys: TRedisConfigKeys): TKeyStoreFac
acquireLock(resources: string[], duration: number, settings?: Partial<Settings>) {
return redisLock.acquire(resources, duration, settings);
},
waitTillReady
waitTillReady,
getKeysByPattern,
deleteItemsByKeyIn,
getItems
};
};
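Taken together, the getKeysByPattern, getItems, and deleteItemsByKeyIn additions above give callers a scan-then-act path that avoids the wildcard deleteItems sweep. A hedged usage sketch, assuming a keyStore built by keyStoreFactory and an illustrative key prefix:

// keyStore is assumed to be the result of keyStoreFactory(redisConfigKeys);
// the "folder-tx:*" prefix is illustrative, not a key from this changeset.
const keys = await keyStore.getKeysByPattern("folder-tx:*", 1000); // SCAN, at most 1000 keys returned

const values = await keyStore.getItems(keys); // MGET; missing keys come back as null
const liveKeys = keys.filter((_, i) => values[i] !== null);

const deleted = await keyStore.deleteItemsByKeyIn(liveKeys); // single DEL over the matched keys
console.log(`deleted ${deleted} of ${keys.length} matched keys`);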

View File

@@ -8,6 +8,8 @@ import { TKeyStoreFactory } from "./keystore";
export const inMemoryKeyStore = (): TKeyStoreFactory => {
const store: Record<string, string | number | Buffer> = {};
const getRegex = (pattern: string) =>
new RE2(`^${pattern.replace(/[-[\]/{}()+?.\\^$|]/g, "\\$&").replace(/\*/g, ".*")}$`);
return {
setItem: async (key, value) => {
@@ -24,7 +26,7 @@ export const inMemoryKeyStore = (): TKeyStoreFactory => {
return 1;
},
deleteItems: async ({ pattern, batchSize = 500, delay = 1500, jitter = 200 }) => {
const regex = new RE2(`^${pattern.replace(/[-[\]/{}()+?.\\^$|]/g, "\\$&").replace(/\*/g, ".*")}$`);
const regex = getRegex(pattern);
let totalDeleted = 0;
const keys = Object.keys(store);
@@ -59,6 +61,27 @@ export const inMemoryKeyStore = (): TKeyStoreFactory => {
release: () => {}
}) as Promise<Lock>;
},
waitTillReady: async () => {}
waitTillReady: async () => {},
getKeysByPattern: async (pattern) => {
const regex = getRegex(pattern);
const keys = Object.keys(store);
return keys.filter((key) => regex.test(key));
},
deleteItemsByKeyIn: async (keys) => {
for (const key of keys) {
delete store[key];
}
return keys.length;
},
getItems: async (keys) => {
const values = keys.map((key) => {
const value = store[key];
if (typeof value === "string") {
return value;
}
return null;
});
return values;
}
};
};

View File

@@ -22,6 +22,7 @@ export enum ApiDocsTags {
UniversalAuth = "Universal Auth",
GcpAuth = "GCP Auth",
AliCloudAuth = "Alibaba Cloud Auth",
TlsCertAuth = "TLS Certificate Auth",
AwsAuth = "AWS Auth",
OciAuth = "OCI Auth",
AzureAuth = "Azure Auth",
@@ -283,6 +284,38 @@ export const ALICLOUD_AUTH = {
}
} as const;
export const TLS_CERT_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login."
},
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
allowedCommonNames:
"The comma-separated list of trusted common names that are allowed to authenticate with Infisical.",
caCertificate: "The PEM-encoded CA certificate to validate client certificates.",
accessTokenTTL: "The lifetime for an access token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an access token in seconds.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
allowedCommonNames:
"The comma-separated list of trusted common names that are allowed to authenticate with Infisical.",
caCertificate: "The PEM-encoded CA certificate to validate client certificates.",
accessTokenTTL: "The new lifetime for an access token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an access token in seconds.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke the auth method for."
}
} as const;
export const AWS_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login.",
@@ -2228,6 +2261,12 @@ export const AppConnections = {
},
FLYIO: {
accessToken: "The Access Token used to access fly.io."
},
GITLAB: {
instanceUrl: "The GitLab instance URL to connect with.",
accessToken: "The Access Token used to access GitLab.",
code: "The OAuth code to use to connect with GitLab.",
accessTokenType: "The type of token used to connect with GitLab."
}
}
};
@@ -2401,6 +2440,21 @@ export const SecretSyncs = {
},
FLYIO: {
appId: "The ID of the Fly.io app to sync secrets to."
},
GITLAB: {
projectId: "The GitLab Project ID to sync secrets to.",
projectName: "The GitLab Project Name to sync secrets to.",
groupId: "The GitLab Group ID to sync secrets to.",
groupName: "The GitLab Group Name to sync secrets to.",
scope: "The GitLab scope that secrets should be synced to. (default: project)",
targetEnvironment: "The GitLab environment scope that secrets should be synced to. (default: *)",
shouldProtectSecrets: "Whether variables should be protected.",
shouldMaskSecrets: "Whether variables should be masked in logs.",
shouldHideSecrets: "Whether variables should be hidden."
},
CLOUDFLARE_PAGES: {
projectName: "The name of the Cloudflare Pages project to sync secrets to.",
environment: "The environment of the Cloudflare Pages project to sync secrets to."
}
}
};
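
For reference, a destination config using the GitLab sync fields documented above might look as follows; the object shape is an assumption for illustration, and only the field names and defaults come from the descriptions above:

// Hypothetical project-scoped GitLab sync destination config (field names per the docs above).
const gitlabDestinationConfig = {
  scope: "project",           // documented default
  projectId: "12345678",
  projectName: "platform/backend",
  targetEnvironment: "*",     // documented default environment scope
  shouldProtectSecrets: true, // mark synced variables as protected
  shouldMaskSecrets: true,    // mask variable values in logs
  shouldHideSecrets: false    // whether variables should be hidden
};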

View File

@@ -193,6 +193,9 @@ const envSchema = z
PYLON_API_KEY: zpStr(z.string().optional()),
DISABLE_AUDIT_LOG_GENERATION: zodStrBool.default("false"),
SSL_CLIENT_CERTIFICATE_HEADER_KEY: zpStr(z.string().optional()).default("x-ssl-client-cert"),
IDENTITY_TLS_CERT_AUTH_CLIENT_CERTIFICATE_HEADER_KEY: zpStr(z.string().optional()).default(
"x-identity-tls-cert-auth-client-cert"
),
WORKFLOW_SLACK_CLIENT_ID: zpStr(z.string().optional()),
WORKFLOW_SLACK_CLIENT_SECRET: zpStr(z.string().optional()),
ENABLE_MSSQL_SECRET_ROTATION_ENCRYPT: zodStrBool.default("true"),
@@ -247,6 +250,10 @@ const envSchema = z
INF_APP_CONNECTION_GITHUB_RADAR_APP_ID: zpStr(z.string().optional()),
INF_APP_CONNECTION_GITHUB_RADAR_APP_WEBHOOK_SECRET: zpStr(z.string().optional()),
// gitlab oauth
INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID: zpStr(z.string().optional()),
INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET: zpStr(z.string().optional()),
// gcp app
INF_APP_CONNECTION_GCP_SERVICE_ACCOUNT_CREDENTIAL: zpStr(z.string().optional()),

View File

@@ -193,6 +193,8 @@ import { identityOidcAuthServiceFactory } from "@app/services/identity-oidc-auth
import { identityProjectDALFactory } from "@app/services/identity-project/identity-project-dal";
import { identityProjectMembershipRoleDALFactory } from "@app/services/identity-project/identity-project-membership-role-dal";
import { identityProjectServiceFactory } from "@app/services/identity-project/identity-project-service";
import { identityTlsCertAuthDALFactory } from "@app/services/identity-tls-cert-auth/identity-tls-cert-auth-dal";
import { identityTlsCertAuthServiceFactory } from "@app/services/identity-tls-cert-auth/identity-tls-cert-auth-service";
import { identityTokenAuthDALFactory } from "@app/services/identity-token-auth/identity-token-auth-dal";
import { identityTokenAuthServiceFactory } from "@app/services/identity-token-auth/identity-token-auth-service";
import { identityUaClientSecretDALFactory } from "@app/services/identity-ua/identity-ua-client-secret-dal";
@@ -278,7 +280,6 @@ import { TSmtpService } from "@app/services/smtp/smtp-service";
import { invalidateCacheQueueFactory } from "@app/services/super-admin/invalidate-cache-queue";
import { superAdminDALFactory } from "@app/services/super-admin/super-admin-dal";
import { getServerCfg, superAdminServiceFactory } from "@app/services/super-admin/super-admin-service";
import { posthogAggregatedEventsDALFactory } from "@app/services/telemetry/posthog-aggregated-events-dal";
import { telemetryDALFactory } from "@app/services/telemetry/telemetry-dal";
import { telemetryQueueServiceFactory } from "@app/services/telemetry/telemetry-queue";
import { telemetryServiceFactory } from "@app/services/telemetry/telemetry-service";
@@ -298,7 +299,6 @@ import { injectAssumePrivilege } from "../plugins/auth/inject-assume-privilege";
import { injectIdentity } from "../plugins/auth/inject-identity";
import { injectPermission } from "../plugins/auth/inject-permission";
import { injectRateLimits } from "../plugins/inject-rate-limits";
import { registerSecretScannerGhApp } from "../plugins/secret-scanner";
import { registerV1Routes } from "./v1";
import { registerV2Routes } from "./v2";
import { registerV3Routes } from "./v3";
@@ -327,7 +327,6 @@ export const registerRoutes = async (
}
) => {
const appCfg = getConfig();
await server.register(registerSecretScannerGhApp, { prefix: "/ss-webhook" });
await server.register(registerSecretScanningV2Webhooks, {
prefix: "/secret-scanning/webhooks"
});
@@ -387,6 +386,7 @@ export const registerRoutes = async (
const identityKubernetesAuthDAL = identityKubernetesAuthDALFactory(db);
const identityUaClientSecretDAL = identityUaClientSecretDALFactory(db);
const identityAliCloudAuthDAL = identityAliCloudAuthDALFactory(db);
const identityTlsCertAuthDAL = identityTlsCertAuthDALFactory(db);
const identityAwsAuthDAL = identityAwsAuthDALFactory(db);
const identityGcpAuthDAL = identityGcpAuthDALFactory(db);
const identityOciAuthDAL = identityOciAuthDALFactory(db);
@@ -399,7 +399,6 @@ export const registerRoutes = async (
const auditLogStreamDAL = auditLogStreamDALFactory(db);
const trustedIpDAL = trustedIpDALFactory(db);
const telemetryDAL = telemetryDALFactory(db);
const posthogAggregatedEventsDAL = posthogAggregatedEventsDALFactory(db);
const appConnectionDAL = appConnectionDALFactory(db);
const secretSyncDAL = secretSyncDALFactory(db, folderDAL);
@@ -683,8 +682,7 @@ export const registerRoutes = async (
const telemetryService = telemetryServiceFactory({
keyStore,
licenseService,
posthogAggregatedEventsDAL
licenseService
});
const telemetryQueue = telemetryQueueServiceFactory({
keyStore,
@@ -1421,7 +1419,8 @@ export const registerRoutes = async (
const identityAccessTokenService = identityAccessTokenServiceFactory({
identityAccessTokenDAL,
identityOrgMembershipDAL,
accessTokenQueue
accessTokenQueue,
identityDAL
});
const identityProjectService = identityProjectServiceFactory({
@@ -1497,6 +1496,15 @@ export const registerRoutes = async (
permissionService
});
const identityTlsCertAuthService = identityTlsCertAuthServiceFactory({
identityAccessTokenDAL,
identityTlsCertAuthDAL,
identityOrgMembershipDAL,
licenseService,
permissionService,
kmsService
});
const identityAwsAuthService = identityAwsAuthServiceFactory({
identityAccessTokenDAL,
identityAwsAuthDAL,
@@ -1951,6 +1959,7 @@ export const registerRoutes = async (
identityAwsAuth: identityAwsAuthService,
identityAzureAuth: identityAzureAuthService,
identityOciAuth: identityOciAuthService,
identityTlsCertAuth: identityTlsCertAuthService,
identityOidcAuth: identityOidcAuthService,
identityJwtAuth: identityJwtAuthService,
identityLdapAuth: identityLdapAuthService,

View File

@@ -35,6 +35,10 @@ import {
CamundaConnectionListItemSchema,
SanitizedCamundaConnectionSchema
} from "@app/services/app-connection/camunda";
import {
CloudflareConnectionListItemSchema,
SanitizedCloudflareConnectionSchema
} from "@app/services/app-connection/cloudflare/cloudflare-connection-schema";
import {
DatabricksConnectionListItemSchema,
SanitizedDatabricksConnectionSchema
@@ -46,6 +50,7 @@ import {
GitHubRadarConnectionListItemSchema,
SanitizedGitHubRadarConnectionSchema
} from "@app/services/app-connection/github-radar";
import { GitLabConnectionListItemSchema, SanitizedGitLabConnectionSchema } from "@app/services/app-connection/gitlab";
import {
HCVaultConnectionListItemSchema,
SanitizedHCVaultConnectionSchema
@@ -109,7 +114,9 @@ const SanitizedAppConnectionSchema = z.union([
...SanitizedOnePassConnectionSchema.options,
...SanitizedHerokuConnectionSchema.options,
...SanitizedRenderConnectionSchema.options,
...SanitizedFlyioConnectionSchema.options
...SanitizedFlyioConnectionSchema.options,
...SanitizedGitLabConnectionSchema.options,
...SanitizedCloudflareConnectionSchema.options
]);
const AppConnectionOptionsSchema = z.discriminatedUnion("app", [
@@ -139,7 +146,9 @@ const AppConnectionOptionsSchema = z.discriminatedUnion("app", [
OnePassConnectionListItemSchema,
HerokuConnectionListItemSchema,
RenderConnectionListItemSchema,
FlyioConnectionListItemSchema
FlyioConnectionListItemSchema,
GitLabConnectionListItemSchema,
CloudflareConnectionListItemSchema
]);
export const registerAppConnectionRouter = async (server: FastifyZodProvider) => {

View File

@@ -0,0 +1,53 @@
import z from "zod";
import { readLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
CreateCloudflareConnectionSchema,
SanitizedCloudflareConnectionSchema,
UpdateCloudflareConnectionSchema
} from "@app/services/app-connection/cloudflare/cloudflare-connection-schema";
import { AuthMode } from "@app/services/auth/auth-type";
import { registerAppConnectionEndpoints } from "./app-connection-endpoints";
export const registerCloudflareConnectionRouter = async (server: FastifyZodProvider) => {
registerAppConnectionEndpoints({
app: AppConnection.Cloudflare,
server,
sanitizedResponseSchema: SanitizedCloudflareConnectionSchema,
createSchema: CreateCloudflareConnectionSchema,
updateSchema: UpdateCloudflareConnectionSchema
});
// The endpoints below are not exposed publicly and are intended for Infisical App use
server.route({
method: "GET",
url: `/:connectionId/cloudflare-pages-projects`,
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
connectionId: z.string().uuid()
}),
response: {
200: z
.object({
id: z.string(),
name: z.string()
})
.array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const { connectionId } = req.params;
const projects = await server.services.appConnection.cloudflare.listPagesProjects(connectionId, req.permission);
return projects;
}
});
};
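
A hedged sketch of how the Infisical app might call this internal endpoint; only the route shape comes from the router above, while the host, API prefix, and token handling are assumptions:

// Assumes a JWT session token and a known connection ID; the host and /api/v1 prefix are illustrative.
const listCloudflarePagesProjects = async (connectionId: string, jwt: string) => {
  const res = await fetch(
    `https://app.infisical.com/api/v1/app-connections/cloudflare/${connectionId}/cloudflare-pages-projects`,
    { headers: { Authorization: `Bearer ${jwt}` } }
  );
  if (!res.ok) throw new Error(`Failed to list Cloudflare Pages projects: ${res.status}`);
  return (await res.json()) as { id: string; name: string }[];
};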

View File

@@ -0,0 +1,90 @@
import z from "zod";
import { readLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
CreateGitLabConnectionSchema,
SanitizedGitLabConnectionSchema,
TGitLabGroup,
TGitLabProject,
UpdateGitLabConnectionSchema
} from "@app/services/app-connection/gitlab";
import { AuthMode } from "@app/services/auth/auth-type";
import { registerAppConnectionEndpoints } from "./app-connection-endpoints";
export const registerGitLabConnectionRouter = async (server: FastifyZodProvider) => {
registerAppConnectionEndpoints({
app: AppConnection.GitLab,
server,
sanitizedResponseSchema: SanitizedGitLabConnectionSchema,
createSchema: CreateGitLabConnectionSchema,
updateSchema: UpdateGitLabConnectionSchema
});
// The endpoints below are not exposed publicly and are intended for Infisical App use
server.route({
method: "GET",
url: `/:connectionId/projects`,
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
connectionId: z.string().uuid()
}),
response: {
200: z
.object({
id: z.string(),
name: z.string()
})
.array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const { connectionId } = req.params;
const projects: TGitLabProject[] = await server.services.appConnection.gitlab.listProjects(
connectionId,
req.permission
);
return projects;
}
});
server.route({
method: "GET",
url: `/:connectionId/groups`,
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
connectionId: z.string().uuid()
}),
response: {
200: z
.object({
id: z.string(),
name: z.string()
})
.array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const { connectionId } = req.params;
const groups: TGitLabGroup[] = await server.services.appConnection.gitlab.listGroups(
connectionId,
req.permission
);
return groups;
}
});
};

View File

@@ -10,11 +10,13 @@ import { registerAzureClientSecretsConnectionRouter } from "./azure-client-secre
import { registerAzureDevOpsConnectionRouter } from "./azure-devops-connection-router";
import { registerAzureKeyVaultConnectionRouter } from "./azure-key-vault-connection-router";
import { registerCamundaConnectionRouter } from "./camunda-connection-router";
import { registerCloudflareConnectionRouter } from "./cloudflare-connection-router";
import { registerDatabricksConnectionRouter } from "./databricks-connection-router";
import { registerFlyioConnectionRouter } from "./flyio-connection-router";
import { registerGcpConnectionRouter } from "./gcp-connection-router";
import { registerGitHubConnectionRouter } from "./github-connection-router";
import { registerGitHubRadarConnectionRouter } from "./github-radar-connection-router";
import { registerGitLabConnectionRouter } from "./gitlab-connection-router";
import { registerHCVaultConnectionRouter } from "./hc-vault-connection-router";
import { registerHerokuConnectionRouter } from "./heroku-connection-router";
import { registerHumanitecConnectionRouter } from "./humanitec-connection-router";
@@ -58,5 +60,7 @@ export const APP_CONNECTION_REGISTER_ROUTER_MAP: Record<AppConnection, (server:
[AppConnection.OnePass]: registerOnePassConnectionRouter,
[AppConnection.Heroku]: registerHerokuConnectionRouter,
[AppConnection.Render]: registerRenderConnectionRouter,
[AppConnection.Flyio]: registerFlyioConnectionRouter
[AppConnection.Flyio]: registerFlyioConnectionRouter,
[AppConnection.GitLab]: registerGitLabConnectionRouter,
[AppConnection.Cloudflare]: registerCloudflareConnectionRouter
};

View File

@@ -0,0 +1,396 @@
import crypto from "node:crypto";
import { z } from "zod";
import { IdentityTlsCertAuthsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { ApiDocsTags, TLS_CERT_AUTH } from "@app/lib/api-docs";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { TIdentityTrustedIp } from "@app/services/identity/identity-types";
import { isSuperAdmin } from "@app/services/super-admin/super-admin-fns";
const validateCommonNames = z
.string()
.min(1)
.trim()
.transform((el) =>
el
.split(",")
.map((i) => i.trim())
.join(",")
);
const validateCaCertificate = (caCert: string) => {
if (!caCert) return true;
try {
// eslint-disable-next-line no-new
new crypto.X509Certificate(caCert);
return true;
} catch (err) {
return false;
}
};
export const registerIdentityTlsCertAuthRouter = async (server: FastifyZodProvider) => {
server.route({
method: "POST",
url: "/login",
config: {
rateLimit: writeLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.TlsCertAuth],
description: "Login with TLS Certificate Auth",
body: z.object({
identityId: z.string().trim().describe(TLS_CERT_AUTH.LOGIN.identityId)
}),
response: {
200: z.object({
accessToken: z.string(),
expiresIn: z.coerce.number(),
accessTokenMaxTTL: z.coerce.number(),
tokenType: z.literal("Bearer")
})
}
},
handler: async (req) => {
const appCfg = getConfig();
const clientCertificate = req.headers[appCfg.IDENTITY_TLS_CERT_AUTH_CLIENT_CERTIFICATE_HEADER_KEY];
if (!clientCertificate) {
throw new BadRequestError({ message: "Missing TLS certificate in header" });
}
const { identityTlsCertAuth, accessToken, identityAccessToken, identityMembershipOrg } =
await server.services.identityTlsCertAuth.login({
identityId: req.body.identityId,
clientCertificate: clientCertificate as string
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityMembershipOrg?.orgId,
event: {
type: EventType.LOGIN_IDENTITY_TLS_CERT_AUTH,
metadata: {
identityId: identityTlsCertAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
identityTlsCertAuthId: identityTlsCertAuth.id
}
}
});
return {
accessToken,
tokenType: "Bearer" as const,
expiresIn: identityTlsCertAuth.accessTokenTTL,
accessTokenMaxTTL: identityTlsCertAuth.accessTokenMaxTTL
};
}
});
server.route({
method: "POST",
url: "/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.TlsCertAuth],
description: "Attach TLS Certificate Auth configuration onto identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim().describe(TLS_CERT_AUTH.ATTACH.identityId)
}),
body: z
.object({
allowedCommonNames: validateCommonNames
.optional()
.nullable()
.describe(TLS_CERT_AUTH.ATTACH.allowedCommonNames),
caCertificate: z
.string()
.min(1)
.max(10240)
.refine(validateCaCertificate, "Invalid CA Certificate.")
.describe(TLS_CERT_AUTH.ATTACH.caCertificate),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(TLS_CERT_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(0)
.max(315360000)
.default(2592000)
.describe(TLS_CERT_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.min(1)
.max(315360000)
.default(2592000)
.describe(TLS_CERT_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z
.number()
.int()
.min(0)
.default(0)
.describe(TLS_CERT_AUTH.ATTACH.accessTokenNumUsesLimit)
})
.refine(
(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: {
200: z.object({
identityTlsCertAuth: IdentityTlsCertAuthsSchema
})
}
},
handler: async (req) => {
const identityTlsCertAuth = await server.services.identityTlsCertAuth.attachTlsCertAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId,
isActorSuperAdmin: isSuperAdmin(req.auth)
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.ADD_IDENTITY_TLS_CERT_AUTH,
metadata: {
identityId: identityTlsCertAuth.identityId,
allowedCommonNames: identityTlsCertAuth.allowedCommonNames,
accessTokenTTL: identityTlsCertAuth.accessTokenTTL,
accessTokenMaxTTL: identityTlsCertAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityTlsCertAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityTlsCertAuth.accessTokenNumUsesLimit
}
}
});
return { identityTlsCertAuth };
}
});
server.route({
method: "PATCH",
url: "/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.TlsCertAuth],
description: "Update TLS Certificate Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().describe(TLS_CERT_AUTH.UPDATE.identityId)
}),
body: z
.object({
caCertificate: z
.string()
.min(1)
.max(10240)
.refine(validateCaCertificate, "Invalid CA Certificate.")
.optional()
.describe(TLS_CERT_AUTH.UPDATE.caCertificate),
allowedCommonNames: validateCommonNames
.optional()
.nullable()
.describe(TLS_CERT_AUTH.UPDATE.allowedCommonNames),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional()
.describe(TLS_CERT_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(0)
.max(315360000)
.optional()
.describe(TLS_CERT_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z
.number()
.int()
.min(0)
.optional()
.describe(TLS_CERT_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.min(0)
.optional()
.describe(TLS_CERT_AUTH.UPDATE.accessTokenMaxTTL)
})
.refine(
(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: {
200: z.object({
identityTlsCertAuth: IdentityTlsCertAuthsSchema
})
}
},
handler: async (req) => {
const identityTlsCertAuth = await server.services.identityTlsCertAuth.updateTlsCertAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.UPDATE_IDENTITY_TLS_CERT_AUTH,
metadata: {
identityId: identityTlsCertAuth.identityId,
allowedCommonNames: identityTlsCertAuth.allowedCommonNames,
accessTokenTTL: identityTlsCertAuth.accessTokenTTL,
accessTokenMaxTTL: identityTlsCertAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityTlsCertAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityTlsCertAuth.accessTokenNumUsesLimit
}
}
});
return { identityTlsCertAuth };
}
});
server.route({
method: "GET",
url: "/identities/:identityId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.TlsCertAuth],
description: "Retrieve TLS Certificate Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().describe(TLS_CERT_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({
identityTlsCertAuth: IdentityTlsCertAuthsSchema.extend({
caCertificate: z.string()
})
})
}
},
handler: async (req) => {
const identityTlsCertAuth = await server.services.identityTlsCertAuth.getTlsCertAuth({
identityId: req.params.identityId,
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.GET_IDENTITY_TLS_CERT_AUTH,
metadata: {
identityId: identityTlsCertAuth.identityId
}
}
});
return { identityTlsCertAuth };
}
});
server.route({
method: "DELETE",
url: "/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.TlsCertAuth],
description: "Delete TLS Certificate Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().describe(TLS_CERT_AUTH.REVOKE.identityId)
}),
response: {
200: z.object({
identityTlsCertAuth: IdentityTlsCertAuthsSchema
})
}
},
handler: async (req) => {
const identityTlsCertAuth = await server.services.identityTlsCertAuth.revokeTlsCertAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.REVOKE_IDENTITY_TLS_CERT_AUTH,
metadata: {
identityId: identityTlsCertAuth.identityId
}
}
});
return { identityTlsCertAuth };
}
});
};
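
For context, a machine identity logs in by presenting its client certificate; with the default header key defined in the env schema (x-identity-tls-cert-auth-client-cert), a proxy-terminated setup might forward the PEM like this. The host, API prefix, certificate path, and header encoding are assumptions; only the /login body and response shape come from the route above:

import { readFileSync } from "node:fs";

// Assumes TLS is terminated by a proxy that forwards the client certificate PEM in the
// default x-identity-tls-cert-auth-client-cert header.
const loginWithTlsCert = async (identityId: string) => {
  const clientCertPem = readFileSync("/etc/infisical/client.crt", "utf8"); // illustrative path

  const res = await fetch("https://app.infisical.com/api/v1/auth/tls-cert-auth/login", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-identity-tls-cert-auth-client-cert": encodeURIComponent(clientCertPem) // encoding is an assumption
    },
    body: JSON.stringify({ identityId })
  });

  if (!res.ok) throw new Error(`TLS cert auth login failed: ${res.status}`);
  return (await res.json()) as {
    accessToken: string;
    expiresIn: number;
    accessTokenMaxTTL: number;
    tokenType: "Bearer";
  };
};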

View File

@@ -25,6 +25,7 @@ import { registerIdentityLdapAuthRouter } from "./identity-ldap-auth-router";
import { registerIdentityOciAuthRouter } from "./identity-oci-auth-router";
import { registerIdentityOidcAuthRouter } from "./identity-oidc-auth-router";
import { registerIdentityRouter } from "./identity-router";
import { registerIdentityTlsCertAuthRouter } from "./identity-tls-cert-auth-router";
import { registerIdentityTokenAuthRouter } from "./identity-token-auth-router";
import { registerIdentityUaRouter } from "./identity-universal-auth-router";
import { registerIntegrationAuthRouter } from "./integration-auth-router";
@@ -66,6 +67,7 @@ export const registerV1Routes = async (server: FastifyZodProvider) => {
await authRouter.register(registerIdentityAccessTokenRouter);
await authRouter.register(registerIdentityAliCloudAuthRouter);
await authRouter.register(registerIdentityAwsAuthRouter);
await authRouter.register(registerIdentityTlsCertAuthRouter, { prefix: "/tls-cert-auth" });
await authRouter.register(registerIdentityAzureAuthRouter);
await authRouter.register(registerIdentityOciAuthRouter);
await authRouter.register(registerIdentityOidcAuthRouter);

View File

@@ -0,0 +1,17 @@
import {
CloudflarePagesSyncSchema,
CreateCloudflarePagesSyncSchema,
UpdateCloudflarePagesSyncSchema
} from "@app/services/secret-sync/cloudflare-pages/cloudflare-pages-schema";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { registerSyncSecretsEndpoints } from "./secret-sync-endpoints";
export const registerCloudflarePagesSyncRouter = async (server: FastifyZodProvider) =>
registerSyncSecretsEndpoints({
destination: SecretSync.CloudflarePages,
server,
responseSchema: CloudflarePagesSyncSchema,
createSchema: CreateCloudflarePagesSyncSchema,
updateSchema: UpdateCloudflarePagesSyncSchema
});

View File

@@ -0,0 +1,13 @@
import { CreateGitLabSyncSchema, GitLabSyncSchema, UpdateGitLabSyncSchema } from "@app/services/secret-sync/gitlab";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { registerSyncSecretsEndpoints } from "./secret-sync-endpoints";
export const registerGitLabSyncRouter = async (server: FastifyZodProvider) =>
registerSyncSecretsEndpoints({
destination: SecretSync.GitLab,
server,
responseSchema: GitLabSyncSchema,
createSchema: CreateGitLabSyncSchema,
updateSchema: UpdateGitLabSyncSchema
});

View File

@@ -8,10 +8,12 @@ import { registerAzureAppConfigurationSyncRouter } from "./azure-app-configurati
import { registerAzureDevOpsSyncRouter } from "./azure-devops-sync-router";
import { registerAzureKeyVaultSyncRouter } from "./azure-key-vault-sync-router";
import { registerCamundaSyncRouter } from "./camunda-sync-router";
import { registerCloudflarePagesSyncRouter } from "./cloudflare-pages-sync-router";
import { registerDatabricksSyncRouter } from "./databricks-sync-router";
import { registerFlyioSyncRouter } from "./flyio-sync-router";
import { registerGcpSyncRouter } from "./gcp-sync-router";
import { registerGitHubSyncRouter } from "./github-sync-router";
import { registerGitLabSyncRouter } from "./gitlab-sync-router";
import { registerHCVaultSyncRouter } from "./hc-vault-sync-router";
import { registerHerokuSyncRouter } from "./heroku-sync-router";
import { registerHumanitecSyncRouter } from "./humanitec-sync-router";
@@ -43,5 +45,7 @@ export const SECRET_SYNC_REGISTER_ROUTER_MAP: Record<SecretSync, (server: Fastif
[SecretSync.OnePass]: registerOnePassSyncRouter,
[SecretSync.Heroku]: registerHerokuSyncRouter,
[SecretSync.Render]: registerRenderSyncRouter,
[SecretSync.Flyio]: registerFlyioSyncRouter
[SecretSync.Flyio]: registerFlyioSyncRouter,
[SecretSync.GitLab]: registerGitLabSyncRouter,
[SecretSync.CloudflarePages]: registerCloudflarePagesSyncRouter
};

View File

@@ -22,10 +22,15 @@ import {
import { AzureDevOpsSyncListItemSchema, AzureDevOpsSyncSchema } from "@app/services/secret-sync/azure-devops";
import { AzureKeyVaultSyncListItemSchema, AzureKeyVaultSyncSchema } from "@app/services/secret-sync/azure-key-vault";
import { CamundaSyncListItemSchema, CamundaSyncSchema } from "@app/services/secret-sync/camunda";
import {
CloudflarePagesSyncListItemSchema,
CloudflarePagesSyncSchema
} from "@app/services/secret-sync/cloudflare-pages/cloudflare-pages-schema";
import { DatabricksSyncListItemSchema, DatabricksSyncSchema } from "@app/services/secret-sync/databricks";
import { FlyioSyncListItemSchema, FlyioSyncSchema } from "@app/services/secret-sync/flyio";
import { GcpSyncListItemSchema, GcpSyncSchema } from "@app/services/secret-sync/gcp";
import { GitHubSyncListItemSchema, GitHubSyncSchema } from "@app/services/secret-sync/github";
import { GitLabSyncListItemSchema, GitLabSyncSchema } from "@app/services/secret-sync/gitlab";
import { HCVaultSyncListItemSchema, HCVaultSyncSchema } from "@app/services/secret-sync/hc-vault";
import { HerokuSyncListItemSchema, HerokuSyncSchema } from "@app/services/secret-sync/heroku";
import { HumanitecSyncListItemSchema, HumanitecSyncSchema } from "@app/services/secret-sync/humanitec";
@@ -55,7 +60,9 @@ const SecretSyncSchema = z.discriminatedUnion("destination", [
OnePassSyncSchema,
HerokuSyncSchema,
RenderSyncSchema,
FlyioSyncSchema
FlyioSyncSchema,
GitLabSyncSchema,
CloudflarePagesSyncSchema
]);
const SecretSyncOptionsSchema = z.discriminatedUnion("destination", [
@@ -78,7 +85,9 @@ const SecretSyncOptionsSchema = z.discriminatedUnion("destination", [
OnePassSyncListItemSchema,
HerokuSyncListItemSchema,
RenderSyncListItemSchema,
FlyioSyncListItemSchema
FlyioSyncListItemSchema,
GitLabSyncListItemSchema,
CloudflarePagesSyncListItemSchema
]);
export const registerSecretSyncRouter = async (server: FastifyZodProvider) => {

View File

@@ -4,7 +4,7 @@ import { z } from "zod";
import { SecretApprovalRequestsSchema, SecretsSchema, SecretType, ServiceTokenScopes } from "@app/db/schemas";
import { EventType, UserAgentType } from "@app/ee/services/audit-log/audit-log-types";
import { ApiDocsTags, RAW_SECRETS, SECRETS } from "@app/lib/api-docs";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { BadRequestError } from "@app/lib/errors";
import { removeTrailingSlash } from "@app/lib/fn";
import { secretsLimit, writeLimit } from "@app/server/config/rateLimiter";
import { BaseSecretNameSchema, SecretNameSchema } from "@app/server/lib/schemas";
@@ -12,7 +12,6 @@ import { getTelemetryDistinctId } from "@app/server/lib/telemetry";
import { getUserAgentType } from "@app/server/plugins/audit-log";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { ActorType, AuthMode } from "@app/services/auth/auth-type";
import { ProjectFilterType } from "@app/services/project/project-types";
import { ResourceMetadataSchema } from "@app/services/resource-metadata/resource-metadata-schema";
import { SecretOperations, SecretProtectionType } from "@app/services/secret/secret-types";
import { SecretUpdateMode } from "@app/services/secret-v2-bridge/secret-v2-bridge-types";
@@ -286,22 +285,17 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
environment = scope[0].environment;
workspaceId = req.auth.serviceToken.projectId;
}
} else if (req.permission.type === ActorType.IDENTITY && req.query.workspaceSlug && !workspaceId) {
const workspace = await server.services.project.getAProject({
filter: {
type: ProjectFilterType.SLUG,
orgId: req.permission.orgId,
slug: req.query.workspaceSlug
},
} else {
const projectId = await server.services.project.extractProjectIdFromSlug({
projectSlug: req.query.workspaceSlug,
projectId: workspaceId,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId
});
if (!workspace) throw new NotFoundError({ message: `No project found with slug ${req.query.workspaceSlug}` });
workspaceId = workspace.id;
workspaceId = projectId;
}
if (!workspaceId || !environment) throw new BadRequestError({ message: "Missing workspace id or environment" });
@@ -443,11 +437,23 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
environment = scope[0].environment;
workspaceId = req.auth.serviceToken.projectId;
}
} else {
const projectId = await server.services.project.extractProjectIdFromSlug({
projectSlug: workspaceSlug,
projectId: workspaceId,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId
});
workspaceId = projectId;
}
if (!environment) throw new BadRequestError({ message: "Missing environment" });
if (!workspaceId && !workspaceSlug)
if (!workspaceId) {
throw new BadRequestError({ message: "You must provide workspaceSlug or workspaceId" });
}
const secret = await server.services.secret.getSecretByNameRaw({
actorId: req.permission.id,
@@ -458,7 +464,6 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
environment,
projectId: workspaceId,
viewSecretValue: req.query.viewSecretValue,
projectSlug: workspaceSlug,
path: secretPath,
secretName: req.params.secretName,
type: req.query.type,
@@ -520,7 +525,8 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
secretName: SecretNameSchema.describe(RAW_SECRETS.CREATE.secretName)
}),
body: z.object({
workspaceId: z.string().trim().describe(RAW_SECRETS.CREATE.workspaceId),
workspaceId: z.string().trim().optional().describe(RAW_SECRETS.CREATE.workspaceId),
projectSlug: z.string().trim().optional().describe(RAW_SECRETS.CREATE.projectSlug),
environment: z.string().trim().describe(RAW_SECRETS.CREATE.environment),
secretPath: z
.string()
@@ -560,13 +566,22 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const projectId = await server.services.project.extractProjectIdFromSlug({
projectSlug: req.body.projectSlug,
projectId: req.body.workspaceId,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId
});
const secretOperation = await server.services.secret.createSecretRaw({
actorId: req.permission.id,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
environment: req.body.environment,
actorAuthMethod: req.permission.authMethod,
projectId: req.body.workspaceId,
projectId,
secretPath: req.body.secretPath,
secretName: req.params.secretName,
type: req.body.type,
@@ -584,7 +599,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
const { secret } = secretOperation;
await server.services.auditLog.createAuditLog({
projectId: req.body.workspaceId,
projectId,
...req.auditLogInfo,
event: {
type: EventType.CREATE_SECRET,
@@ -605,7 +620,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
organizationId: req.permission.orgId,
properties: {
numberOfSecrets: 1,
workspaceId: req.body.workspaceId,
workspaceId: projectId,
environment: req.body.environment,
secretPath: req.body.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
@@ -636,7 +651,8 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
secretName: BaseSecretNameSchema.describe(RAW_SECRETS.UPDATE.secretName)
}),
body: z.object({
workspaceId: z.string().trim().describe(RAW_SECRETS.UPDATE.workspaceId),
workspaceId: z.string().trim().optional().describe(RAW_SECRETS.UPDATE.workspaceId),
projectSlug: z.string().trim().optional().describe(RAW_SECRETS.UPDATE.projectSlug),
environment: z.string().trim().describe(RAW_SECRETS.UPDATE.environment),
secretValue: z
.string()
@@ -682,13 +698,22 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const projectId = await server.services.project.extractProjectIdFromSlug({
projectSlug: req.body.projectSlug,
projectId: req.body.workspaceId,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId
});
const secretOperation = await server.services.secret.updateSecretRaw({
actorId: req.permission.id,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
environment: req.body.environment,
projectId: req.body.workspaceId,
projectId,
secretPath: req.body.secretPath,
secretName: req.params.secretName,
type: req.body.type,
@@ -710,7 +735,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
const { secret } = secretOperation;
await server.services.auditLog.createAuditLog({
projectId: req.body.workspaceId,
projectId,
...req.auditLogInfo,
event: {
type: EventType.UPDATE_SECRET,
@@ -731,7 +756,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
organizationId: req.permission.orgId,
properties: {
numberOfSecrets: 1,
workspaceId: req.body.workspaceId,
workspaceId: projectId,
environment: req.body.environment,
secretPath: req.body.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
@@ -761,7 +786,8 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
secretName: z.string().min(1).describe(RAW_SECRETS.DELETE.secretName)
}),
body: z.object({
workspaceId: z.string().trim().describe(RAW_SECRETS.DELETE.workspaceId),
workspaceId: z.string().trim().optional().describe(RAW_SECRETS.DELETE.workspaceId),
projectSlug: z.string().trim().optional().describe(RAW_SECRETS.DELETE.projectSlug),
environment: z.string().trim().describe(RAW_SECRETS.DELETE.environment),
secretPath: z
.string()
@@ -784,13 +810,22 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const projectId = await server.services.project.extractProjectIdFromSlug({
projectSlug: req.body.projectSlug,
projectId: req.body.workspaceId,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId
});
const secretOperation = await server.services.secret.deleteSecretRaw({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
environment: req.body.environment,
projectId: req.body.workspaceId,
projectId,
secretPath: req.body.secretPath,
secretName: req.params.secretName,
type: req.body.type
@@ -802,7 +837,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
const { secret } = secretOperation;
await server.services.auditLog.createAuditLog({
projectId: req.body.workspaceId,
projectId,
...req.auditLogInfo,
event: {
type: EventType.DELETE_SECRET,
@@ -822,7 +857,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
organizationId: req.permission.orgId,
properties: {
numberOfSecrets: 1,
workspaceId: req.body.workspaceId,
workspaceId: projectId,
environment: req.body.environment,
secretPath: req.body.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
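
The practical effect of the change above is that raw secret writes can now address the project by slug instead of ID, with extractProjectIdFromSlug resolving it server-side. A hedged request sketch (host and path are assumptions; the body fields mirror the updated schema):

// Create a raw secret by project slug; workspaceId may be omitted now that both fields are optional.
const createSecretBySlug = async (token: string) => {
  const res = await fetch("https://app.infisical.com/api/v3/secrets/raw/MY_SECRET", {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      projectSlug: "my-project",     // resolved server-side via extractProjectIdFromSlug
      environment: "dev",
      secretPath: "/",
      secretValue: "super-secret-value"
    })
  });
  if (!res.ok) throw new Error(`Failed to create secret: ${res.status}`);
  return res.json();
};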

View File

@@ -25,7 +25,9 @@ export enum AppConnection {
OnePass = "1password",
Heroku = "heroku",
Render = "render",
Flyio = "flyio"
Flyio = "flyio",
GitLab = "gitlab",
Cloudflare = "cloudflare"
}
export enum AWSRegion {

View File

@@ -51,6 +51,11 @@ import {
validateAzureKeyVaultConnectionCredentials
} from "./azure-key-vault";
import { CamundaConnectionMethod, getCamundaConnectionListItem, validateCamundaConnectionCredentials } from "./camunda";
import { CloudflareConnectionMethod } from "./cloudflare/cloudflare-connection-enum";
import {
getCloudflareConnectionListItem,
validateCloudflareConnectionCredentials
} from "./cloudflare/cloudflare-connection-fns";
import {
DatabricksConnectionMethod,
getDatabricksConnectionListItem,
@@ -64,6 +69,7 @@ import {
GitHubRadarConnectionMethod,
validateGitHubRadarConnectionCredentials
} from "./github-radar";
import { getGitLabConnectionListItem, GitLabConnectionMethod, validateGitLabConnectionCredentials } from "./gitlab";
import {
getHCVaultConnectionListItem,
HCVaultConnectionMethod,
@@ -128,7 +134,9 @@ export const listAppConnectionOptions = () => {
getOnePassConnectionListItem(),
getHerokuConnectionListItem(),
getRenderConnectionListItem(),
getFlyioConnectionListItem()
getFlyioConnectionListItem(),
getGitLabConnectionListItem(),
getCloudflareConnectionListItem()
].sort((a, b) => a.name.localeCompare(b.name));
};
@@ -206,7 +214,9 @@ export const validateAppConnectionCredentials = async (
[AppConnection.OnePass]: validateOnePassConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.Heroku]: validateHerokuConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.Render]: validateRenderConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.Flyio]: validateFlyioConnectionCredentials as TAppConnectionCredentialsValidator
[AppConnection.Flyio]: validateFlyioConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.GitLab]: validateGitLabConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.Cloudflare]: validateCloudflareConnectionCredentials as TAppConnectionCredentialsValidator
};
return VALIDATE_APP_CONNECTION_CREDENTIALS_MAP[appConnection.app](appConnection);
@@ -223,6 +233,7 @@ export const getAppConnectionMethodName = (method: TAppConnection["method"]) =>
case GitHubConnectionMethod.OAuth:
case AzureDevOpsConnectionMethod.OAuth:
case HerokuConnectionMethod.OAuth:
case GitLabConnectionMethod.OAuth:
return "OAuth";
case HerokuConnectionMethod.AuthToken:
return "Auth Token";
@@ -241,6 +252,7 @@ export const getAppConnectionMethodName = (method: TAppConnection["method"]) =>
case TerraformCloudConnectionMethod.ApiToken:
case VercelConnectionMethod.ApiToken:
case OnePassConnectionMethod.ApiToken:
case CloudflareConnectionMethod.APIToken:
return "API Token";
case PostgresConnectionMethod.UsernameAndPassword:
case MsSqlConnectionMethod.UsernameAndPassword:
@@ -318,7 +330,9 @@ export const TRANSITION_CONNECTION_CREDENTIALS_TO_PLATFORM: Record<
[AppConnection.OnePass]: platformManagedCredentialsNotSupported,
[AppConnection.Heroku]: platformManagedCredentialsNotSupported,
[AppConnection.Render]: platformManagedCredentialsNotSupported,
[AppConnection.Flyio]: platformManagedCredentialsNotSupported
[AppConnection.Flyio]: platformManagedCredentialsNotSupported,
[AppConnection.GitLab]: platformManagedCredentialsNotSupported,
[AppConnection.Cloudflare]: platformManagedCredentialsNotSupported
};
export const enterpriseAppCheck = async (

View File

@@ -27,7 +27,9 @@ export const APP_CONNECTION_NAME_MAP: Record<AppConnection, string> = {
[AppConnection.OnePass]: "1Password",
[AppConnection.Heroku]: "Heroku",
[AppConnection.Render]: "Render",
[AppConnection.Flyio]: "Fly.io"
[AppConnection.Flyio]: "Fly.io",
[AppConnection.GitLab]: "GitLab",
[AppConnection.Cloudflare]: "Cloudflare"
};
export const APP_CONNECTION_PLAN_MAP: Record<AppConnection, AppConnectionPlanType> = {
@@ -57,5 +59,7 @@ export const APP_CONNECTION_PLAN_MAP: Record<AppConnection, AppConnectionPlanTyp
[AppConnection.MySql]: AppConnectionPlanType.Regular,
[AppConnection.Heroku]: AppConnectionPlanType.Regular,
[AppConnection.Render]: AppConnectionPlanType.Regular,
[AppConnection.Flyio]: AppConnectionPlanType.Regular
[AppConnection.Flyio]: AppConnectionPlanType.Regular,
[AppConnection.GitLab]: AppConnectionPlanType.Regular,
[AppConnection.Cloudflare]: AppConnectionPlanType.Regular
};

View File

@@ -47,6 +47,8 @@ import { azureDevOpsConnectionService } from "./azure-devops/azure-devops-servic
import { ValidateAzureKeyVaultConnectionCredentialsSchema } from "./azure-key-vault";
import { ValidateCamundaConnectionCredentialsSchema } from "./camunda";
import { camundaConnectionService } from "./camunda/camunda-connection-service";
import { ValidateCloudflareConnectionCredentialsSchema } from "./cloudflare/cloudflare-connection-schema";
import { cloudflareConnectionService } from "./cloudflare/cloudflare-connection-service";
import { ValidateDatabricksConnectionCredentialsSchema } from "./databricks";
import { databricksConnectionService } from "./databricks/databricks-connection-service";
import { ValidateFlyioConnectionCredentialsSchema } from "./flyio";
@@ -56,6 +58,8 @@ import { gcpConnectionService } from "./gcp/gcp-connection-service";
import { ValidateGitHubConnectionCredentialsSchema } from "./github";
import { githubConnectionService } from "./github/github-connection-service";
import { ValidateGitHubRadarConnectionCredentialsSchema } from "./github-radar";
import { ValidateGitLabConnectionCredentialsSchema } from "./gitlab";
import { gitlabConnectionService } from "./gitlab/gitlab-connection-service";
import { ValidateHCVaultConnectionCredentialsSchema } from "./hc-vault";
import { hcVaultConnectionService } from "./hc-vault/hc-vault-connection-service";
import { ValidateHerokuConnectionCredentialsSchema } from "./heroku";
@@ -113,7 +117,9 @@ const VALIDATE_APP_CONNECTION_CREDENTIALS_MAP: Record<AppConnection, TValidateAp
[AppConnection.OnePass]: ValidateOnePassConnectionCredentialsSchema,
[AppConnection.Heroku]: ValidateHerokuConnectionCredentialsSchema,
[AppConnection.Render]: ValidateRenderConnectionCredentialsSchema,
[AppConnection.Flyio]: ValidateFlyioConnectionCredentialsSchema
[AppConnection.Flyio]: ValidateFlyioConnectionCredentialsSchema,
[AppConnection.GitLab]: ValidateGitLabConnectionCredentialsSchema,
[AppConnection.Cloudflare]: ValidateCloudflareConnectionCredentialsSchema
};
export const appConnectionServiceFactory = ({
@@ -521,6 +527,8 @@ export const appConnectionServiceFactory = ({
onepass: onePassConnectionService(connectAppConnectionById),
heroku: herokuConnectionService(connectAppConnectionById, appConnectionDAL, kmsService),
render: renderConnectionService(connectAppConnectionById),
flyio: flyioConnectionService(connectAppConnectionById)
flyio: flyioConnectionService(connectAppConnectionById),
gitlab: gitlabConnectionService(connectAppConnectionById, appConnectionDAL, kmsService),
cloudflare: cloudflareConnectionService(connectAppConnectionById)
};
};

View File

@@ -62,6 +62,12 @@ import {
TCamundaConnectionInput,
TValidateCamundaConnectionCredentialsSchema
} from "./camunda";
import {
TCloudflareConnection,
TCloudflareConnectionConfig,
TCloudflareConnectionInput,
TValidateCloudflareConnectionCredentialsSchema
} from "./cloudflare/cloudflare-connection-types";
import {
TDatabricksConnection,
TDatabricksConnectionConfig,
@@ -92,6 +98,12 @@ import {
TGitHubRadarConnectionInput,
TValidateGitHubRadarConnectionCredentialsSchema
} from "./github-radar";
import {
TGitLabConnection,
TGitLabConnectionConfig,
TGitLabConnectionInput,
TValidateGitLabConnectionCredentialsSchema
} from "./gitlab";
import {
THCVaultConnection,
THCVaultConnectionConfig,
@@ -182,6 +194,8 @@ export type TAppConnection = { id: string } & (
| THerokuConnection
| TRenderConnection
| TFlyioConnection
| TGitLabConnection
| TCloudflareConnection
);
export type TAppConnectionRaw = NonNullable<Awaited<ReturnType<TAppConnectionDALFactory["findById"]>>>;
@@ -216,6 +230,8 @@ export type TAppConnectionInput = { id: string } & (
| THerokuConnectionInput
| TRenderConnectionInput
| TFlyioConnectionInput
| TGitLabConnectionInput
| TCloudflareConnectionInput
);
export type TSqlConnectionInput =
@@ -257,7 +273,9 @@ export type TAppConnectionConfig =
| TOnePassConnectionConfig
| THerokuConnectionConfig
| TRenderConnectionConfig
| TFlyioConnectionConfig;
| TFlyioConnectionConfig
| TGitLabConnectionConfig
| TCloudflareConnectionConfig;
export type TValidateAppConnectionCredentialsSchema =
| TValidateAwsConnectionCredentialsSchema
@@ -286,7 +304,9 @@ export type TValidateAppConnectionCredentialsSchema =
| TValidateOnePassConnectionCredentialsSchema
| TValidateHerokuConnectionCredentialsSchema
| TValidateRenderConnectionCredentialsSchema
| TValidateFlyioConnectionCredentialsSchema;
| TValidateFlyioConnectionCredentialsSchema
| TValidateGitLabConnectionCredentialsSchema
| TValidateCloudflareConnectionCredentialsSchema;
export type TListAwsConnectionKmsKeys = {
connectionId: string;

View File

@@ -0,0 +1,3 @@
export enum CloudflareConnectionMethod {
APIToken = "api-token"
}

View File

@@ -0,0 +1,75 @@
import { AxiosError } from "axios";
import { request } from "@app/lib/config/request";
import { BadRequestError } from "@app/lib/errors";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { IntegrationUrls } from "@app/services/integration-auth/integration-list";
import { CloudflareConnectionMethod } from "./cloudflare-connection-enum";
import {
TCloudflareConnection,
TCloudflareConnectionConfig,
TCloudflarePagesProject
} from "./cloudflare-connection-types";
export const getCloudflareConnectionListItem = () => {
return {
name: "Cloudflare" as const,
app: AppConnection.Cloudflare as const,
methods: Object.values(CloudflareConnectionMethod) as [CloudflareConnectionMethod.APIToken]
};
};
export const listCloudflarePagesProjects = async (
appConnection: TCloudflareConnection
): Promise<TCloudflarePagesProject[]> => {
const {
credentials: { apiToken, accountId }
} = appConnection;
const { data } = await request.get<{ result: { name: string; id: string }[] }>(
`${IntegrationUrls.CLOUDFLARE_API_URL}/client/v4/accounts/${accountId}/pages/projects`,
{
headers: {
Authorization: `Bearer ${apiToken}`,
Accept: "application/json"
}
}
);
return data.result.map((a) => ({
name: a.name,
id: a.id
}));
};
export const validateCloudflareConnectionCredentials = async (config: TCloudflareConnectionConfig) => {
const { apiToken, accountId } = config.credentials;
try {
const resp = await request.get(`${IntegrationUrls.CLOUDFLARE_API_URL}/client/v4/accounts/${accountId}`, {
headers: {
Authorization: `Bearer ${apiToken}`,
Accept: "application/json"
}
});
if (resp.data === null) {
throw new BadRequestError({
message: "Unable to validate connection: Invalid API token provided."
});
}
} catch (error: unknown) {
if (error instanceof AxiosError) {
throw new BadRequestError({
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
message: `Failed to validate credentials: ${error.response?.data?.errors?.[0]?.message || error.message || "Unknown error"}`
});
}
throw new BadRequestError({
message: "Unable to validate connection: verify credentials"
});
}
return config.credentials;
};
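
A sketch of exercising the validator directly, e.g. from a colocated test; the account ID and org ID are placeholders:

import { AppConnection } from "@app/services/app-connection/app-connection-enums";

import { CloudflareConnectionMethod } from "./cloudflare-connection-enum";
import { validateCloudflareConnectionCredentials } from "./cloudflare-connection-fns";

// Resolves with the credentials unchanged on success; throws BadRequestError when the
// token cannot read the given account.
const checkCloudflareCredentials = async (accountId: string, apiToken: string) =>
  validateCloudflareConnectionCredentials({
    app: AppConnection.Cloudflare,
    method: CloudflareConnectionMethod.APIToken,
    orgId: "00000000-0000-0000-0000-000000000000", // placeholder; required by the config type but unused by the validator
    credentials: { accountId, apiToken }
  });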

View File

@@ -0,0 +1,74 @@
import z from "zod";
import { AppConnections } from "@app/lib/api-docs";
import { CharacterType, characterValidator } from "@app/lib/validator/validate-string";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
BaseAppConnectionSchema,
GenericCreateAppConnectionFieldsSchema,
GenericUpdateAppConnectionFieldsSchema
} from "@app/services/app-connection/app-connection-schemas";
import { CloudflareConnectionMethod } from "./cloudflare-connection-enum";
const accountIdCharacterValidator = characterValidator([
CharacterType.AlphaNumeric,
CharacterType.Underscore,
CharacterType.Hyphen
]);
export const CloudflareConnectionApiTokenCredentialsSchema = z.object({
accountId: z
.string()
.trim()
.min(1, "Account ID required")
.max(256, "Account ID cannot exceed 256 characters")
.refine(
(val) => accountIdCharacterValidator(val),
"Account ID can only contain alphanumeric characters, underscores, and hyphens"
),
apiToken: z.string().trim().min(1, "API token required").max(256, "API token cannot exceed 256 characters")
});
const BaseCloudflareConnectionSchema = BaseAppConnectionSchema.extend({ app: z.literal(AppConnection.Cloudflare) });
export const CloudflareConnectionSchema = BaseCloudflareConnectionSchema.extend({
method: z.literal(CloudflareConnectionMethod.APIToken),
credentials: CloudflareConnectionApiTokenCredentialsSchema
});
export const SanitizedCloudflareConnectionSchema = z.discriminatedUnion("method", [
BaseCloudflareConnectionSchema.extend({
method: z.literal(CloudflareConnectionMethod.APIToken),
credentials: CloudflareConnectionApiTokenCredentialsSchema.pick({ accountId: true })
})
]);
export const ValidateCloudflareConnectionCredentialsSchema = z.discriminatedUnion("method", [
z.object({
method: z
.literal(CloudflareConnectionMethod.APIToken)
.describe(AppConnections.CREATE(AppConnection.Cloudflare).method),
credentials: CloudflareConnectionApiTokenCredentialsSchema.describe(
AppConnections.CREATE(AppConnection.Cloudflare).credentials
)
})
]);
export const CreateCloudflareConnectionSchema = ValidateCloudflareConnectionCredentialsSchema.and(
GenericCreateAppConnectionFieldsSchema(AppConnection.Cloudflare)
);
export const UpdateCloudflareConnectionSchema = z
.object({
credentials: CloudflareConnectionApiTokenCredentialsSchema.optional().describe(
AppConnections.UPDATE(AppConnection.Cloudflare).credentials
)
})
.and(GenericUpdateAppConnectionFieldsSchema(AppConnection.Cloudflare));
export const CloudflareConnectionListItemSchema = z.object({
name: z.literal("Cloudflare"),
app: z.literal(AppConnection.Cloudflare),
methods: z.nativeEnum(CloudflareConnectionMethod).array()
});

View File

@@ -0,0 +1,30 @@
import { logger } from "@app/lib/logger";
import { OrgServiceActor } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import { listCloudflarePagesProjects } from "./cloudflare-connection-fns";
import { TCloudflareConnection } from "./cloudflare-connection-types";
type TGetAppConnectionFunc = (
app: AppConnection,
connectionId: string,
actor: OrgServiceActor
) => Promise<TCloudflareConnection>;
export const cloudflareConnectionService = (getAppConnection: TGetAppConnectionFunc) => {
const listPagesProjects = async (connectionId: string, actor: OrgServiceActor) => {
const appConnection = await getAppConnection(AppConnection.Cloudflare, connectionId, actor);
try {
const projects = await listCloudflarePagesProjects(appConnection);
return projects;
} catch (error) {
logger.error(error, "Failed to list Cloudflare Pages projects for Cloudflare connection");
return [];
}
};
return {
listPagesProjects
};
};

View File

@@ -0,0 +1,30 @@
import z from "zod";
import { DiscriminativePick } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import {
CloudflareConnectionSchema,
CreateCloudflareConnectionSchema,
ValidateCloudflareConnectionCredentialsSchema
} from "./cloudflare-connection-schema";
export type TCloudflareConnection = z.infer<typeof CloudflareConnectionSchema>;
export type TCloudflareConnectionInput = z.infer<typeof CreateCloudflareConnectionSchema> & {
app: AppConnection.Cloudflare;
};
export type TValidateCloudflareConnectionCredentialsSchema = typeof ValidateCloudflareConnectionCredentialsSchema;
export type TCloudflareConnectionConfig = DiscriminativePick<
TCloudflareConnectionInput,
"method" | "app" | "credentials"
> & {
orgId: string;
};
export type TCloudflarePagesProject = {
id: string;
name: string;
};

View File

@@ -1,3 +1,4 @@
import { logger } from "@app/lib/logger";
import { OrgServiceActor } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
@@ -19,6 +20,7 @@ export const gcpConnectionService = (getAppConnection: TGetAppConnectionFunc) =>
return projects;
} catch (error) {
logger.error(error, "Error listing GCP secret manager projects");
return [];
}
};

View File

@@ -0,0 +1,9 @@
export enum GitLabConnectionMethod {
OAuth = "oauth",
AccessToken = "access-token"
}
export enum GitLabAccessTokenType {
Project = "project",
Personal = "personal"
}

View File

@@ -0,0 +1,351 @@
/* eslint-disable no-await-in-loop */
import { GitbeakerRequestError, Gitlab } from "@gitbeaker/rest";
import { AxiosError } from "axios";
import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { BadRequestError, InternalServerError } from "@app/lib/errors";
import { removeTrailingSlash } from "@app/lib/fn";
import { logger } from "@app/lib/logger";
import { blockLocalAndPrivateIpAddresses } from "@app/lib/validator";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { encryptAppConnectionCredentials } from "@app/services/app-connection/app-connection-fns";
import { IntegrationUrls } from "@app/services/integration-auth/integration-list";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TAppConnectionDALFactory } from "../app-connection-dal";
import { GitLabAccessTokenType, GitLabConnectionMethod } from "./gitlab-connection-enums";
import { TGitLabConnection, TGitLabConnectionConfig, TGitLabGroup, TGitLabProject } from "./gitlab-connection-types";
interface GitLabOAuthTokenResponse {
access_token: string;
token_type: string;
expires_in: number;
refresh_token: string;
created_at: number;
scope?: string;
}
export const getGitLabConnectionListItem = () => {
const { INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID } = getConfig();
return {
name: "GitLab" as const,
app: AppConnection.GitLab as const,
methods: Object.values(GitLabConnectionMethod) as [
GitLabConnectionMethod.AccessToken,
GitLabConnectionMethod.OAuth
],
oauthClientId: INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID
};
};
export const getGitLabInstanceUrl = async (instanceUrl?: string) => {
const gitLabInstanceUrl = instanceUrl ? removeTrailingSlash(instanceUrl) : IntegrationUrls.GITLAB_URL;
await blockLocalAndPrivateIpAddresses(gitLabInstanceUrl);
return gitLabInstanceUrl;
};
export const getGitLabClient = async (accessToken: string, instanceUrl?: string, isOAuth = false) => {
const host = await getGitLabInstanceUrl(instanceUrl);
const client = new Gitlab<true>({
host,
...(isOAuth ? { oauthToken: accessToken } : { token: accessToken }),
camelize: true
});
return client;
};
export const refreshGitLabToken = async (
refreshToken: string,
appId: string,
orgId: string,
appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">,
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">,
instanceUrl?: string
): Promise<string> => {
const { INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID, INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET, SITE_URL } =
getConfig();
if (!INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET || !INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID || !SITE_URL) {
throw new InternalServerError({
message: `GitLab environment variables have not been configured`
});
}
const payload = new URLSearchParams({
grant_type: "refresh_token",
refresh_token: refreshToken,
client_id: INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID,
client_secret: INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET,
redirect_uri: `${SITE_URL}/organization/app-connections/gitlab/oauth/callback`
});
try {
const url = await getGitLabInstanceUrl(instanceUrl);
const { data } = await request.post<GitLabOAuthTokenResponse>(`${url}/oauth/token`, payload.toString(), {
headers: {
"Content-Type": "application/x-www-form-urlencoded",
Accept: "application/json"
}
});
const expiresAt = new Date(Date.now() + data.expires_in * 1000 - 600000);
const encryptedCredentials = await encryptAppConnectionCredentials({
credentials: {
instanceUrl,
tokenType: data.token_type,
createdAt: new Date(data.created_at * 1000).toISOString(),
refreshToken: data.refresh_token,
accessToken: data.access_token,
expiresAt
},
orgId,
kmsService
});
await appConnectionDAL.updateById(appId, { encryptedCredentials });
return data.access_token;
} catch (error: unknown) {
if (error instanceof AxiosError) {
throw new BadRequestError({
message: `Failed to refresh GitLab token: ${error.message}`
});
}
throw new BadRequestError({
message: "Unable to refresh GitLab token"
});
}
};
export const exchangeGitLabOAuthCode = async (
code: string,
instanceUrl?: string
): Promise<GitLabOAuthTokenResponse> => {
const { INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID, INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET, SITE_URL } =
getConfig();
if (!INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET || !INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID || !SITE_URL) {
throw new InternalServerError({
message: `GitLab environment variables have not been configured`
});
}
try {
const payload = new URLSearchParams({
grant_type: "authorization_code",
code,
client_id: INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_ID,
client_secret: INF_APP_CONNECTION_GITLAB_OAUTH_CLIENT_SECRET,
redirect_uri: `${SITE_URL}/organization/app-connections/gitlab/oauth/callback`
});
const url = await getGitLabInstanceUrl(instanceUrl);
const response = await request.post<GitLabOAuthTokenResponse>(`${url}/oauth/token`, payload.toString(), {
headers: {
"Content-Type": "application/x-www-form-urlencoded",
Accept: "application/json"
}
});
if (!response.data) {
throw new InternalServerError({
message: "Failed to exchange OAuth code: Empty response"
});
}
return response.data;
} catch (error: unknown) {
if (error instanceof AxiosError) {
throw new BadRequestError({
message: `Failed to exchange OAuth code: ${error.message}`
});
}
throw new BadRequestError({
message: "Unable to exchange OAuth code"
});
}
};
export const validateGitLabConnectionCredentials = async (config: TGitLabConnectionConfig) => {
const { credentials: inputCredentials, method } = config;
let accessToken: string;
let oauthData: GitLabOAuthTokenResponse | null = null;
if (method === GitLabConnectionMethod.OAuth && "code" in inputCredentials) {
oauthData = await exchangeGitLabOAuthCode(inputCredentials.code, inputCredentials.instanceUrl);
accessToken = oauthData.access_token;
} else if (method === GitLabConnectionMethod.AccessToken && "accessToken" in inputCredentials) {
accessToken = inputCredentials.accessToken;
} else {
throw new BadRequestError({
message: "Invalid credentials for the selected connection method"
});
}
try {
const client = await getGitLabClient(
accessToken,
inputCredentials.instanceUrl,
method === GitLabConnectionMethod.OAuth
);
await client.Users.showCurrentUser();
} catch (error: unknown) {
logger.error(error, "Error validating GitLab connection credentials");
if (error instanceof GitbeakerRequestError) {
throw new BadRequestError({
message: `Failed to validate credentials: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
});
}
throw new BadRequestError({
message: `Failed to validate credentials: ${(error as Error)?.message || "verify credentials"}`
});
}
if (method === GitLabConnectionMethod.OAuth && oauthData) {
return {
accessToken,
instanceUrl: inputCredentials.instanceUrl,
refreshToken: oauthData.refresh_token,
expiresAt: new Date(Date.now() + oauthData.expires_in * 1000 - 60000),
tokenType: oauthData.token_type,
createdAt: new Date(oauthData.created_at * 1000)
};
}
return inputCredentials;
};
export const listGitLabProjects = async ({
appConnection,
appConnectionDAL,
kmsService
}: {
appConnection: TGitLabConnection;
appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
}): Promise<TGitLabProject[]> => {
let { accessToken } = appConnection.credentials;
if (
appConnection.method === GitLabConnectionMethod.OAuth &&
appConnection.credentials.refreshToken &&
new Date(appConnection.credentials.expiresAt) < new Date()
) {
accessToken = await refreshGitLabToken(
appConnection.credentials.refreshToken,
appConnection.id,
appConnection.orgId,
appConnectionDAL,
kmsService,
appConnection.credentials.instanceUrl
);
}
try {
const client = await getGitLabClient(
accessToken,
appConnection.credentials.instanceUrl,
appConnection.method === GitLabConnectionMethod.OAuth
);
const projects = await client.Projects.all({
archived: false,
includePendingDelete: false,
membership: true,
includeHidden: false,
imported: false
});
return projects.map((project) => ({
name: project.pathWithNamespace,
id: project.id.toString()
}));
} catch (error: unknown) {
if (error instanceof GitbeakerRequestError) {
throw new BadRequestError({
message: `Failed to fetch GitLab projects: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
});
}
if (error instanceof InternalServerError) {
throw error;
}
throw new InternalServerError({
message: "Unable to fetch GitLab projects"
});
}
};
export const listGitLabGroups = async ({
appConnection,
appConnectionDAL,
kmsService
}: {
appConnection: TGitLabConnection;
appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
}): Promise<TGitLabGroup[]> => {
let { accessToken } = appConnection.credentials;
if (
appConnection.method === GitLabConnectionMethod.AccessToken &&
appConnection.credentials.accessTokenType === GitLabAccessTokenType.Project
) {
return [];
}
if (
appConnection.method === GitLabConnectionMethod.OAuth &&
appConnection.credentials.refreshToken &&
new Date(appConnection.credentials.expiresAt) < new Date()
) {
accessToken = await refreshGitLabToken(
appConnection.credentials.refreshToken,
appConnection.id,
appConnection.orgId,
appConnectionDAL,
kmsService,
appConnection.credentials.instanceUrl
);
}
try {
const client = await getGitLabClient(
accessToken,
appConnection.credentials.instanceUrl,
appConnection.method === GitLabConnectionMethod.OAuth
);
const groups = await client.Groups.all({
orderBy: "name",
sort: "asc",
minAccessLevel: 50
});
return groups.map((group) => ({
id: group.id.toString(),
name: group.name
}));
} catch (error: unknown) {
if (error instanceof GitbeakerRequestError) {
throw new BadRequestError({
message: `Failed to fetch GitLab groups: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
});
}
if (error instanceof InternalServerError) {
throw error;
}
throw new InternalServerError({
message: "Unable to fetch GitLab groups"
});
}
};
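listGitLabProjects and listGitLabGroups repeat the same token-refresh check before building a client. A hedged sketch of that shared pattern, pulled out as a hypothetical helper (the helper name is illustrative; the import paths mirror those used elsewhere in this diff):

import { TKmsServiceFactory } from "@app/services/kms/kms-service";

import { TAppConnectionDALFactory } from "../app-connection-dal";
import { GitLabConnectionMethod } from "./gitlab-connection-enums";
import { getGitLabClient, refreshGitLabToken } from "./gitlab-connection-fns";
import { TGitLabConnection } from "./gitlab-connection-types";

// Hypothetical helper: resolve a usable access token, refreshing OAuth tokens that are
// past their buffered expiry, then build a Gitbeaker client for the connection.
export const getClientForConnection = async (
  appConnection: TGitLabConnection,
  appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">,
  kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">
) => {
  let { accessToken } = appConnection.credentials;
  if (
    appConnection.method === GitLabConnectionMethod.OAuth &&
    appConnection.credentials.refreshToken &&
    new Date(appConnection.credentials.expiresAt) < new Date()
  ) {
    accessToken = await refreshGitLabToken(
      appConnection.credentials.refreshToken,
      appConnection.id,
      appConnection.orgId,
      appConnectionDAL,
      kmsService,
      appConnection.credentials.instanceUrl
    );
  }
  return getGitLabClient(
    accessToken,
    appConnection.credentials.instanceUrl,
    appConnection.method === GitLabConnectionMethod.OAuth
  );
};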

View File

@@ -0,0 +1,138 @@
import z from "zod";
import { AppConnections } from "@app/lib/api-docs";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
BaseAppConnectionSchema,
GenericCreateAppConnectionFieldsSchema,
GenericUpdateAppConnectionFieldsSchema
} from "@app/services/app-connection/app-connection-schemas";
import { GitLabAccessTokenType, GitLabConnectionMethod } from "./gitlab-connection-enums";
export const GitLabConnectionAccessTokenCredentialsSchema = z.object({
accessToken: z
.string()
.trim()
.min(1, "Access Token required")
.describe(AppConnections.CREDENTIALS.GITLAB.accessToken),
instanceUrl: z
.string()
.trim()
.url("Invalid Instance URL")
.optional()
.describe(AppConnections.CREDENTIALS.GITLAB.instanceUrl),
accessTokenType: z.nativeEnum(GitLabAccessTokenType).describe(AppConnections.CREDENTIALS.GITLAB.accessTokenType)
});
export const GitLabConnectionOAuthCredentialsSchema = z.object({
code: z.string().trim().min(1, "OAuth code required").describe(AppConnections.CREDENTIALS.GITLAB.code),
instanceUrl: z
.string()
.trim()
.url("Invalid Instance URL")
.optional()
.describe(AppConnections.CREDENTIALS.GITLAB.instanceUrl)
});
export const GitLabConnectionOAuthOutputCredentialsSchema = z.object({
accessToken: z.string().trim(),
refreshToken: z.string().trim(),
expiresAt: z.date(),
tokenType: z.string().optional().default("bearer"),
createdAt: z.string().optional(),
instanceUrl: z
.string()
.trim()
.url("Invalid Instance URL")
.optional()
.describe(AppConnections.CREDENTIALS.GITLAB.instanceUrl)
});
export const GitLabConnectionRefreshTokenCredentialsSchema = z.object({
refreshToken: z.string().trim().min(1, "Refresh token required"),
instanceUrl: z
.string()
.trim()
.url("Invalid Instance URL")
.optional()
.describe(AppConnections.CREDENTIALS.GITLAB.instanceUrl)
});
const BaseGitLabConnectionSchema = BaseAppConnectionSchema.extend({
app: z.literal(AppConnection.GitLab)
});
export const GitLabConnectionSchema = z.intersection(
BaseGitLabConnectionSchema,
z.discriminatedUnion("method", [
z.object({
method: z.literal(GitLabConnectionMethod.AccessToken),
credentials: GitLabConnectionAccessTokenCredentialsSchema
}),
z.object({
method: z.literal(GitLabConnectionMethod.OAuth),
credentials: GitLabConnectionOAuthOutputCredentialsSchema
})
])
);
export const SanitizedGitLabConnectionSchema = z.discriminatedUnion("method", [
BaseGitLabConnectionSchema.extend({
method: z.literal(GitLabConnectionMethod.AccessToken),
credentials: GitLabConnectionAccessTokenCredentialsSchema.pick({
instanceUrl: true,
accessTokenType: true
})
}),
BaseGitLabConnectionSchema.extend({
method: z.literal(GitLabConnectionMethod.OAuth),
credentials: GitLabConnectionOAuthOutputCredentialsSchema.pick({
instanceUrl: true
})
})
]);
export const ValidateGitLabConnectionCredentialsSchema = z.discriminatedUnion("method", [
z.object({
method: z.literal(GitLabConnectionMethod.AccessToken).describe(AppConnections.CREATE(AppConnection.GitLab).method),
credentials: GitLabConnectionAccessTokenCredentialsSchema.describe(
AppConnections.CREATE(AppConnection.GitLab).credentials
)
}),
z.object({
method: z.literal(GitLabConnectionMethod.OAuth).describe(AppConnections.CREATE(AppConnection.GitLab).method),
credentials: z
.union([
GitLabConnectionOAuthCredentialsSchema,
GitLabConnectionRefreshTokenCredentialsSchema,
GitLabConnectionOAuthOutputCredentialsSchema
])
.describe(AppConnections.CREATE(AppConnection.GitLab).credentials)
})
]);
export const CreateGitLabConnectionSchema = ValidateGitLabConnectionCredentialsSchema.and(
GenericCreateAppConnectionFieldsSchema(AppConnection.GitLab)
);
export const UpdateGitLabConnectionSchema = z
.object({
credentials: z
.union([
GitLabConnectionAccessTokenCredentialsSchema,
GitLabConnectionOAuthOutputCredentialsSchema,
GitLabConnectionRefreshTokenCredentialsSchema,
GitLabConnectionOAuthCredentialsSchema
])
.optional()
.describe(AppConnections.UPDATE(AppConnection.GitLab).credentials)
})
.and(GenericUpdateAppConnectionFieldsSchema(AppConnection.GitLab));
export const GitLabConnectionListItemSchema = z.object({
name: z.literal("GitLab"),
app: z.literal(AppConnection.GitLab),
methods: z.nativeEnum(GitLabConnectionMethod).array(),
oauthClientId: z.string().optional()
});
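A hedged example of exercising the validation schema with safeParse; payload values are illustrative only. The access-token variant requires accessTokenType, while the OAuth variant accepts the authorization-code shape (as well as the refresh-token and output shapes from the union above).

import { GitLabAccessTokenType, GitLabConnectionMethod } from "./gitlab-connection-enums";
import { ValidateGitLabConnectionCredentialsSchema } from "./gitlab-connection-schemas";

// Illustrative payloads only.
const accessTokenResult = ValidateGitLabConnectionCredentialsSchema.safeParse({
  method: GitLabConnectionMethod.AccessToken,
  credentials: {
    accessToken: "glpat-example",
    accessTokenType: GitLabAccessTokenType.Project,
    instanceUrl: "https://gitlab.example.com"
  }
});

const oauthResult = ValidateGitLabConnectionCredentialsSchema.safeParse({
  method: GitLabConnectionMethod.OAuth,
  credentials: { code: "oauth-code-from-callback" }
});

// Both succeed; mixing a method with the wrong credentials shape fails validation.
console.log(accessTokenResult.success, oauthResult.success);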

View File

@@ -0,0 +1,47 @@
import { logger } from "@app/lib/logger";
import { OrgServiceActor } from "@app/lib/types";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TAppConnectionDALFactory } from "../app-connection-dal";
import { AppConnection } from "../app-connection-enums";
import { listGitLabGroups, listGitLabProjects } from "./gitlab-connection-fns";
import { TGitLabConnection } from "./gitlab-connection-types";
type TGetAppConnectionFunc = (
app: AppConnection,
connectionId: string,
actor: OrgServiceActor
) => Promise<TGitLabConnection>;
export const gitlabConnectionService = (
getAppConnection: TGetAppConnectionFunc,
appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">,
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">
) => {
const listProjects = async (connectionId: string, actor: OrgServiceActor) => {
try {
const appConnection = await getAppConnection(AppConnection.GitLab, connectionId, actor);
const projects = await listGitLabProjects({ appConnection, appConnectionDAL, kmsService });
return projects;
} catch (error) {
logger.error(error, `Failed to establish connection with GitLab for app ${connectionId}`);
return [];
}
};
const listGroups = async (connectionId: string, actor: OrgServiceActor) => {
try {
const appConnection = await getAppConnection(AppConnection.GitLab, connectionId, actor);
const groups = await listGitLabGroups({ appConnection, appConnectionDAL, kmsService });
return groups;
} catch (error) {
logger.error(error, `Failed to establish connection with GitLab for app ${connectionId}`);
return [];
}
};
return {
listProjects,
listGroups
};
};

View File

@@ -0,0 +1,56 @@
import z from "zod";
import { DiscriminativePick } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import {
CreateGitLabConnectionSchema,
GitLabConnectionSchema,
ValidateGitLabConnectionCredentialsSchema
} from "./gitlab-connection-schemas";
export type TGitLabConnection = z.infer<typeof GitLabConnectionSchema>;
export type TGitLabConnectionInput = z.infer<typeof CreateGitLabConnectionSchema> & {
app: AppConnection.GitLab;
};
export type TValidateGitLabConnectionCredentialsSchema = typeof ValidateGitLabConnectionCredentialsSchema;
export type TGitLabConnectionConfig = DiscriminativePick<TGitLabConnectionInput, "method" | "app" | "credentials"> & {
orgId: string;
};
export type TGitLabProject = {
name: string;
id: string;
};
export type TGitLabAccessTokenCredentials = {
accessToken: string;
instanceUrl: string;
};
export type TGitLabOAuthCredentials = {
accessToken: string;
refreshToken: string;
expiresAt: Date;
tokenType?: string;
createdAt?: Date;
instanceUrl: string;
};
export type TGitLabOAuthCodeCredentials = {
code: string;
instanceUrl: string;
};
export type TGitLabRefreshTokenCredentials = {
refreshToken: string;
instanceUrl: string;
};
export interface TGitLabGroup {
id: string;
name: string;
}

View File

@@ -0,0 +1,4 @@
export * from "./gitlab-connection-enums";
export * from "./gitlab-connection-fns";
export * from "./gitlab-connection-schemas";
export * from "./gitlab-connection-types";

View File

@@ -17,70 +17,11 @@ export const identityAccessTokenDALFactory = (db: TDbClient) => {
const doc = await (tx || db.replicaNode())(TableName.IdentityAccessToken)
.where(filter)
.join(TableName.Identity, `${TableName.Identity}.id`, `${TableName.IdentityAccessToken}.identityId`)
.leftJoin(
TableName.IdentityUaClientSecret,
`${TableName.IdentityAccessToken}.identityUAClientSecretId`,
`${TableName.IdentityUaClientSecret}.id`
)
.leftJoin(
TableName.IdentityUniversalAuth,
`${TableName.IdentityUaClientSecret}.identityUAId`,
`${TableName.IdentityUniversalAuth}.id`
)
.leftJoin(TableName.IdentityGcpAuth, `${TableName.Identity}.id`, `${TableName.IdentityGcpAuth}.identityId`)
.leftJoin(
TableName.IdentityAliCloudAuth,
`${TableName.Identity}.id`,
`${TableName.IdentityAliCloudAuth}.identityId`
)
.leftJoin(TableName.IdentityAwsAuth, `${TableName.Identity}.id`, `${TableName.IdentityAwsAuth}.identityId`)
.leftJoin(TableName.IdentityAzureAuth, `${TableName.Identity}.id`, `${TableName.IdentityAzureAuth}.identityId`)
.leftJoin(TableName.IdentityLdapAuth, `${TableName.Identity}.id`, `${TableName.IdentityLdapAuth}.identityId`)
.leftJoin(
TableName.IdentityKubernetesAuth,
`${TableName.Identity}.id`,
`${TableName.IdentityKubernetesAuth}.identityId`
)
.leftJoin(TableName.IdentityOciAuth, `${TableName.Identity}.id`, `${TableName.IdentityOciAuth}.identityId`)
.leftJoin(TableName.IdentityOidcAuth, `${TableName.Identity}.id`, `${TableName.IdentityOidcAuth}.identityId`)
.leftJoin(TableName.IdentityTokenAuth, `${TableName.Identity}.id`, `${TableName.IdentityTokenAuth}.identityId`)
.leftJoin(TableName.IdentityJwtAuth, `${TableName.Identity}.id`, `${TableName.IdentityJwtAuth}.identityId`)
.select(selectAllTableCols(TableName.IdentityAccessToken))
.select(
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityUniversalAuth).as("accessTokenTrustedIpsUa"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityGcpAuth).as("accessTokenTrustedIpsGcp"),
db
.ref("accessTokenTrustedIps")
.withSchema(TableName.IdentityAliCloudAuth)
.as("accessTokenTrustedIpsAliCloud"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityAwsAuth).as("accessTokenTrustedIpsAws"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityAzureAuth).as("accessTokenTrustedIpsAzure"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityKubernetesAuth).as("accessTokenTrustedIpsK8s"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityOciAuth).as("accessTokenTrustedIpsOci"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityOidcAuth).as("accessTokenTrustedIpsOidc"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityTokenAuth).as("accessTokenTrustedIpsToken"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityJwtAuth).as("accessTokenTrustedIpsJwt"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityLdapAuth).as("accessTokenTrustedIpsLdap"),
db.ref("name").withSchema(TableName.Identity)
)
.select(db.ref("name").withSchema(TableName.Identity))
.first();
if (!doc) return;
return {
...doc,
trustedIpsUniversalAuth: doc.accessTokenTrustedIpsUa,
trustedIpsGcpAuth: doc.accessTokenTrustedIpsGcp,
trustedIpsAliCloudAuth: doc.accessTokenTrustedIpsAliCloud,
trustedIpsAwsAuth: doc.accessTokenTrustedIpsAws,
trustedIpsAzureAuth: doc.accessTokenTrustedIpsAzure,
trustedIpsKubernetesAuth: doc.accessTokenTrustedIpsK8s,
trustedIpsOciAuth: doc.accessTokenTrustedIpsOci,
trustedIpsOidcAuth: doc.accessTokenTrustedIpsOidc,
trustedIpsAccessTokenAuth: doc.accessTokenTrustedIpsToken,
trustedIpsAccessJwtAuth: doc.accessTokenTrustedIpsJwt,
trustedIpsAccessLdapAuth: doc.accessTokenTrustedIpsLdap
};
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "IdAccessTokenFindOne" });
}

View File

@@ -7,12 +7,14 @@ import { checkIPAgainstBlocklist, TIp } from "@app/lib/ip";
import { TAccessTokenQueueServiceFactory } from "../access-token-queue/access-token-queue";
import { AuthTokenType } from "../auth/auth-type";
import { TIdentityDALFactory } from "../identity/identity-dal";
import { TIdentityOrgDALFactory } from "../identity/identity-org-dal";
import { TIdentityAccessTokenDALFactory } from "./identity-access-token-dal";
import { TIdentityAccessTokenJwtPayload, TRenewAccessTokenDTO } from "./identity-access-token-types";
type TIdentityAccessTokenServiceFactoryDep = {
identityAccessTokenDAL: TIdentityAccessTokenDALFactory;
identityDAL: Pick<TIdentityDALFactory, "getTrustedIpsByAuthMethod">;
identityOrgMembershipDAL: TIdentityOrgDALFactory;
accessTokenQueue: Pick<
TAccessTokenQueueServiceFactory,
@@ -25,7 +27,8 @@ export type TIdentityAccessTokenServiceFactory = ReturnType<typeof identityAcces
export const identityAccessTokenServiceFactory = ({
identityAccessTokenDAL,
identityOrgMembershipDAL,
accessTokenQueue
accessTokenQueue,
identityDAL
}: TIdentityAccessTokenServiceFactoryDep) => {
const validateAccessTokenExp = async (identityAccessToken: TIdentityAccessTokens) => {
const {
@@ -190,23 +193,11 @@ export const identityAccessTokenServiceFactory = ({
message: "Failed to authorize revoked access token, access token is revoked"
});
const trustedIpsMap: Record<IdentityAuthMethod, unknown> = {
[IdentityAuthMethod.UNIVERSAL_AUTH]: identityAccessToken.trustedIpsUniversalAuth,
[IdentityAuthMethod.GCP_AUTH]: identityAccessToken.trustedIpsGcpAuth,
[IdentityAuthMethod.ALICLOUD_AUTH]: identityAccessToken.trustedIpsAliCloudAuth,
[IdentityAuthMethod.AWS_AUTH]: identityAccessToken.trustedIpsAwsAuth,
[IdentityAuthMethod.OCI_AUTH]: identityAccessToken.trustedIpsOciAuth,
[IdentityAuthMethod.AZURE_AUTH]: identityAccessToken.trustedIpsAzureAuth,
[IdentityAuthMethod.KUBERNETES_AUTH]: identityAccessToken.trustedIpsKubernetesAuth,
[IdentityAuthMethod.OIDC_AUTH]: identityAccessToken.trustedIpsOidcAuth,
[IdentityAuthMethod.TOKEN_AUTH]: identityAccessToken.trustedIpsAccessTokenAuth,
[IdentityAuthMethod.JWT_AUTH]: identityAccessToken.trustedIpsAccessJwtAuth,
[IdentityAuthMethod.LDAP_AUTH]: identityAccessToken.trustedIpsAccessLdapAuth
};
const trustedIps = trustedIpsMap[identityAccessToken.authMethod as IdentityAuthMethod];
if (ipAddress) {
const trustedIps = await identityDAL.getTrustedIpsByAuthMethod(
identityAccessToken.identityId,
identityAccessToken.authMethod as IdentityAuthMethod
);
if (ipAddress && trustedIps) {
checkIPAgainstBlocklist({
ipAddress,
trustedIps: trustedIps as TIp[]
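The net effect of this change, sketched in isolation (helper name is illustrative; import paths mirror those visible in this diff): instead of a dozen left joins on every access-token lookup, trusted IPs are fetched with a single targeted query for the token's auth method, and only when an IP address is supplied.

import { IdentityAuthMethod } from "@app/db/schemas";
import { checkIPAgainstBlocklist, TIp } from "@app/lib/ip";

import { TIdentityDALFactory } from "../identity/identity-dal";

// Hypothetical standalone version of the new trusted-IP check.
export const assertIpTrusted = async (
  identityDAL: Pick<TIdentityDALFactory, "getTrustedIpsByAuthMethod">,
  identityId: string,
  authMethod: IdentityAuthMethod,
  ipAddress?: string
) => {
  if (!ipAddress) return;
  const trustedIps = await identityDAL.getTrustedIpsByAuthMethod(identityId, authMethod);
  if (!trustedIps) return; // auth method has no row or is not mapped to a table
  checkIPAgainstBlocklist({ ipAddress, trustedIps: trustedIps as TIp[] });
};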

View File

@@ -0,0 +1,10 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify, TOrmify } from "@app/lib/knex";
export type TIdentityTlsCertAuthDALFactory = TOrmify<TableName.IdentityTlsCertAuth>;
export const identityTlsCertAuthDALFactory = (db: TDbClient) => {
const orm = ormify(db, TableName.IdentityTlsCertAuth);
return orm;
};

View File

@@ -0,0 +1,423 @@
import crypto from "node:crypto";
import { ForbiddenError } from "@casl/ability";
import jwt from "jsonwebtoken";
import { IdentityAuthMethod } from "@app/db/schemas";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { OrgPermissionIdentityActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import {
constructPermissionErrorMessage,
validatePrivilegeChangeOperation
} from "@app/ee/services/permission/permission-fns";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service-types";
import { extractX509CertFromChain } from "@app/lib/certificates/extract-certificate";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError, NotFoundError, PermissionBoundaryError, UnauthorizedError } from "@app/lib/errors";
import { extractIPDetails, isValidIpOrCidr } from "@app/lib/ip";
import { ActorType, AuthTokenType } from "../auth/auth-type";
import { TIdentityOrgDALFactory } from "../identity/identity-org-dal";
import { TIdentityAccessTokenDALFactory } from "../identity-access-token/identity-access-token-dal";
import { TIdentityAccessTokenJwtPayload } from "../identity-access-token/identity-access-token-types";
import { TKmsServiceFactory } from "../kms/kms-service";
import { KmsDataKey } from "../kms/kms-types";
import { validateIdentityUpdateForSuperAdminPrivileges } from "../super-admin/super-admin-fns";
import { TIdentityTlsCertAuthDALFactory } from "./identity-tls-cert-auth-dal";
import { TIdentityTlsCertAuthServiceFactory } from "./identity-tls-cert-auth-types";
type TIdentityTlsCertAuthServiceFactoryDep = {
identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create" | "delete">;
identityTlsCertAuthDAL: Pick<
TIdentityTlsCertAuthDALFactory,
"findOne" | "transaction" | "create" | "updateById" | "delete"
>;
identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
};
const parseSubjectDetails = (data: string) => {
const values: Record<string, string> = {};
data.split("\n").forEach((el) => {
// tolerate values containing "=" and skip malformed subject lines
const [key, ...rest] = el.split("=");
if (!key || !rest.length) return;
values[key.trim()] = rest.join("=").trim();
});
return values;
};
export const identityTlsCertAuthServiceFactory = ({
identityAccessTokenDAL,
identityTlsCertAuthDAL,
identityOrgMembershipDAL,
licenseService,
permissionService,
kmsService
}: TIdentityTlsCertAuthServiceFactoryDep): TIdentityTlsCertAuthServiceFactory => {
const login: TIdentityTlsCertAuthServiceFactory["login"] = async ({ identityId, clientCertificate }) => {
const identityTlsCertAuth = await identityTlsCertAuthDAL.findOne({ identityId });
if (!identityTlsCertAuth) {
throw new NotFoundError({
message: "TLS Certificate auth method not found for identity, did you configure TLS Certificate auth?"
});
}
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({
identityId: identityTlsCertAuth.identityId
});
if (!identityMembershipOrg) {
throw new NotFoundError({
message: `Identity organization membership for identity with ID '${identityTlsCertAuth.identityId}' not found`
});
}
const { decryptor } = await kmsService.createCipherPairWithDataKey({
type: KmsDataKey.Organization,
orgId: identityMembershipOrg.orgId
});
const caCertificate = decryptor({
cipherTextBlob: identityTlsCertAuth.encryptedCaCertificate
}).toString();
const leafCertificate = extractX509CertFromChain(decodeURIComponent(clientCertificate))?.[0];
if (!leafCertificate) {
throw new BadRequestError({ message: "Missing client certificate" });
}
const clientCertificateX509 = new crypto.X509Certificate(leafCertificate);
const caCertificateX509 = new crypto.X509Certificate(caCertificate);
const isValidCertificate = clientCertificateX509.verify(caCertificateX509.publicKey);
if (!isValidCertificate)
throw new UnauthorizedError({
message: "Access denied: Certificate not issued by the provided CA."
});
if (new Date(clientCertificateX509.validTo) < new Date()) {
throw new UnauthorizedError({
message: "Access denied: Certificate has expired."
});
}
if (new Date(clientCertificateX509.validFrom) > new Date()) {
throw new UnauthorizedError({
message: "Access denied: Certificate not yet valid."
});
}
const subjectDetails = parseSubjectDetails(clientCertificateX509.subject);
if (identityTlsCertAuth.allowedCommonNames) {
const isValidCommonName = identityTlsCertAuth.allowedCommonNames.split(",").includes(subjectDetails.CN);
if (!isValidCommonName) {
throw new UnauthorizedError({
message: "Access denied: TLS Certificate Auth common name not allowed."
});
}
}
// Generate the token
const identityAccessToken = await identityTlsCertAuthDAL.transaction(async (tx) => {
const newToken = await identityAccessTokenDAL.create(
{
identityId: identityTlsCertAuth.identityId,
isAccessTokenRevoked: false,
accessTokenTTL: identityTlsCertAuth.accessTokenTTL,
accessTokenMaxTTL: identityTlsCertAuth.accessTokenMaxTTL,
accessTokenNumUses: 0,
accessTokenNumUsesLimit: identityTlsCertAuth.accessTokenNumUsesLimit,
authMethod: IdentityAuthMethod.TLS_CERT_AUTH
},
tx
);
return newToken;
});
const appCfg = getConfig();
const accessToken = jwt.sign(
{
identityId: identityTlsCertAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
} as TIdentityAccessTokenJwtPayload,
appCfg.AUTH_SECRET,
Number(identityAccessToken.accessTokenTTL) === 0
? undefined
: {
expiresIn: Number(identityAccessToken.accessTokenTTL)
}
);
return {
identityTlsCertAuth,
accessToken,
identityAccessToken,
identityMembershipOrg
};
};
const attachTlsCertAuth: TIdentityTlsCertAuthServiceFactory["attachTlsCertAuth"] = async ({
identityId,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId,
isActorSuperAdmin,
caCertificate,
allowedCommonNames
}) => {
await validateIdentityUpdateForSuperAdminPrivileges(identityId, isActorSuperAdmin);
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new NotFoundError({ message: `Failed to find identity with ID ${identityId}` });
if (identityMembershipOrg.identity.authMethods.includes(IdentityAuthMethod.TLS_CERT_AUTH)) {
throw new BadRequestError({
message: "Failed to add TLS Certificate Auth to already configured identity"
});
}
if (accessTokenMaxTTL > 0 && accessTokenTTL > accessTokenMaxTTL) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionIdentityActions.Create, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const { encryptor } = await kmsService.createCipherPairWithDataKey({
type: KmsDataKey.Organization,
orgId: identityMembershipOrg.orgId
});
const identityTlsCertAuth = await identityTlsCertAuthDAL.transaction(async (tx) => {
const doc = await identityTlsCertAuthDAL.create(
{
identityId: identityMembershipOrg.identityId,
accessTokenMaxTTL,
allowedCommonNames,
accessTokenTTL,
encryptedCaCertificate: encryptor({ plainText: Buffer.from(caCertificate) }).cipherTextBlob,
accessTokenNumUsesLimit,
accessTokenTrustedIps: JSON.stringify(reformattedAccessTokenTrustedIps)
},
tx
);
return doc;
});
return { ...identityTlsCertAuth, orgId: identityMembershipOrg.orgId };
};
const updateTlsCertAuth: TIdentityTlsCertAuthServiceFactory["updateTlsCertAuth"] = async ({
identityId,
caCertificate,
allowedCommonNames,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new NotFoundError({ message: `Failed to find identity with ID ${identityId}` });
if (!identityMembershipOrg.identity.authMethods.includes(IdentityAuthMethod.TLS_CERT_AUTH)) {
throw new NotFoundError({
message: "The identity does not have TLS Certificate Auth attached"
});
}
const identityTlsCertAuth = await identityTlsCertAuthDAL.findOne({ identityId });
if (
(accessTokenMaxTTL || identityTlsCertAuth.accessTokenMaxTTL) > 0 &&
(accessTokenTTL || identityTlsCertAuth.accessTokenTTL) >
(accessTokenMaxTTL || identityTlsCertAuth.accessTokenMaxTTL)
) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionIdentityActions.Edit, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps?.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const { encryptor } = await kmsService.createCipherPairWithDataKey({
type: KmsDataKey.Organization,
orgId: identityMembershipOrg.orgId
});
const updatedTlsCertAuth = await identityTlsCertAuthDAL.updateById(identityTlsCertAuth.id, {
allowedCommonNames,
encryptedCaCertificate: caCertificate
? encryptor({ plainText: Buffer.from(caCertificate) }).cipherTextBlob
: undefined,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: reformattedAccessTokenTrustedIps
? JSON.stringify(reformattedAccessTokenTrustedIps)
: undefined
});
return { ...updatedTlsCertAuth, orgId: identityMembershipOrg.orgId };
};
const getTlsCertAuth: TIdentityTlsCertAuthServiceFactory["getTlsCertAuth"] = async ({
identityId,
actorId,
actor,
actorAuthMethod,
actorOrgId
}) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new NotFoundError({ message: `Failed to find identity with ID ${identityId}` });
if (!identityMembershipOrg.identity.authMethods.includes(IdentityAuthMethod.TLS_CERT_AUTH)) {
throw new BadRequestError({
message: "The identity does not have TLS Certificate Auth attached"
});
}
const identityAuth = await identityTlsCertAuthDAL.findOne({ identityId });
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionIdentityActions.Read, OrgPermissionSubjects.Identity);
const { decryptor } = await kmsService.createCipherPairWithDataKey({
type: KmsDataKey.Organization,
orgId: identityMembershipOrg.orgId
});
let caCertificate = "";
if (identityAuth.encryptedCaCertificate) {
caCertificate = decryptor({ cipherTextBlob: identityAuth.encryptedCaCertificate }).toString();
}
return { ...identityAuth, caCertificate, orgId: identityMembershipOrg.orgId };
};
const revokeTlsCertAuth: TIdentityTlsCertAuthServiceFactory["revokeTlsCertAuth"] = async ({
identityId,
actorId,
actor,
actorAuthMethod,
actorOrgId
}) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new NotFoundError({ message: `Failed to find identity with ID ${identityId}` });
if (!identityMembershipOrg.identity.authMethods.includes(IdentityAuthMethod.TLS_CERT_AUTH)) {
throw new BadRequestError({
message: "The identity does not have TLS Certificate auth"
});
}
const { permission, membership } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionIdentityActions.Edit, OrgPermissionSubjects.Identity);
const { permission: rolePermission } = await permissionService.getOrgPermission(
ActorType.IDENTITY,
identityMembershipOrg.identityId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
const permissionBoundary = validatePrivilegeChangeOperation(
membership.shouldUseNewPrivilegeSystem,
OrgPermissionIdentityActions.RevokeAuth,
OrgPermissionSubjects.Identity,
permission,
rolePermission
);
if (!permissionBoundary.isValid)
throw new PermissionBoundaryError({
message: constructPermissionErrorMessage(
"Failed to revoke TLS Certificate auth of identity with more privileged role",
membership.shouldUseNewPrivilegeSystem,
OrgPermissionIdentityActions.RevokeAuth,
OrgPermissionSubjects.Identity
),
details: { missingPermissions: permissionBoundary.missingPermissions }
});
const revokedIdentityTlsCertAuth = await identityTlsCertAuthDAL.transaction(async (tx) => {
const deletedTlsCertAuth = await identityTlsCertAuthDAL.delete({ identityId }, tx);
await identityAccessTokenDAL.delete({ identityId, authMethod: IdentityAuthMethod.TLS_CERT_AUTH }, tx);
return { ...deletedTlsCertAuth?.[0], orgId: identityMembershipOrg.orgId };
});
return revokedIdentityTlsCertAuth;
};
return {
login,
attachTlsCertAuth,
updateTlsCertAuth,
getTlsCertAuth,
revokeTlsCertAuth
};
};
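A self-contained sketch of the certificate checks login() performs, using only Node's built-in X509Certificate API; the helper name and PEM inputs are illustrative, and chain extraction plus decryption of the stored CA are omitted.

import crypto from "node:crypto";

import { UnauthorizedError } from "@app/lib/errors";

export const assertClientCertValid = (certPem: string, caPem: string, allowedCommonNames?: string | null) => {
  const clientCert = new crypto.X509Certificate(certPem);
  const caCert = new crypto.X509Certificate(caPem);

  // 1. the client certificate must be signed by the configured CA
  if (!clientCert.verify(caCert.publicKey)) {
    throw new UnauthorizedError({ message: "Certificate not issued by the provided CA" });
  }

  // 2. the certificate must currently be inside its validity window
  const now = new Date();
  if (new Date(clientCert.validTo) < now) throw new UnauthorizedError({ message: "Certificate has expired" });
  if (new Date(clientCert.validFrom) > now) throw new UnauthorizedError({ message: "Certificate not yet valid" });

  // 3. optionally, the subject CN must appear on the comma-separated allow-list
  if (allowedCommonNames) {
    const cn = clientCert.subject
      .split("\n")
      .map((line) => line.split("="))
      .find(([key]) => key?.trim() === "CN")?.[1]
      ?.trim();
    if (!cn || !allowedCommonNames.split(",").includes(cn)) {
      throw new UnauthorizedError({ message: "Common name not allowed" });
    }
  }
};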

View File

@@ -0,0 +1,49 @@
import { TIdentityAccessTokens, TIdentityOrgMemberships, TIdentityTlsCertAuths } from "@app/db/schemas";
import { TProjectPermission } from "@app/lib/types";
export type TLoginTlsCertAuthDTO = {
identityId: string;
clientCertificate: string;
};
export type TAttachTlsCertAuthDTO = {
identityId: string;
caCertificate: string;
allowedCommonNames?: string | null;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: { ipAddress: string }[];
isActorSuperAdmin?: boolean;
} & Omit<TProjectPermission, "projectId">;
export type TUpdateTlsCertAuthDTO = {
identityId: string;
caCertificate?: string;
allowedCommonNames?: string | null;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TGetTlsCertAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;
export type TRevokeTlsCertAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;
export type TIdentityTlsCertAuthServiceFactory = {
login: (dto: TLoginTlsCertAuthDTO) => Promise<{
identityTlsCertAuth: TIdentityTlsCertAuths;
accessToken: string;
identityAccessToken: TIdentityAccessTokens;
identityMembershipOrg: TIdentityOrgMemberships;
}>;
attachTlsCertAuth: (dto: TAttachTlsCertAuthDTO) => Promise<TIdentityTlsCertAuths>;
updateTlsCertAuth: (dto: TUpdateTlsCertAuthDTO) => Promise<TIdentityTlsCertAuths>;
revokeTlsCertAuth: (dto: TRevokeTlsCertAuthDTO) => Promise<TIdentityTlsCertAuths>;
getTlsCertAuth: (dto: TGetTlsCertAuthDTO) => Promise<TIdentityTlsCertAuths & { caCertificate: string }>;
};

View File

@@ -1,5 +1,5 @@
import { TDbClient } from "@app/db";
import { TableName, TIdentities } from "@app/db/schemas";
import { IdentityAuthMethod, TableName, TIdentities } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
@@ -8,6 +8,28 @@ export type TIdentityDALFactory = ReturnType<typeof identityDALFactory>;
export const identityDALFactory = (db: TDbClient) => {
const identityOrm = ormify(db, TableName.Identity);
const getTrustedIpsByAuthMethod = async (identityId: string, authMethod: IdentityAuthMethod) => {
const authMethodToTableName = {
[IdentityAuthMethod.TOKEN_AUTH]: TableName.IdentityTokenAuth,
[IdentityAuthMethod.UNIVERSAL_AUTH]: TableName.IdentityUniversalAuth,
[IdentityAuthMethod.KUBERNETES_AUTH]: TableName.IdentityKubernetesAuth,
[IdentityAuthMethod.GCP_AUTH]: TableName.IdentityGcpAuth,
[IdentityAuthMethod.ALICLOUD_AUTH]: TableName.IdentityAliCloudAuth,
[IdentityAuthMethod.AWS_AUTH]: TableName.IdentityAwsAuth,
[IdentityAuthMethod.AZURE_AUTH]: TableName.IdentityAzureAuth,
[IdentityAuthMethod.TLS_CERT_AUTH]: TableName.IdentityTlsCertAuth,
[IdentityAuthMethod.OCI_AUTH]: TableName.IdentityOciAuth,
[IdentityAuthMethod.OIDC_AUTH]: TableName.IdentityOidcAuth,
[IdentityAuthMethod.JWT_AUTH]: TableName.IdentityJwtAuth,
[IdentityAuthMethod.LDAP_AUTH]: TableName.IdentityLdapAuth
} as const;
const tableName = authMethodToTableName[authMethod];
if (!tableName) return;
const data = await db(tableName).where({ identityId }).first();
if (!data) return;
return data.accessTokenTrustedIps;
};
const getIdentitiesByFilter = async ({
limit,
offset,
@@ -38,5 +60,5 @@ export const identityDALFactory = (db: TDbClient) => {
}
};
return { ...identityOrm, getIdentitiesByFilter };
return { ...identityOrm, getTrustedIpsByAuthMethod, getIdentitiesByFilter };
};

View File

@@ -11,7 +11,8 @@ export const buildAuthMethods = ({
azureId,
tokenId,
jwtId,
ldapId
ldapId,
tlsCertId
}: {
uaId?: string;
gcpId?: string;
@@ -24,6 +25,7 @@ export const buildAuthMethods = ({
tokenId?: string;
jwtId?: string;
ldapId?: string;
tlsCertId?: string;
}) => {
return [
...[uaId ? IdentityAuthMethod.UNIVERSAL_AUTH : null],
@@ -36,6 +38,7 @@ export const buildAuthMethods = ({
...[azureId ? IdentityAuthMethod.AZURE_AUTH : null],
...[tokenId ? IdentityAuthMethod.TOKEN_AUTH : null],
...[jwtId ? IdentityAuthMethod.JWT_AUTH : null],
...[ldapId ? IdentityAuthMethod.LDAP_AUTH : null]
...[ldapId ? IdentityAuthMethod.LDAP_AUTH : null],
...[tlsCertId ? IdentityAuthMethod.TLS_CERT_AUTH : null]
].filter((authMethod) => authMethod) as IdentityAuthMethod[];
};

View File

@@ -12,6 +12,7 @@ import {
TIdentityOciAuths,
TIdentityOidcAuths,
TIdentityOrgMemberships,
TIdentityTlsCertAuths,
TIdentityTokenAuths,
TIdentityUniversalAuths,
TOrgRoles
@@ -99,7 +100,11 @@ export const identityOrgDALFactory = (db: TDbClient) => {
`${TableName.IdentityOrgMembership}.identityId`,
`${TableName.IdentityLdapAuth}.identityId`
)
.leftJoin<TIdentityTlsCertAuths>(
TableName.IdentityTlsCertAuth,
`${TableName.IdentityOrgMembership}.identityId`,
`${TableName.IdentityTlsCertAuth}.identityId`
)
.select(
selectAllTableCols(TableName.IdentityOrgMembership),
@@ -114,6 +119,7 @@ export const identityOrgDALFactory = (db: TDbClient) => {
db.ref("id").as("tokenId").withSchema(TableName.IdentityTokenAuth),
db.ref("id").as("jwtId").withSchema(TableName.IdentityJwtAuth),
db.ref("id").as("ldapId").withSchema(TableName.IdentityLdapAuth),
db.ref("id").as("tlsCertId").withSchema(TableName.IdentityTlsCertAuth),
db.ref("name").withSchema(TableName.Identity),
db.ref("hasDeleteProtection").withSchema(TableName.Identity)
);
@@ -238,7 +244,11 @@ export const identityOrgDALFactory = (db: TDbClient) => {
"paginatedIdentity.identityId",
`${TableName.IdentityLdapAuth}.identityId`
)
.leftJoin<TIdentityTlsCertAuths>(
TableName.IdentityTlsCertAuth,
"paginatedIdentity.identityId",
`${TableName.IdentityTlsCertAuth}.identityId`
)
.select(
db.ref("id").withSchema("paginatedIdentity"),
db.ref("role").withSchema("paginatedIdentity"),
@@ -260,7 +270,8 @@ export const identityOrgDALFactory = (db: TDbClient) => {
db.ref("id").as("azureId").withSchema(TableName.IdentityAzureAuth),
db.ref("id").as("tokenId").withSchema(TableName.IdentityTokenAuth),
db.ref("id").as("jwtId").withSchema(TableName.IdentityJwtAuth),
db.ref("id").as("ldapId").withSchema(TableName.IdentityLdapAuth)
db.ref("id").as("ldapId").withSchema(TableName.IdentityLdapAuth),
db.ref("id").as("tlsCertId").withSchema(TableName.IdentityTlsCertAuth)
)
// cr stands for custom role
.select(db.ref("id").as("crId").withSchema(TableName.OrgRoles))
@@ -306,6 +317,7 @@ export const identityOrgDALFactory = (db: TDbClient) => {
azureId,
tokenId,
ldapId,
tlsCertId,
createdAt,
updatedAt
}) => ({
@@ -313,7 +325,6 @@ export const identityOrgDALFactory = (db: TDbClient) => {
roleId,
identityId,
id,
orgId,
createdAt,
updatedAt,
@@ -341,7 +352,8 @@ export const identityOrgDALFactory = (db: TDbClient) => {
azureId,
tokenId,
jwtId,
ldapId
ldapId,
tlsCertId
})
}
}),
@@ -380,7 +392,12 @@ export const identityOrgDALFactory = (db: TDbClient) => {
.join(TableName.Identity, `${TableName.Identity}.id`, `${TableName.IdentityOrgMembership}.identityId`)
.where(`${TableName.IdentityOrgMembership}.orgId`, orgId)
.leftJoin(TableName.OrgRoles, `${TableName.IdentityOrgMembership}.roleId`, `${TableName.OrgRoles}.id`)
.orderBy(`${TableName.Identity}.${orderBy}`, orderDirection)
.orderBy(
orderBy === OrgIdentityOrderBy.Role
? `${TableName.IdentityOrgMembership}.${orderBy}`
: `${TableName.Identity}.${orderBy}`,
orderDirection
)
.select(`${TableName.IdentityOrgMembership}.id`)
.select<{ id: string; total_count: string }>(
db.raw(
@@ -511,6 +528,23 @@ export const identityOrgDALFactory = (db: TDbClient) => {
if (orderBy === OrgIdentityOrderBy.Name) {
void query.orderBy("identityName", orderDirection);
} else if (orderBy === OrgIdentityOrderBy.Role) {
void query.orderByRaw(
`
CASE
WHEN ??.role = ?
THEN ??.slug
ELSE ??.role
END ?
`,
[
TableName.IdentityOrgMembership,
"custom",
TableName.OrgRoles,
TableName.IdentityOrgMembership,
db.raw(orderDirection)
]
);
}
const docs = await query;

View File

@@ -46,8 +46,8 @@ export type TListOrgIdentitiesByOrgIdDTO = {
} & TOrgPermission;
export enum OrgIdentityOrderBy {
Name = "name"
// Role = "role"
Name = "name",
Role = "role"
}
export type TSearchOrgIdentitiesByOrgIdDAL = {

View File

@@ -84,6 +84,8 @@ export enum IntegrationUrls {
QOVERY_API_URL = "https://api.qovery.com",
TERRAFORM_CLOUD_API_URL = "https://app.terraform.io",
CLOUDFLARE_PAGES_API_URL = "https://api.cloudflare.com",
// eslint-disable-next-line @typescript-eslint/no-duplicate-enum-values
CLOUDFLARE_API_URL = "https://api.cloudflare.com",
// eslint-disable-next-line @typescript-eslint/no-duplicate-enum-values
CLOUDFLARE_WORKERS_API_URL = "https://api.cloudflare.com",
BITBUCKET_API_URL = "https://api.bitbucket.org",

View File

@@ -42,7 +42,7 @@ import { TProjectPermission } from "@app/lib/types";
import { TQueueServiceFactory } from "@app/queue";
import { TPkiSubscriberDALFactory } from "@app/services/pki-subscriber/pki-subscriber-dal";
import { ActorType } from "../auth/auth-type";
import { ActorAuthMethod, ActorType } from "../auth/auth-type";
import { TCertificateDALFactory } from "../certificate/certificate-dal";
import { TCertificateAuthorityDALFactory } from "../certificate-authority/certificate-authority-dal";
import { expandInternalCa } from "../certificate-authority/certificate-authority-fns";
@@ -82,6 +82,7 @@ import { assignWorkspaceKeysToMembers, bootstrapSshProject, createProjectKey } f
import { TProjectQueueFactory } from "./project-queue";
import { TProjectSshConfigDALFactory } from "./project-ssh-config-dal";
import {
ProjectFilterType,
TCreateProjectDTO,
TDeleteProjectDTO,
TDeleteProjectWorkflowIntegration,
@@ -866,6 +867,39 @@ export const projectServiceFactory = ({
});
};
const extractProjectIdFromSlug = async ({
projectSlug,
projectId,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: {
projectSlug?: string;
projectId?: string;
actorId: string;
actorAuthMethod: ActorAuthMethod;
actor: ActorType;
actorOrgId: string;
}) => {
if (projectId) return projectId;
if (!projectSlug) throw new BadRequestError({ message: "You must provide projectSlug or workspaceId" });
const project = await getAProject({
filter: {
type: ProjectFilterType.SLUG,
orgId: actorOrgId,
slug: projectSlug
},
actorId,
actorAuthMethod,
actor,
actorOrgId
});
if (!project) throw new NotFoundError({ message: `No project found with slug ${projectSlug}` });
return project.id;
};
const getProjectUpgradeStatus = async ({
projectId,
actor,
@@ -2006,6 +2040,7 @@ export const projectServiceFactory = ({
getProjectSshConfig,
updateProjectSshConfig,
requestProjectAccess,
searchProjects
searchProjects,
extractProjectIdFromSlug
};
};

View File

@@ -6,6 +6,7 @@ import { ActionProjectType, TSecretFoldersInsert } from "@app/db/schemas";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service-types";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { TSecretSnapshotServiceFactory } from "@app/ee/services/secret-snapshot/secret-snapshot-service";
import { PgSqlLock } from "@app/keystore/keystore";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { OrderByDirection, OrgServiceActor } from "@app/lib/types";
import { buildFolderPath } from "@app/services/secret-folder/secret-folder-fns";
@@ -83,36 +84,75 @@ export const secretFolderServiceFactory = ({
// this request must be idempotent
// so we use a trick: look up the folder path that is about to be created; if it already exists exactly, return it
// otherwise find the closest existing ancestor path and create the remaining folders from there
await tx.raw("SELECT pg_advisory_xact_lock(?)", [PgSqlLock.CreateFolder(env.id, env.projectId)]);
const pathWithFolder = path.join(secretPath, name);
const parentFolder = await folderDAL.findClosestFolder(projectId, environment, pathWithFolder, tx);
// a missing parent folder should be impossible because the root folder always exists
if (!parentFolder) {
throw new NotFoundError({
message: `Folder with path '${pathWithFolder}' in environment with slug '${environment}' not found`
message: `Parent folder for path '${pathWithFolder}' not found`
});
}
// exact folder
if (parentFolder.path === pathWithFolder) return parentFolder;
let parentFolderId = parentFolder.id;
// check if the exact folder already exists
const existingFolder = await folderDAL.findOne(
{
envId: env.id,
parentId: parentFolder.id,
name,
isReserved: false
},
tx
);
if (existingFolder) {
return existingFolder;
}
// exact folder case
if (parentFolder.path === pathWithFolder) {
return parentFolder;
}
let currentParentId = parentFolder.id;
// build the full path we need by processing each segment
if (parentFolder.path !== secretPath) {
// this upserts the intermediate folders in the path
// we are not taking snapshots of these because
// snapshots are moving from automatic on every commit to user-triggered or cron-based
const missingSegment = secretPath.substring(parentFolder.path.length).split("/").filter(Boolean);
if (missingSegment.length) {
const newFolders: Array<TSecretFoldersInsert & { id: string }> = missingSegment.map((segment) => {
const missingSegments = secretPath.substring(parentFolder.path.length).split("/").filter(Boolean);
const newFolders: TSecretFoldersInsert[] = [];
// process each segment sequentially
for await (const segment of missingSegments) {
const existingSegment = await folderDAL.findOne(
{
name: segment,
parentId: currentParentId,
envId: env.id,
isReserved: false
},
tx
);
if (existingSegment) {
// reuse the existing folder and continue from it as the new parent
currentParentId = existingSegment.id;
} else {
const newFolder = {
name: segment,
parentId: parentFolderId,
parentId: currentParentId,
id: uuidv4(),
envId: env.id,
version: 1
};
parentFolderId = newFolder.id;
return newFolder;
});
parentFolderId = newFolders.at(-1)?.id as string;
currentParentId = newFolder.id;
newFolders.push(newFolder);
}
}
if (newFolders.length) {
const docs = await folderDAL.insertMany(newFolders, tx);
const folderVersions = await folderVersionDAL.insertMany(
docs.map((doc) => ({
@@ -133,7 +173,7 @@ export const secretFolderServiceFactory = ({
}
},
message: "Folder created",
folderId: parentFolderId,
folderId: currentParentId,
changes: folderVersions.map((fv) => ({
type: CommitType.ADD,
folderVersionId: fv.id
@@ -145,9 +185,10 @@ export const secretFolderServiceFactory = ({
}
const doc = await folderDAL.create(
{ name, envId: env.id, version: 1, parentId: parentFolderId, description },
{ name, envId: env.id, version: 1, parentId: currentParentId, description },
tx
);
const folderVersion = await folderVersionDAL.create(
{
name: doc.name,
@@ -158,6 +199,7 @@ export const secretFolderServiceFactory = ({
},
tx
);
await folderCommitService.createCommit(
{
actor: {
@@ -167,7 +209,7 @@ export const secretFolderServiceFactory = ({
}
},
message: "Folder created",
folderId: parentFolderId,
folderId: doc.id,
changes: [
{
type: CommitType.ADD,
@@ -177,6 +219,7 @@ export const secretFolderServiceFactory = ({
},
tx
);
return doc;
});
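A small worked example of the segment walk above, assuming the closest existing folder is "/app" and the request creates "creds" under secretPath "/app/db" (all values illustrative):

import path from "node:path";

const secretPath = "/app/db";
const name = "creds";
const parentFolderPath = "/app"; // closest existing folder returned by findClosestFolder

const pathWithFolder = path.join(secretPath, name); // "/app/db/creds"

// Segments still missing between the closest folder and secretPath:
const missingSegments = secretPath.substring(parentFolderPath.length).split("/").filter(Boolean);
console.log(missingSegments); // ["db"]
// The loop upserts "db" under "/app", and the requested "creds" folder is then created under it.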

View File

@@ -307,7 +307,6 @@ export const AwsParameterStoreSyncFns = {
awsParameterStoreSecretsRecord,
Boolean(syncOptions.tags?.length || syncOptions.syncSecretMetadataAsTags)
);
const syncTagsRecord = Object.fromEntries(syncOptions.tags?.map((tag) => [tag.key, tag.value]) ?? []);
for await (const entry of Object.entries(secretMap)) {
const [key, { value, secretMetadata }] = entry;
@@ -342,13 +341,13 @@ export const AwsParameterStoreSyncFns = {
}
}
if (shouldManageTags) {
if ((syncOptions.tags !== undefined || syncOptions.syncSecretMetadataAsTags) && shouldManageTags) {
const { tagsToAdd, tagKeysToRemove } = processParameterTags({
syncTagsRecord: {
// configured sync tags take precedence over secret metadata
...(syncOptions.syncSecretMetadataAsTags &&
Object.fromEntries(secretMetadata?.map((tag) => [tag.key, tag.value]) ?? [])),
...syncTagsRecord
...(syncOptions.tags && Object.fromEntries(syncOptions.tags?.map((tag) => [tag.key, tag.value]) ?? []))
},
awsTagsRecord: awsParameterStoreTagsRecord[key] ?? {}
});

View File

@@ -366,37 +366,39 @@ export const AwsSecretsManagerSyncFns = {
}
}
const { tagsToAdd, tagKeysToRemove } = processTags({
syncTagsRecord: {
// configured sync tags take precedence over secret metadata
...(syncOptions.syncSecretMetadataAsTags &&
Object.fromEntries(secretMetadata?.map((tag) => [tag.key, tag.value]) ?? [])),
...syncTagsRecord
},
awsTagsRecord: Object.fromEntries(
awsDescriptionsRecord[key]?.Tags?.map((tag) => [tag.Key!, tag.Value!]) ?? []
)
});
if (syncOptions.tags !== undefined || syncOptions.syncSecretMetadataAsTags) {
const { tagsToAdd, tagKeysToRemove } = processTags({
syncTagsRecord: {
// configured sync tags take precedence over secret metadata
...(syncOptions.syncSecretMetadataAsTags &&
Object.fromEntries(secretMetadata?.map((tag) => [tag.key, tag.value]) ?? [])),
...(syncOptions.tags !== undefined && syncTagsRecord)
},
awsTagsRecord: Object.fromEntries(
awsDescriptionsRecord[key]?.Tags?.map((tag) => [tag.Key!, tag.Value!]) ?? []
)
});
if (tagsToAdd.length) {
try {
await addTags(client, key, tagsToAdd);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
if (tagsToAdd.length) {
try {
await addTags(client, key, tagsToAdd);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
}
if (tagKeysToRemove.length) {
try {
await removeTags(client, key, tagKeysToRemove);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
if (tagKeysToRemove.length) {
try {
await removeTags(client, key, tagKeysToRemove);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
}
}
@@ -439,32 +441,34 @@ export const AwsSecretsManagerSyncFns = {
});
}
const { tagsToAdd, tagKeysToRemove } = processTags({
syncTagsRecord,
awsTagsRecord: Object.fromEntries(
awsDescriptionsRecord[destinationConfig.secretName]?.Tags?.map((tag) => [tag.Key!, tag.Value!]) ?? []
)
});
if (syncOptions.tags !== undefined) {
const { tagsToAdd, tagKeysToRemove } = processTags({
syncTagsRecord,
awsTagsRecord: Object.fromEntries(
awsDescriptionsRecord[destinationConfig.secretName]?.Tags?.map((tag) => [tag.Key!, tag.Value!]) ?? []
)
});
if (tagsToAdd.length) {
try {
await addTags(client, destinationConfig.secretName, tagsToAdd);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: destinationConfig.secretName
});
if (tagsToAdd.length) {
try {
await addTags(client, destinationConfig.secretName, tagsToAdd);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: destinationConfig.secretName
});
}
}
}
if (tagKeysToRemove.length) {
try {
await removeTags(client, destinationConfig.secretName, tagKeysToRemove);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: destinationConfig.secretName
});
if (tagKeysToRemove.length) {
try {
await removeTags(client, destinationConfig.secretName, tagKeysToRemove);
} catch (error) {
throw new SecretSyncError({
error,
secretKey: destinationConfig.secretName
});
}
}
}
}

View File

@@ -0,0 +1,10 @@
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { TSecretSyncListItem } from "@app/services/secret-sync/secret-sync-types";
export const CLOUDFLARE_PAGES_SYNC_LIST_OPTION: TSecretSyncListItem = {
name: "Cloudflare Pages",
destination: SecretSync.CloudflarePages,
connection: AppConnection.Cloudflare,
canImportSecrets: false
};

View File

@@ -0,0 +1,138 @@
import { request } from "@app/lib/config/request";
import { IntegrationUrls } from "@app/services/integration-auth/integration-list";
import { matchesSchema } from "@app/services/secret-sync/secret-sync-fns";
import { TSecretMap } from "@app/services/secret-sync/secret-sync-types";
import { SECRET_SYNC_NAME_MAP } from "../secret-sync-maps";
import { TCloudflarePagesSyncWithCredentials } from "./cloudflare-pages-types";
const getProjectEnvironmentSecrets = async (secretSync: TCloudflarePagesSyncWithCredentials) => {
const {
destinationConfig,
connection: {
credentials: { apiToken, accountId }
}
} = secretSync;
const secrets = (
await request.get<{
result: {
deployment_configs: Record<
string,
{
env_vars: Record<string, { type: "plain_text" | "secret_text"; value: string }>;
}
>;
};
}>(
`${IntegrationUrls.CLOUDFLARE_PAGES_API_URL}/client/v4/accounts/${accountId}/pages/projects/${destinationConfig.projectName}`,
{
headers: {
Authorization: `Bearer ${apiToken}`,
Accept: "application/json"
}
}
)
).data.result.deployment_configs[destinationConfig.environment].env_vars;
return Object.entries(secrets ?? {}).map(([key, envVar]) => ({
key,
value: envVar.value
}));
};
export const CloudflarePagesSyncFns = {
syncSecrets: async (secretSync: TCloudflarePagesSyncWithCredentials, secretMap: TSecretMap) => {
const {
destinationConfig,
connection: {
credentials: { apiToken, accountId }
}
} = secretSync;
// Create/update secret entries
let secretEntries: [string, object | null][] = Object.entries(secretMap).map(([key, val]) => [
key,
{ type: "secret_text", value: val.value }
]);
// Handle deletions if not disabled
if (!secretSync.syncOptions.disableSecretDeletion) {
const existingSecrets = await getProjectEnvironmentSecrets(secretSync);
const toDeleteKeys = existingSecrets
.filter(
(secret) =>
matchesSchema(secret.key, secretSync.environment?.slug || "", secretSync.syncOptions.keySchema) &&
!secretMap[secret.key]
)
.map((secret) => secret.key);
const toDeleteEntries: [string, null][] = toDeleteKeys.map((key) => [key, null]);
secretEntries = [...secretEntries, ...toDeleteEntries];
}
const data = {
deployment_configs: {
[destinationConfig.environment]: {
env_vars: Object.fromEntries(secretEntries)
}
}
};
await request.patch(
`${IntegrationUrls.CLOUDFLARE_PAGES_API_URL}/client/v4/accounts/${accountId}/pages/projects/${destinationConfig.projectName}`,
data,
{
headers: {
Authorization: `Bearer ${apiToken}`,
Accept: "application/json"
}
}
);
},
getSecrets: async (secretSync: TCloudflarePagesSyncWithCredentials): Promise<TSecretMap> => {
throw new Error(`${SECRET_SYNC_NAME_MAP[secretSync.destination]} does not support importing secrets.`);
},
removeSecrets: async (secretSync: TCloudflarePagesSyncWithCredentials, secretMap: TSecretMap) => {
const {
destinationConfig,
connection: {
credentials: { apiToken, accountId }
}
} = secretSync;
const secrets = await getProjectEnvironmentSecrets(secretSync);
const toDeleteKeys = secrets
.filter(
(secret) =>
matchesSchema(secret.key, secretSync.environment?.slug || "", secretSync.syncOptions.keySchema) &&
secret.key in secretMap
)
.map((secret) => secret.key);
if (toDeleteKeys.length === 0) return;
const secretEntries: [string, null][] = toDeleteKeys.map((key) => [key, null]);
const data = {
deployment_configs: {
[destinationConfig.environment]: {
env_vars: Object.fromEntries(secretEntries)
}
}
};
await request.patch(
`${IntegrationUrls.CLOUDFLARE_PAGES_API_URL}/client/v4/accounts/${accountId}/pages/projects/${destinationConfig.projectName}`,
data,
{
headers: {
Authorization: `Bearer ${apiToken}`,
Accept: "application/json"
}
}
);
}
};
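The Cloudflare Pages API takes a single PATCH per project where new or changed variables are sent as secret_text entries and deletions are encoded as null values; a payload produced by the sync above would look roughly like the following (project, environment, and values are placeholders):

// Illustrative PATCH body for the Pages project endpoint (all names and values are placeholders).
const examplePatchBody = {
  deployment_configs: {
    production: {
      env_vars: {
        DATABASE_URL: { type: "secret_text", value: "postgres://example" }, // create or update
        OLD_API_KEY: null // null removes the variable from the environment
      }
    }
  }
};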

View File

@@ -0,0 +1,53 @@
import { z } from "zod";
import { SecretSyncs } from "@app/lib/api-docs";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import {
BaseSecretSyncSchema,
GenericCreateSecretSyncFieldsSchema,
GenericUpdateSecretSyncFieldsSchema
} from "@app/services/secret-sync/secret-sync-schemas";
import { TSyncOptionsConfig } from "@app/services/secret-sync/secret-sync-types";
const CloudflarePagesSyncDestinationConfigSchema = z.object({
projectName: z
.string()
.min(1, "Project name is required")
.describe(SecretSyncs.DESTINATION_CONFIG.CLOUDFLARE_PAGES.projectName),
environment: z
.string()
.min(1, "Environment is required")
.describe(SecretSyncs.DESTINATION_CONFIG.CLOUDFLARE_PAGES.environment)
});
const CloudflarePagesSyncOptionsConfig: TSyncOptionsConfig = { canImportSecrets: false };
export const CloudflarePagesSyncSchema = BaseSecretSyncSchema(
SecretSync.CloudflarePages,
CloudflarePagesSyncOptionsConfig
).extend({
destination: z.literal(SecretSync.CloudflarePages),
destinationConfig: CloudflarePagesSyncDestinationConfigSchema
});
export const CreateCloudflarePagesSyncSchema = GenericCreateSecretSyncFieldsSchema(
SecretSync.CloudflarePages,
CloudflarePagesSyncOptionsConfig
).extend({
destinationConfig: CloudflarePagesSyncDestinationConfigSchema
});
export const UpdateCloudflarePagesSyncSchema = GenericUpdateSecretSyncFieldsSchema(
SecretSync.CloudflarePages,
CloudflarePagesSyncOptionsConfig
).extend({
destinationConfig: CloudflarePagesSyncDestinationConfigSchema.optional()
});
export const CloudflarePagesSyncListItemSchema = z.object({
name: z.literal("Cloudflare Pages"),
connection: z.literal(AppConnection.Cloudflare),
destination: z.literal(SecretSync.CloudflarePages),
canImportSecrets: z.literal(false)
});
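Since the destination config schema above is module-internal, here is a minimal sketch of the same validation using a mirrored shape; invalid input raises a ZodError:

import { z } from "zod";

// Sketch: mirror of the Cloudflare Pages destination config shape (not the exported schema itself).
const DestinationConfigSketch = z.object({
  projectName: z.string().min(1, "Project name is required"),
  environment: z.string().min(1, "Environment is required")
});

DestinationConfigSketch.parse({ projectName: "my-site", environment: "production" }); // ok
// DestinationConfigSketch.parse({ projectName: "", environment: "preview" }); // throws ZodError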

View File

@@ -0,0 +1,19 @@
import z from "zod";
import { TCloudflareConnection } from "@app/services/app-connection/cloudflare/cloudflare-connection-types";
import {
CloudflarePagesSyncListItemSchema,
CloudflarePagesSyncSchema,
CreateCloudflarePagesSyncSchema
} from "./cloudflare-pages-schema";
export type TCloudflarePagesSyncListItem = z.infer<typeof CloudflarePagesSyncListItemSchema>;
export type TCloudflarePagesSync = z.infer<typeof CloudflarePagesSyncSchema>;
export type TCloudflarePagesSyncInput = z.infer<typeof CreateCloudflarePagesSyncSchema>;
export type TCloudflarePagesSyncWithCredentials = TCloudflarePagesSync & {
connection: TCloudflareConnection;
};

View File

@@ -0,0 +1,10 @@
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { TSecretSyncListItem } from "@app/services/secret-sync/secret-sync-types";
export const GITLAB_SYNC_LIST_OPTION: TSecretSyncListItem = {
name: "GitLab",
destination: SecretSync.GitLab,
connection: AppConnection.GitLab,
canImportSecrets: false
};

View File

@@ -0,0 +1,4 @@
export enum GitLabSyncScope {
Project = "project",
Group = "group"
}

View File

@@ -0,0 +1,452 @@
/* eslint-disable no-await-in-loop */
import { GitbeakerRequestError } from "@gitbeaker/rest";
import { TAppConnectionDALFactory } from "@app/services/app-connection/app-connection-dal";
import {
getGitLabClient,
GitLabConnectionMethod,
refreshGitLabToken,
TGitLabConnection
} from "@app/services/app-connection/gitlab";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TGitLabSyncWithCredentials, TGitLabVariable } from "@app/services/secret-sync/gitlab/gitlab-sync-types";
import { SecretSyncError } from "@app/services/secret-sync/secret-sync-errors";
import { matchesSchema } from "@app/services/secret-sync/secret-sync-fns";
import { TSecretMap } from "@app/services/secret-sync/secret-sync-types";
import { SECRET_SYNC_NAME_MAP } from "../secret-sync-maps";
import { GitLabSyncScope } from "./gitlab-sync-enums";
interface TGitLabVariablePayload {
key?: string;
value: string;
variable_type?: "env_var" | "file";
environment_scope?: string;
protected?: boolean;
masked?: boolean;
masked_and_hidden?: boolean;
description?: string;
}
interface TGitLabVariableCreate extends TGitLabVariablePayload {
key: string;
}
interface TGitLabVariableUpdate extends Omit<TGitLabVariablePayload, "key"> {}
type TGitLabSyncFactoryDeps = {
appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
};
const getValidAccessToken = async (
connection: TGitLabConnection,
appConnectionDAL: Pick<TAppConnectionDALFactory, "updateById">,
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">
): Promise<string> => {
if (
connection.method === GitLabConnectionMethod.OAuth &&
connection.credentials.refreshToken &&
new Date(connection.credentials.expiresAt) < new Date()
) {
const accessToken = await refreshGitLabToken(
connection.credentials.refreshToken,
connection.id,
connection.orgId,
appConnectionDAL,
kmsService,
connection.credentials.instanceUrl
);
return accessToken;
}
return connection.credentials.accessToken;
};
const getGitLabVariables = async ({
accessToken,
connection,
scope,
resourceId,
targetEnvironment
}: {
accessToken: string;
connection: TGitLabConnection;
scope: GitLabSyncScope;
resourceId: string;
targetEnvironment?: string;
}): Promise<TGitLabVariable[]> => {
try {
const client = await getGitLabClient(
accessToken,
connection.credentials.instanceUrl,
connection.method === GitLabConnectionMethod.OAuth
);
let variables: TGitLabVariable[] = [];
if (scope === GitLabSyncScope.Project) {
variables = await client.ProjectVariables.all(resourceId);
} else {
variables = await client.GroupVariables.all(resourceId);
}
if (targetEnvironment) {
variables = variables.filter((v) => v.environmentScope === targetEnvironment);
}
return variables;
} catch (error) {
if (error instanceof GitbeakerRequestError) {
throw new SecretSyncError({
error: new Error(
`Failed to fetch variables: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
)
});
}
throw new SecretSyncError({
error
});
}
};
const createGitLabVariable = async ({
accessToken,
connection,
scope,
resourceId,
variable
}: {
accessToken: string;
connection: TGitLabConnection;
scope: GitLabSyncScope;
resourceId: string;
variable: TGitLabVariableCreate;
}): Promise<void> => {
try {
const client = await getGitLabClient(
accessToken,
connection.credentials.instanceUrl,
connection.method === GitLabConnectionMethod.OAuth
);
const payload = {
key: variable.key,
value: variable.value,
variableType: "env_var",
environmentScope: variable.environment_scope || "*",
protected: variable.protected || false,
masked: variable.masked || false,
masked_and_hidden: variable.masked_and_hidden || false,
raw: false
};
if (scope === GitLabSyncScope.Project) {
await client.ProjectVariables.create(resourceId, payload.key, payload.value, {
variableType: "env_var",
environmentScope: payload.environmentScope,
protected: payload.protected,
masked: payload.masked,
masked_and_hidden: payload.masked_and_hidden,
raw: false
});
} else {
await client.GroupVariables.create(resourceId, payload.key, payload.value, {
variableType: "env_var",
environmentScope: payload.environmentScope,
protected: payload.protected,
masked: payload.masked,
...(payload.masked_and_hidden && { masked_and_hidden: payload.masked_and_hidden }),
raw: false
});
}
} catch (error) {
if (error instanceof GitbeakerRequestError) {
throw new SecretSyncError({
error: new Error(
`Failed to create variable: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
),
secretKey: variable.key
});
}
throw new SecretSyncError({
error,
secretKey: variable.key
});
}
};
const updateGitLabVariable = async ({
accessToken,
connection,
scope,
resourceId,
key,
variable,
targetEnvironment
}: {
accessToken: string;
connection: TGitLabConnection;
scope: GitLabSyncScope;
resourceId: string;
key: string;
variable: TGitLabVariableUpdate;
targetEnvironment?: string;
}): Promise<void> => {
try {
const client = await getGitLabClient(
accessToken,
connection.credentials.instanceUrl,
connection.method === GitLabConnectionMethod.OAuth
);
const options = {
...(variable.environment_scope && { environmentScope: variable.environment_scope }),
...(variable.protected !== undefined && { protected: variable.protected }),
...(variable.masked !== undefined && { masked: variable.masked })
};
if (targetEnvironment) {
options.environmentScope = targetEnvironment;
}
if (scope === GitLabSyncScope.Project) {
await client.ProjectVariables.edit(resourceId, key, variable.value, {
...options,
filter: { environment_scope: targetEnvironment || "*" }
});
} else {
await client.GroupVariables.edit(resourceId, key, variable.value, {
...options,
filter: { environment_scope: targetEnvironment || "*" }
});
}
} catch (error) {
if (error instanceof GitbeakerRequestError) {
throw new SecretSyncError({
error: new Error(
`Failed to update variable: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
),
secretKey: key
});
}
throw new SecretSyncError({
error,
secretKey: key
});
}
};
const deleteGitLabVariable = async ({
accessToken,
connection,
scope,
resourceId,
key,
targetEnvironment,
allVariables
}: {
accessToken: string;
connection: TGitLabConnection;
scope: GitLabSyncScope;
resourceId: string;
key: string;
targetEnvironment?: string;
allVariables?: TGitLabVariable[];
}): Promise<void> => {
if (allVariables && !allVariables.find((v) => v.key === key)) {
return;
}
try {
const client = await getGitLabClient(
accessToken,
connection.credentials.instanceUrl,
connection.method === GitLabConnectionMethod.OAuth
);
const options: { filter?: { environment_scope: string } } = {};
if (targetEnvironment) {
options.filter = { environment_scope: targetEnvironment || "*" };
}
if (scope === GitLabSyncScope.Project) {
await client.ProjectVariables.remove(resourceId, key, options);
} else {
await client.GroupVariables.remove(resourceId, key);
}
} catch (error: unknown) {
if (error instanceof GitbeakerRequestError) {
throw new SecretSyncError({
error: new Error(
`Failed to delete variable: ${error.message ?? "Unknown error"}${error.cause?.description && error.message !== "Unauthorized" ? `. Cause: ${error.cause.description}` : ""}`
),
secretKey: key
});
}
throw new SecretSyncError({
error,
secretKey: key
});
}
};
export const GitLabSyncFns = {
syncSecrets: async (
secretSync: TGitLabSyncWithCredentials,
secretMap: TSecretMap,
{ appConnectionDAL, kmsService }: TGitLabSyncFactoryDeps
): Promise<void> => {
const { connection, environment, destinationConfig } = secretSync;
const { scope, targetEnvironment } = destinationConfig;
const resourceId = scope === GitLabSyncScope.Project ? destinationConfig.projectId : destinationConfig.groupId;
const accessToken = await getValidAccessToken(connection, appConnectionDAL, kmsService);
try {
const currentVariables = await getGitLabVariables({
accessToken,
connection,
scope,
resourceId,
targetEnvironment
});
const currentVariableMap = new Map(currentVariables.map((v) => [v.key, v]));
for (const [key, { value }] of Object.entries(secretMap)) {
if (value?.length < 8 && destinationConfig.shouldMaskSecrets) {
throw new SecretSyncError({
message: `Secret ${key} is too short to be masked. GitLab requires a minimum of 8 characters for masked secrets.`,
secretKey: key
});
}
try {
const existingVariable = currentVariableMap.get(key);
if (existingVariable) {
if (
existingVariable.value !== value ||
existingVariable.environmentScope !== targetEnvironment ||
existingVariable.protected !== destinationConfig.shouldProtectSecrets ||
existingVariable.masked !== destinationConfig.shouldMaskSecrets
) {
await updateGitLabVariable({
accessToken,
connection,
scope,
resourceId,
key,
variable: {
value,
environment_scope: targetEnvironment,
protected: destinationConfig.shouldProtectSecrets,
masked: destinationConfig.shouldMaskSecrets || existingVariable.hidden
},
targetEnvironment
});
}
} else {
await createGitLabVariable({
accessToken,
connection,
scope,
resourceId,
variable: {
key,
value,
variable_type: "env_var",
environment_scope: targetEnvironment || "*",
protected: destinationConfig.shouldProtectSecrets || false,
masked: destinationConfig.shouldMaskSecrets || false,
masked_and_hidden: destinationConfig.shouldHideSecrets || false
}
});
}
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
if (!secretSync.syncOptions.disableSecretDeletion) {
for (const variable of currentVariables) {
try {
const shouldDelete =
matchesSchema(variable.key, environment?.slug || "", secretSync.syncOptions.keySchema) &&
!(variable.key in secretMap);
if (shouldDelete) {
await deleteGitLabVariable({
accessToken,
connection,
scope,
resourceId,
key: variable.key,
targetEnvironment
});
}
} catch (error) {
throw new SecretSyncError({
error,
secretKey: variable.key
});
}
}
}
} catch (error) {
if (error instanceof SecretSyncError) {
throw error;
}
throw new SecretSyncError({
message: "Failed to sync secrets",
error
});
}
},
removeSecrets: async (
secretSync: TGitLabSyncWithCredentials,
secretMap: TSecretMap,
{ appConnectionDAL, kmsService }: TGitLabSyncFactoryDeps
): Promise<void> => {
const { connection, destinationConfig } = secretSync;
const { scope, targetEnvironment } = destinationConfig;
const resourceId = scope === GitLabSyncScope.Project ? destinationConfig.projectId : destinationConfig.groupId;
const accessToken = await getValidAccessToken(connection, appConnectionDAL, kmsService);
const allVariables = await getGitLabVariables({
accessToken,
connection,
scope,
resourceId,
targetEnvironment
});
for (const key of Object.keys(secretMap)) {
try {
await deleteGitLabVariable({
accessToken,
connection,
scope,
resourceId,
key,
targetEnvironment,
allVariables
});
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
},
getSecrets: async (secretSync: TGitLabSyncWithCredentials): Promise<TSecretMap> => {
throw new Error(`${SECRET_SYNC_NAME_MAP[secretSync.destination]} does not support importing secrets.`);
}
};
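GitLab rejects masked variables shorter than eight characters, which is why syncSecrets fails fast above when shouldMaskSecrets is set; a small standalone sketch of that rule (not part of the module) is:

// Sketch: GitLab only allows masking values that are at least 8 characters long.
const canBeMasked = (value: string): boolean => value.length >= 8;

const secrets = { SHORT_TOKEN: "abc123", LONG_TOKEN: "abcdef123456" };
const tooShortToMask = Object.entries(secrets)
  .filter(([, value]) => !canBeMasked(value))
  .map(([key]) => key);
// tooShortToMask === ["SHORT_TOKEN"]; these would be rejected when masking is requested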

View File

@@ -0,0 +1,97 @@
import { z } from "zod";
import { SecretSyncs } from "@app/lib/api-docs";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import {
BaseSecretSyncSchema,
GenericCreateSecretSyncFieldsSchema,
GenericUpdateSecretSyncFieldsSchema
} from "@app/services/secret-sync/secret-sync-schemas";
import { TSyncOptionsConfig } from "@app/services/secret-sync/secret-sync-types";
import { GitLabSyncScope } from "./gitlab-sync-enums";
const GitLabSyncDestinationConfigSchema = z.discriminatedUnion("scope", [
z.object({
scope: z.literal(GitLabSyncScope.Project).describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.scope),
projectId: z.string().min(1, "Project ID is required").describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.projectId),
projectName: z
.string()
.min(1, "Project name is required")
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.projectName),
targetEnvironment: z
.string()
.optional()
.default("*")
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.targetEnvironment),
shouldProtectSecrets: z
.boolean()
.optional()
.default(false)
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.shouldProtectSecrets),
shouldMaskSecrets: z
.boolean()
.optional()
.default(false)
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.shouldMaskSecrets),
shouldHideSecrets: z
.boolean()
.optional()
.default(false)
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.shouldHideSecrets)
}),
z.object({
scope: z.literal(GitLabSyncScope.Group).describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.scope),
groupId: z.string().min(1, "Group ID is required").describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.groupId),
groupName: z.string().min(1, "Group name is required").describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.groupName),
targetEnvironment: z
.string()
.optional()
.default("*")
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.targetEnvironment),
shouldProtectSecrets: z
.boolean()
.optional()
.default(false)
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.shouldProtectSecrets),
shouldMaskSecrets: z
.boolean()
.optional()
.default(false)
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.shouldMaskSecrets),
shouldHideSecrets: z
.boolean()
.optional()
.default(false)
.describe(SecretSyncs.DESTINATION_CONFIG.GITLAB.shouldHideSecrets)
})
]);
const GitLabSyncOptionsConfig: TSyncOptionsConfig = { canImportSecrets: false };
export const GitLabSyncSchema = BaseSecretSyncSchema(SecretSync.GitLab, GitLabSyncOptionsConfig).extend({
destination: z.literal(SecretSync.GitLab),
destinationConfig: GitLabSyncDestinationConfigSchema
});
export const CreateGitLabSyncSchema = GenericCreateSecretSyncFieldsSchema(
SecretSync.GitLab,
GitLabSyncOptionsConfig
).extend({
destinationConfig: GitLabSyncDestinationConfigSchema
});
export const UpdateGitLabSyncSchema = GenericUpdateSecretSyncFieldsSchema(
SecretSync.GitLab,
GitLabSyncOptionsConfig
).extend({
destinationConfig: GitLabSyncDestinationConfigSchema.optional()
});
export const GitLabSyncListItemSchema = z.object({
name: z.literal("GitLab"),
connection: z.literal(AppConnection.GitLab),
destination: z.literal(SecretSync.GitLab),
canImportSecrets: z.literal(false)
});
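Because the destination config is a zod discriminated union on scope, a project-scoped payload must carry projectId/projectName while a group-scoped one must carry groupId/groupName; a simplified mirror of that union (the schema above is not exported) behaves like this:

import { z } from "zod";

// Sketch: simplified mirror of the scope-discriminated union used above.
const GitLabConfigSketch = z.discriminatedUnion("scope", [
  z.object({ scope: z.literal("project"), projectId: z.string().min(1), projectName: z.string().min(1) }),
  z.object({ scope: z.literal("group"), groupId: z.string().min(1), groupName: z.string().min(1) })
]);

GitLabConfigSketch.parse({ scope: "project", projectId: "123", projectName: "backend" }); // ok
// GitLabConfigSketch.parse({ scope: "group", projectId: "123" }); // throws: groupId/groupName required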

View File

@@ -0,0 +1,58 @@
import { z } from "zod";
import { TGitLabConnection } from "@app/services/app-connection/gitlab";
import { CreateGitLabSyncSchema, GitLabSyncListItemSchema, GitLabSyncSchema } from "./gitlab-sync-schemas";
export type TGitLabSync = z.infer<typeof GitLabSyncSchema>;
export type TGitLabSyncInput = z.infer<typeof CreateGitLabSyncSchema>;
export type TGitLabSyncListItem = z.infer<typeof GitLabSyncListItemSchema>;
export type TGitLabSyncWithCredentials = TGitLabSync & {
connection: TGitLabConnection;
};
export type TGitLabVariable = {
key: string;
value: string;
protected: boolean;
masked: boolean;
environmentScope?: string;
hidden?: boolean;
};
export type TGitLabVariableCreate = {
key: string;
value: string;
variable_type?: "env_var" | "file";
protected?: boolean;
masked?: boolean;
raw?: boolean;
environment_scope?: string;
description?: string;
};
export type TGitLabVariableUpdate = {
value: string;
variable_type?: "env_var" | "file";
protected?: boolean;
masked?: boolean;
raw?: boolean;
environment_scope?: string;
description?: string | null;
};
export type TGitLabListVariables = {
accessToken: string;
projectId: string;
environmentScope?: string;
};
export type TGitLabCreateVariable = TGitLabListVariables & {
variable: TGitLabVariableCreate;
};
export type TGitLabUpdateVariable = TGitLabListVariables & {
key: string;
variable: TGitLabVariableUpdate;
};

View File

@@ -0,0 +1,4 @@
export * from "./gitlab-sync-constants";
export * from "./gitlab-sync-fns";
export * from "./gitlab-sync-schemas";
export * from "./gitlab-sync-types";

View File

@@ -18,7 +18,9 @@ export enum SecretSync {
OnePass = "1password",
Heroku = "heroku",
Render = "render",
Flyio = "flyio"
Flyio = "flyio",
GitLab = "gitlab",
CloudflarePages = "cloudflare-pages"
}
export enum SecretSyncInitialSyncBehavior {

View File

@@ -29,9 +29,12 @@ import { AZURE_APP_CONFIGURATION_SYNC_LIST_OPTION, azureAppConfigurationSyncFact
import { AZURE_DEVOPS_SYNC_LIST_OPTION, azureDevOpsSyncFactory } from "./azure-devops";
import { AZURE_KEY_VAULT_SYNC_LIST_OPTION, azureKeyVaultSyncFactory } from "./azure-key-vault";
import { CAMUNDA_SYNC_LIST_OPTION, camundaSyncFactory } from "./camunda";
import { CLOUDFLARE_PAGES_SYNC_LIST_OPTION } from "./cloudflare-pages/cloudflare-pages-constants";
import { CloudflarePagesSyncFns } from "./cloudflare-pages/cloudflare-pages-fns";
import { FLYIO_SYNC_LIST_OPTION, FlyioSyncFns } from "./flyio";
import { GCP_SYNC_LIST_OPTION } from "./gcp";
import { GcpSyncFns } from "./gcp/gcp-sync-fns";
import { GITLAB_SYNC_LIST_OPTION, GitLabSyncFns } from "./gitlab";
import { HC_VAULT_SYNC_LIST_OPTION, HCVaultSyncFns } from "./hc-vault";
import { HEROKU_SYNC_LIST_OPTION, HerokuSyncFns } from "./heroku";
import { HUMANITEC_SYNC_LIST_OPTION } from "./humanitec";
@@ -63,7 +66,9 @@ const SECRET_SYNC_LIST_OPTIONS: Record<SecretSync, TSecretSyncListItem> = {
[SecretSync.OnePass]: ONEPASS_SYNC_LIST_OPTION,
[SecretSync.Heroku]: HEROKU_SYNC_LIST_OPTION,
[SecretSync.Render]: RENDER_SYNC_LIST_OPTION,
[SecretSync.Flyio]: FLYIO_SYNC_LIST_OPTION,
[SecretSync.GitLab]: GITLAB_SYNC_LIST_OPTION,
[SecretSync.CloudflarePages]: CLOUDFLARE_PAGES_SYNC_LIST_OPTION
};
export const listSecretSyncOptions = () => {
@@ -227,6 +232,10 @@ export const SecretSyncFns = {
return RenderSyncFns.syncSecrets(secretSync, schemaSecretMap);
case SecretSync.Flyio:
return FlyioSyncFns.syncSecrets(secretSync, schemaSecretMap);
case SecretSync.GitLab:
return GitLabSyncFns.syncSecrets(secretSync, schemaSecretMap, { appConnectionDAL, kmsService });
case SecretSync.CloudflarePages:
return CloudflarePagesSyncFns.syncSecrets(secretSync, schemaSecretMap);
default:
throw new Error(
`Unhandled sync destination for sync secrets fns: ${(secretSync as TSecretSyncWithCredentials).destination}`
@@ -313,6 +322,12 @@ export const SecretSyncFns = {
case SecretSync.Flyio:
secretMap = await FlyioSyncFns.getSecrets(secretSync);
break;
case SecretSync.GitLab:
secretMap = await GitLabSyncFns.getSecrets(secretSync);
break;
case SecretSync.CloudflarePages:
secretMap = await CloudflarePagesSyncFns.getSecrets(secretSync);
break;
default:
throw new Error(
`Unhandled sync destination for get secrets fns: ${(secretSync as TSecretSyncWithCredentials).destination}`
@@ -386,6 +401,10 @@ export const SecretSyncFns = {
return RenderSyncFns.removeSecrets(secretSync, schemaSecretMap);
case SecretSync.Flyio:
return FlyioSyncFns.removeSecrets(secretSync, schemaSecretMap);
case SecretSync.GitLab:
return GitLabSyncFns.removeSecrets(secretSync, schemaSecretMap, { appConnectionDAL, kmsService });
case SecretSync.CloudflarePages:
return CloudflarePagesSyncFns.removeSecrets(secretSync, schemaSecretMap);
default:
throw new Error(
`Unhandled sync destination for remove secrets fns: ${(secretSync as TSecretSyncWithCredentials).destination}`

View File

@@ -21,7 +21,9 @@ export const SECRET_SYNC_NAME_MAP: Record<SecretSync, string> = {
[SecretSync.OnePass]: "1Password",
[SecretSync.Heroku]: "Heroku",
[SecretSync.Render]: "Render",
[SecretSync.Flyio]: "Fly.io"
[SecretSync.Flyio]: "Fly.io",
[SecretSync.GitLab]: "GitLab",
[SecretSync.CloudflarePages]: "Cloudflare Pages"
};
export const SECRET_SYNC_CONNECTION_MAP: Record<SecretSync, AppConnection> = {
@@ -44,7 +46,9 @@ export const SECRET_SYNC_CONNECTION_MAP: Record<SecretSync, AppConnection> = {
[SecretSync.OnePass]: AppConnection.OnePass,
[SecretSync.Heroku]: AppConnection.Heroku,
[SecretSync.Render]: AppConnection.Render,
[SecretSync.Flyio]: AppConnection.Flyio,
[SecretSync.GitLab]: AppConnection.GitLab,
[SecretSync.CloudflarePages]: AppConnection.Cloudflare
};
export const SECRET_SYNC_PLAN_MAP: Record<SecretSync, SecretSyncPlanType> = {
@@ -67,5 +71,7 @@ export const SECRET_SYNC_PLAN_MAP: Record<SecretSync, SecretSyncPlanType> = {
[SecretSync.OnePass]: SecretSyncPlanType.Regular,
[SecretSync.Heroku]: SecretSyncPlanType.Regular,
[SecretSync.Render]: SecretSyncPlanType.Regular,
[SecretSync.Flyio]: SecretSyncPlanType.Regular,
[SecretSync.GitLab]: SecretSyncPlanType.Regular,
[SecretSync.CloudflarePages]: SecretSyncPlanType.Regular
};
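With these records keyed by the SecretSync enum, resolving a destination's display name or required app connection is a plain lookup; for example, assuming the maps are imported from this module:

// Sketch: resolving metadata for a destination via the maps above (import path assumed).
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { SECRET_SYNC_CONNECTION_MAP, SECRET_SYNC_NAME_MAP } from "@app/services/secret-sync/secret-sync-maps";

const destination = SecretSync.CloudflarePages;
const displayName = SECRET_SYNC_NAME_MAP[destination]; // "Cloudflare Pages"
const requiredConnection = SECRET_SYNC_CONNECTION_MAP[destination]; // AppConnection.Cloudflare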

View File

@@ -72,8 +72,15 @@ import {
TAzureKeyVaultSyncListItem,
TAzureKeyVaultSyncWithCredentials
} from "./azure-key-vault";
import {
TCloudflarePagesSync,
TCloudflarePagesSyncInput,
TCloudflarePagesSyncListItem,
TCloudflarePagesSyncWithCredentials
} from "./cloudflare-pages/cloudflare-pages-types";
import { TFlyioSync, TFlyioSyncInput, TFlyioSyncListItem, TFlyioSyncWithCredentials } from "./flyio/flyio-sync-types";
import { TGcpSync, TGcpSyncInput, TGcpSyncListItem, TGcpSyncWithCredentials } from "./gcp";
import { TGitLabSync, TGitLabSyncInput, TGitLabSyncListItem, TGitLabSyncWithCredentials } from "./gitlab";
import {
THCVaultSync,
THCVaultSyncInput,
@@ -127,7 +134,9 @@ export type TSecretSync =
| TOnePassSync
| THerokuSync
| TRenderSync
| TFlyioSync
| TGitLabSync
| TCloudflarePagesSync;
export type TSecretSyncWithCredentials =
| TAwsParameterStoreSyncWithCredentials
@@ -149,7 +158,9 @@ export type TSecretSyncWithCredentials =
| TOnePassSyncWithCredentials
| THerokuSyncWithCredentials
| TRenderSyncWithCredentials
| TFlyioSyncWithCredentials
| TGitLabSyncWithCredentials
| TCloudflarePagesSyncWithCredentials;
export type TSecretSyncInput =
| TAwsParameterStoreSyncInput
@@ -171,7 +182,9 @@ export type TSecretSyncInput =
| TOnePassSyncInput
| THerokuSyncInput
| TRenderSyncInput
| TFlyioSyncInput
| TGitLabSyncInput
| TCloudflarePagesSyncInput;
export type TSecretSyncListItem =
| TAwsParameterStoreSyncListItem
@@ -193,7 +206,9 @@ export type TSecretSyncListItem =
| TOnePassSyncListItem
| THerokuSyncListItem
| TRenderSyncListItem
| TFlyioSyncListItem
| TGitLabSyncListItem
| TCloudflarePagesSyncListItem;
export type TSyncOptionsConfig = {
canImportSecrets: boolean;

View File

@@ -1543,9 +1543,8 @@ export const secretServiceFactory = ({
actor,
environment,
viewSecretValue,
projectId,
expandSecretReferences,
actorId,
actorOrgId,
actorAuthMethod,
@@ -1553,7 +1552,6 @@ export const secretServiceFactory = ({
includeImports,
version
}: TGetASecretRawDTO) => {
const { botKey, shouldUseSecretV2Bridge } = await projectBotService.getBotKey(projectId);
if (shouldUseSecretV2Bridge) {
const secret = await secretV2BridgeService.getSecretByName({

View File

@@ -229,8 +229,7 @@ export type TGetASecretRawDTO = {
type: "shared" | "personal";
includeImports?: boolean;
version?: number;
projectId: string;
} & Omit<TProjectPermission, "projectId">;
export type TGetASecretByIdRawDTO = {

View File

@@ -1,53 +0,0 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { TPosthogAggregatedEvents } from "@app/db/schemas/posthog-aggregated-events";
import { DatabaseError } from "@app/lib/errors";
import { buildFindFilter, ormify, selectAllTableCols } from "@app/lib/knex";
export type TPosthogAggregatedEventsDALFactory = ReturnType<typeof posthogAggregatedEventsDALFactory>;
export const posthogAggregatedEventsDALFactory = (db: TDbClient) => {
const posthogAggregatedEventsOrm = ormify(db, TableName.PosthogAggregatedEvents);
const getAggregatedEvent = async (
batchId: string,
distinctId: string,
event: string,
tx?: Knex
): Promise<TPosthogAggregatedEvents | undefined> => {
try {
const doc = await (tx || db.replicaNode())(TableName.PosthogAggregatedEvents)
// eslint-disable-next-line @typescript-eslint/no-misused-promises
.where(buildFindFilter({ batchId, distinctId, event }, TableName.PosthogAggregatedEvents))
.select(selectAllTableCols(TableName.PosthogAggregatedEvents))
.first();
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "Get last hour aggregated events" });
}
};
const getAllAggregatedEvent = async (
batchId: string,
event: string,
tx?: Knex
): Promise<TPosthogAggregatedEvents[]> => {
try {
const docs = await (tx || db.replicaNode())(TableName.PosthogAggregatedEvents)
// eslint-disable-next-line @typescript-eslint/no-misused-promises
.where(buildFindFilter({ batchId, event }, TableName.PosthogAggregatedEvents))
.select(selectAllTableCols(TableName.PosthogAggregatedEvents));
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "Get last hour aggregated events" });
}
};
return {
...posthogAggregatedEventsOrm,
getAggregatedEvent,
getAllAggregatedEvent
};
};

View File

@@ -55,11 +55,7 @@ export const telemetryQueueServiceFactory = ({
});
queueService.start(QueueName.TelemetryAggregatedEvents, async () => {
await telemetryService.processAggregatedEvents();
});
// every day at midnight a telemetry job executes on self-hosted instances
@@ -75,11 +71,11 @@ export const telemetryQueueServiceFactory = ({
QueueName.TelemetryInstanceStats // just a job id
);
// clear previous aggregated events job
await queueService.stopRepeatableJob(
QueueName.TelemetryAggregatedEvents,
QueueJobs.TelemetryAggregatedEvents,
{ pattern: "*/5 * * * *", utc: true },
QueueName.TelemetryAggregatedEvents // just a job id
);
@@ -89,10 +85,10 @@ export const telemetryQueueServiceFactory = ({
repeat: { pattern: "0 0 * * *", utc: true }
});
// Start aggregated events job (runs every five minutes)
await queueService.queue(QueueName.TelemetryAggregatedEvents, QueueJobs.TelemetryAggregatedEvents, undefined, {
jobId: QueueName.TelemetryAggregatedEvents,
repeat: { pattern: "*/5 * * * *", utc: true }
});
}
};

View File

@@ -1,3 +1,4 @@
import { createHash, randomUUID } from "crypto";
import { PostHog } from "posthog-node";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
@@ -7,41 +8,55 @@ import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { logger } from "@app/lib/logger";
import { TPosthogAggregatedEventsDALFactory } from "./posthog-aggregated-events-dal";
import { PostHogEventTypes, TPostHogEvent, TSecretModifiedEvent } from "./telemetry-types";
export const TELEMETRY_SECRET_PROCESSED_KEY = "telemetry-secret-processed";
export const TELEMETRY_SECRET_OPERATIONS_KEY = "telemetry-secret-operations";
export const POSTHOG_AGGREGATED_EVENTS = [PostHogEventTypes.SecretPulled];
const TELEMETRY_AGGREGATED_KEY_EXP = 900; // 15mins
// Bucket configuration
const TELEMETRY_BUCKET_COUNT = 30;
const TELEMETRY_BUCKET_NAMES = Array.from(
{ length: TELEMETRY_BUCKET_COUNT },
(_, i) => `bucket-${i.toString().padStart(2, "0")}`
);
type AggregatedEventData = Record<string, unknown>;
type SingleEventData = {
distinctId: string;
event: string;
properties: unknown;
organizationId: string;
};
export type TTelemetryServiceFactory = ReturnType<typeof telemetryServiceFactory>;
export type TTelemetryServiceFactoryDep = {
keyStore: Pick<TKeyStoreFactory, "getItem" | "incrementBy" | "setItem" | "deleteItem" | "setItemWithExpiry">;
keyStore: Pick<
TKeyStoreFactory,
"incrementBy" | "deleteItemsByKeyIn" | "setItemWithExpiry" | "getKeysByPattern" | "getItems"
>;
licenseService: Pick<TLicenseServiceFactory, "getInstanceType">;
posthogAggregatedEventsDAL: TPosthogAggregatedEventsDALFactory;
};
const getBucketForDistinctId = (distinctId: string): string => {
// Use SHA-256 hash for consistent distribution
const hash = createHash("sha256").update(distinctId).digest("hex");
// Take first 8 characters and convert to number for better distribution
const hashNumber = parseInt(hash.substring(0, 8), 16);
const bucketIndex = hashNumber % TELEMETRY_BUCKET_COUNT;
return TELEMETRY_BUCKET_NAMES[bucketIndex];
};
export const createTelemetryEventKey = (event: string, distinctId: string): string => {
const bucketId = getBucketForDistinctId(distinctId);
return `telemetry-event-${event}-${bucketId}-${distinctId}-${randomUUID()}`;
};
export const telemetryServiceFactory = ({ keyStore, licenseService }: TTelemetryServiceFactoryDep) => {
const appCfg = getConfig();
if (appCfg.isProductionMode && !appCfg.TELEMETRY_ENABLED) {
@@ -82,115 +97,36 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
}
};
const sendPostHogEvents = async (event: TPostHogEvent) => {
if (postHog) {
const instanceType = licenseService.getInstanceType();
// capture posthog only when its cloud or signup event happens in self-hosted
if (instanceType === InstanceType.Cloud || event.event === PostHogEventTypes.UserSignedUp) {
if (event.organizationId) {
try {
postHog.groupIdentify({ groupType: "organization", groupKey: event.organizationId });
} catch (error) {
logger.error(error, "Failed to identify PostHog organization");
}
}
if (POSTHOG_AGGREGATED_EVENTS.includes(event.event)) {
const eventKey = createTelemetryEventKey(event.event, event.distinctId);
await keyStore.setItemWithExpiry(
eventKey,
TELEMETRY_AGGREGATED_KEY_EXP,
JSON.stringify({
distinctId: event.distinctId,
event: event.event,
properties: event.properties,
organizationId: event.organizationId
})
);
} else {
postHog.capture({
event: event.event,
distinctId: event.distinctId,
properties: event.properties,
...(event.organizationId ? { groups: { organization: event.organizationId } } : {})
});
}
return;
@@ -213,40 +149,157 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
}
};
const aggregateGroupProperties = (events: SingleEventData[]): AggregatedEventData => {
const aggregatedData: AggregatedEventData = {};
// Set the total count
aggregatedData.count = events.length;
events.forEach((event) => {
if (!event.properties) return;
Object.entries(event.properties as Record<string, unknown>).forEach(([key, value]: [string, unknown]) => {
if (Array.isArray(value)) {
// For arrays, count occurrences of each item
const existingCounts =
aggregatedData[key] &&
typeof aggregatedData[key] === "object" &&
aggregatedData[key]?.constructor === Object
? (aggregatedData[key] as Record<string, number>)
: {};
value.forEach((item) => {
const itemKey = typeof item === "object" ? JSON.stringify(item) : String(item);
existingCounts[itemKey] = (existingCounts[itemKey] || 0) + 1;
});
aggregatedData[key] = existingCounts;
} else if (typeof value === "object" && value?.constructor === Object) {
// For objects, count occurrences of each field value
const existingCounts =
aggregatedData[key] &&
typeof aggregatedData[key] === "object" &&
aggregatedData[key]?.constructor === Object
? (aggregatedData[key] as Record<string, number>)
: {};
if (value) {
Object.values(value).forEach((fieldValue) => {
const valueKey = typeof fieldValue === "object" ? JSON.stringify(fieldValue) : String(fieldValue);
existingCounts[valueKey] = (existingCounts[valueKey] || 0) + 1;
});
}
aggregatedData[key] = existingCounts;
} else if (typeof value === "number") {
// For numbers, add to existing sum
aggregatedData[key] = ((aggregatedData[key] as number) || 0) + value;
} else if (value !== undefined && value !== null) {
// For other types (strings, booleans, etc.), count occurrences
const stringValue = String(value);
const existingValue = aggregatedData[key];
if (!existingValue) {
aggregatedData[key] = { [stringValue]: 1 };
} else if (existingValue && typeof existingValue === "object" && existingValue.constructor === Object) {
const countObject = existingValue as Record<string, number>;
countObject[stringValue] = (countObject[stringValue] || 0) + 1;
} else {
const oldValue = String(existingValue);
aggregatedData[key] = {
[oldValue]: 1,
[stringValue]: 1
};
}
}
});
});
return aggregatedData;
};
const processBucketEvents = async (eventType: string, bucketId: string) => {
if (!postHog) return 0;
try {
const bucketPattern = `telemetry-event-${eventType}-${bucketId}-*`;
const bucketKeys = await keyStore.getKeysByPattern(bucketPattern);
if (bucketKeys.length === 0) return 0;
const bucketEvents = await keyStore.getItems(bucketKeys);
let bucketEventsParsed: SingleEventData[] = [];
try {
bucketEventsParsed = bucketEvents
.filter((event) => event !== null)
.map((event) => JSON.parse(event as string) as SingleEventData);
} catch (error) {
logger.error(error, `Failed to parse bucket events for ${eventType} in ${bucketId}`);
return 0;
}
const eventsGrouped = new Map<string, SingleEventData[]>();
bucketEventsParsed.forEach((event) => {
const key = JSON.stringify({ id: event.distinctId, org: event.organizationId });
if (!eventsGrouped.has(key)) {
eventsGrouped.set(key, []);
}
eventsGrouped.get(key)!.push(event);
});
if (eventsGrouped.size === 0) return 0;
for (const [eventsKey, events] of eventsGrouped) {
const key = JSON.parse(eventsKey) as { id: string; org?: string };
if (key.org) {
try {
postHog.groupIdentify({ groupType: "organization", groupKey: key.org });
} catch (error) {
logger.error(error, "Failed to identify PostHog organization");
}
}
const properties = aggregateGroupProperties(events);
postHog.capture({
event: `${eventType} aggregated`,
distinctId: key.id,
properties,
...(key.org ? { groups: { organization: key.org } } : {})
});
}
// Clean up processed data for this bucket
await keyStore.deleteItemsByKeyIn(bucketKeys);
logger.info(`Processed ${bucketEventsParsed.length} events from bucket ${bucketId} for ${eventType}`);
return bucketEventsParsed.length;
} catch (error) {
logger.error(error, `Failed to process bucket ${bucketId} for ${eventType}`);
return 0;
}
};
const processAggregatedEvents = async () => {
if (!postHog) return;
for (const eventType of POSTHOG_AGGREGATED_EVENTS) {
let totalProcessed = 0;
logger.info(`Starting bucket processing for ${eventType}`);
// Process each bucket sequentially to control memory usage
for (const bucketId of TELEMETRY_BUCKET_NAMES) {
try {
// eslint-disable-next-line no-await-in-loop
const processed = await processBucketEvents(eventType, bucketId);
totalProcessed += processed;
} catch (error) {
logger.error(error, `Failed to process bucket ${bucketId} for ${eventType}`);
}
}
logger.info(`Completed processing ${totalProcessed} total events for ${eventType}`);
}
};
@@ -260,6 +313,7 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
sendLoopsEvent,
sendPostHogEvents,
processAggregatedEvents,
flushAll,
getBucketForDistinctId
};
};
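The bucket assignment above hashes the distinct ID with SHA-256 and takes the first eight hex characters modulo the bucket count, so a given ID always lands in the same of the 30 buckets; a standalone sketch of the same idea, not tied to the service's internals, is:

import { createHash } from "crypto";

// Sketch: deterministic bucket assignment mirroring the approach of getBucketForDistinctId.
const BUCKET_COUNT = 30;

const bucketFor = (distinctId: string): string => {
  const hash = createHash("sha256").update(distinctId).digest("hex");
  const index = parseInt(hash.substring(0, 8), 16) % BUCKET_COUNT;
  return `bucket-${index.toString().padStart(2, "0")}`;
};

// The same distinct ID always maps to the same bucket, spreading keys evenly for batch processing.
console.log(bucketFor("org-user-123")); // e.g. "bucket-17"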

View File

@@ -40,6 +40,9 @@ require (
golang.org/x/term v0.30.0
gopkg.in/yaml.v2 v2.4.0
gopkg.in/yaml.v3 v3.0.1
k8s.io/api v0.31.4
k8s.io/apimachinery v0.31.4
k8s.io/client-go v0.31.4
)
require (
@@ -70,16 +73,25 @@ require (
github.com/danieljoos/wincred v1.2.0 // indirect
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc // indirect
github.com/dvsekhvalnov/jose2go v1.6.0 // indirect
github.com/emicklei/go-restful/v3 v3.11.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/fsnotify/fsnotify v1.4.9 // indirect
github.com/fxamacker/cbor/v2 v2.7.0 // indirect
github.com/go-logr/logr v1.4.2 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/go-openapi/errors v0.20.2 // indirect
github.com/go-openapi/jsonpointer v0.21.0 // indirect
github.com/go-openapi/jsonreference v0.20.2 // indirect
github.com/go-openapi/strfmt v0.21.3 // indirect
github.com/go-openapi/swag v0.23.0 // indirect
github.com/go-task/slim-sprig/v3 v3.0.0 // indirect
github.com/godbus/dbus/v5 v5.1.0 // indirect
github.com/gogo/protobuf v1.3.2 // indirect
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
github.com/golang/protobuf v1.5.4 // indirect
github.com/google/gnostic-models v0.6.9 // indirect
github.com/google/go-cmp v0.7.0 // indirect
github.com/google/gofuzz v1.2.0 // indirect
github.com/google/pprof v0.0.0-20250302191652-9094ed2288e7 // indirect
github.com/google/s2a-go v0.1.7 // indirect
github.com/google/uuid v1.6.0 // indirect
@@ -90,17 +102,23 @@ require (
github.com/hashicorp/golang-lru/v2 v2.0.7 // indirect
github.com/hashicorp/hcl v1.0.0 // indirect
github.com/huandu/xstrings v1.5.0 // indirect
github.com/josharian/intern v1.0.0 // indirect
github.com/json-iterator/go v1.1.12 // indirect
github.com/lucasb-eyer/go-colorful v1.2.0 // indirect
github.com/magiconair/properties v1.8.5 // indirect
github.com/mailru/easyjson v0.7.7 // indirect
github.com/mattn/go-colorable v0.1.13 // indirect
github.com/mattn/go-runewidth v0.0.15 // indirect
github.com/mitchellh/copystructure v1.2.0 // indirect
github.com/mitchellh/mapstructure v1.4.1 // indirect
github.com/mitchellh/reflectwalk v1.0.2 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/mtibben/percent v0.2.1 // indirect
github.com/muesli/mango v0.1.0 // indirect
github.com/muesli/mango-pflag v0.1.0 // indirect
github.com/muesli/termenv v0.15.2 // indirect
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/oklog/ulid v1.3.1 // indirect
github.com/onsi/ginkgo/v2 v2.22.2 // indirect
github.com/pelletier/go-toml v1.9.3 // indirect
@@ -117,6 +135,7 @@ require (
github.com/tetratelabs/wazero v1.9.0 // indirect
github.com/wasilibs/wazero-helpers v0.0.0-20240620070341-3dff1577cd52 // indirect
github.com/wlynxg/anet v0.0.5 // indirect
github.com/x448/float16 v0.8.4 // indirect
github.com/xtgo/uuid v0.0.0-20140804021211-a0b114877d4c // indirect
go.mongodb.org/mongo-driver v1.10.0 // indirect
go.opencensus.io v0.24.0 // indirect
@@ -127,18 +146,26 @@ require (
go.opentelemetry.io/otel/trace v1.24.0 // indirect
go.uber.org/mock v0.5.0 // indirect
golang.org/x/mod v0.23.0 // indirect
golang.org/x/net v0.38.0 // indirect
golang.org/x/oauth2 v0.27.0 // indirect
golang.org/x/sync v0.12.0 // indirect
golang.org/x/text v0.23.0 // indirect
golang.org/x/time v0.9.0 // indirect
golang.org/x/tools v0.30.0 // indirect
google.golang.org/api v0.188.0 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20240701130421-f6361c86f094 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20240708141625-4ad9e859172b // indirect
google.golang.org/grpc v1.64.1 // indirect
google.golang.org/protobuf v1.36.5 // indirect
gopkg.in/inf.v0 v0.9.1 // indirect
gopkg.in/ini.v1 v1.62.0 // indirect
k8s.io/klog/v2 v2.130.1 // indirect
k8s.io/kube-openapi v0.0.0-20250318190949-c8a335a9a2ff // indirect
k8s.io/utils v0.0.0-20241104100929-3ea5e8cea738 // indirect
sigs.k8s.io/json v0.0.0-20241010143419-9aa6b5e7a4b3 // indirect
sigs.k8s.io/randfill v1.0.0 // indirect
sigs.k8s.io/structured-merge-diff/v4 v4.6.0 // indirect
sigs.k8s.io/yaml v1.4.0 // indirect
)
require (

View File

@@ -134,6 +134,8 @@ github.com/denisbrodbeck/machineid v1.0.1 h1:geKr9qtkB876mXguW2X6TU4ZynleN6ezuMS
github.com/denisbrodbeck/machineid v1.0.1/go.mod h1:dJUwb7PTidGDeYyUBmXZ2GphQBbjJCrnectwCyxcUSI=
github.com/dvsekhvalnov/jose2go v1.6.0 h1:Y9gnSnP4qEI0+/uQkHvFXeD2PLPJeXEL+ySMEA2EjTY=
github.com/dvsekhvalnov/jose2go v1.6.0/go.mod h1:QsHjhyTlD/lAVqn/NSbVZmSCGeDehTB/mPZadG+mhXU=
github.com/emicklei/go-restful/v3 v3.11.0 h1:rAQeMHw1c7zTmncogyy8VvRZwtkmkZ4FxERmMY4rD+g=
github.com/emicklei/go-restful/v3 v3.11.0/go.mod h1:6n3XBCmQQb25CM2LCACGz8ukIrRry+4bhvbpWn3mrbc=
github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98=
@@ -152,6 +154,8 @@ github.com/frankban/quicktest v1.14.6 h1:7Xjx+VpznH+oBnejlPUj8oUpdxnVs4f8XU8WnHk
github.com/frankban/quicktest v1.14.6/go.mod h1:4ptaffx2x8+WTWXmUCuVU6aPUX1/Mz7zb5vbUoiM6w0=
github.com/fsnotify/fsnotify v1.4.9 h1:hsms1Qyu0jgnwNXIxa+/V/PDsU6CfLf6CNO8H7IWoS4=
github.com/fsnotify/fsnotify v1.4.9/go.mod h1:znqG4EE+3YCdAaPaxE2ZRY/06pZUdp0tY4IgpuI1SZQ=
github.com/fxamacker/cbor/v2 v2.7.0 h1:iM5WgngdRBanHcxugY4JySA0nk1wZorNOpTgCMedv5E=
github.com/fxamacker/cbor/v2 v2.7.0/go.mod h1:pxXPTn3joSm21Gbwsv0w9OSA2y1HFR9qXEeXQVeNoDQ=
github.com/ghodss/yaml v1.0.0/go.mod h1:4dBDuWmgqj2HViK6kFavaiC9ZROes6MMH2rRYeMEF04=
github.com/gitleaks/go-gitdiff v0.9.1 h1:ni6z6/3i9ODT685OLCTf+s/ERlWUNWQF4x1pvoNICw0=
github.com/gitleaks/go-gitdiff v0.9.1/go.mod h1:pKz0X4YzCKZs30BL+weqBIG7mx0jl4tF1uXV9ZyNvrA=
@@ -165,8 +169,16 @@ github.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=
github.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=
github.com/go-openapi/errors v0.20.2 h1:dxy7PGTqEh94zj2E3h1cUmQQWiM1+aeCROfAr02EmK8=
github.com/go-openapi/errors v0.20.2/go.mod h1:cM//ZKUKyO06HSwqAelJ5NsEMMcpa6VpXe8DOa1Mi1M=
github.com/go-openapi/jsonpointer v0.19.6/go.mod h1:osyAmYz/mB/C3I+WsTTSgw1ONzaLJoLCyoi6/zppojs=
github.com/go-openapi/jsonpointer v0.21.0 h1:YgdVicSA9vH5RiHs9TZW5oyafXZFc6+2Vc1rr/O9oNQ=
github.com/go-openapi/jsonpointer v0.21.0/go.mod h1:IUyH9l/+uyhIYQ/PXVA41Rexl+kOkAPDdXEYns6fzUY=
github.com/go-openapi/jsonreference v0.20.2 h1:3sVjiK66+uXK/6oQ8xgcRKcFgQ5KXa2KvnJRumpMGbE=
github.com/go-openapi/jsonreference v0.20.2/go.mod h1:Bl1zwGIM8/wsvqjsOQLJ/SH+En5Ap4rVB5KVcIDZG2k=
github.com/go-openapi/strfmt v0.21.3 h1:xwhj5X6CjXEZZHMWy1zKJxvW9AfHC9pkyUjLvHtKG7o=
github.com/go-openapi/strfmt v0.21.3/go.mod h1:k+RzNO0Da+k3FrrynSNN8F7n/peCmQQqbbXjtDfvmGg=
github.com/go-openapi/swag v0.22.3/go.mod h1:UzaqsxGiab7freDnrUUra0MwWfN/q7tE4j+VcZ0yl14=
github.com/go-openapi/swag v0.23.0 h1:vsEVJDUo2hPJ2tu0/Xc+4noaxyEffXNIs3cOULZ+GrE=
github.com/go-openapi/swag v0.23.0/go.mod h1:esZ8ITTYEsH1V2trKHjAN8Ai7xHb8RV+YSZ577vPjgQ=
github.com/go-resty/resty/v2 v2.16.5 h1:hBKqmWrr7uRc3euHVqmh1HTHcKn99Smr7o5spptdhTM=
github.com/go-resty/resty/v2 v2.16.5/go.mod h1:hkJtXbA2iKHzJheXYvQ8snQES5ZLGKMwQ07xAwp/fiA=
github.com/go-task/slim-sprig/v3 v3.0.0 h1:sUs3vkvUymDpBKi3qH1YSqBQk9+9D/8M2mN1vB6EwHI=
@@ -174,6 +186,7 @@ github.com/go-task/slim-sprig/v3 v3.0.0/go.mod h1:W848ghGpv3Qj3dhTPRyJypKRiqCdHZ
github.com/godbus/dbus/v5 v5.0.4/go.mod h1:xhWf0FNVPg57R7Z0UbKHbJfkEywrmjJnf7w5xrFpKfA=
github.com/godbus/dbus/v5 v5.1.0 h1:4KLkAxT3aOY8Li4FRJe/KvhoNFFxo0m6fNuFUO8QJUk=
github.com/godbus/dbus/v5 v5.1.0/go.mod h1:xhWf0FNVPg57R7Z0UbKHbJfkEywrmjJnf7w5xrFpKfA=
github.com/gogo/protobuf v1.3.2 h1:Ov1cvc58UF3b5XjBnZv7+opcTcQFZebYjWzi34vdm4Q=
github.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q=
github.com/golang/glog v0.0.0-20160126235308-23def4e6c14b/go.mod h1:SBH7ygxi8pfUlaOkMMuAQtPIUF8ecWP5IEl/CR7VP2Q=
github.com/golang/groupcache v0.0.0-20190702054246-869f871628b6/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
@@ -211,6 +224,8 @@ github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6
github.com/golang/snappy v0.0.1/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/google/btree v0.0.0-20180813153112-4030bb1f1f0c/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/btree v1.0.0/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/gnostic-models v0.6.9 h1:MU/8wDLif2qCXZmzncUQ/BOfxWfthHi63KqpoNbWqVw=
github.com/google/gnostic-models v0.6.9/go.mod h1:CiWsm0s6BSQd1hRn8/QmxqB6BesYcbSZxsz9b0KuDBw=
github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=
github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
@@ -222,9 +237,12 @@ github.com/google/go-cmp v0.5.2/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/
github.com/google/go-cmp v0.5.3/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.4/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/gofuzz v1.2.0 h1:xRy4A+RhZaiKjJ1bPfwQ8sedCA+YS2YcCHW6ec7JMi0=
github.com/google/gofuzz v1.2.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/martian v2.1.0+incompatible/go.mod h1:9I4somxYTbIHy5NJKHRl3wXiIaQGbYVAs8BPL6v8lEs=
github.com/google/martian/v3 v3.0.0/go.mod h1:y5Zk1BBys9G+gd6Jrk0W3cC1+ELVxBWuIGO+w/tUAp0=
github.com/google/martian/v3 v3.1.0/go.mod h1:y5Zk1BBys9G+gd6Jrk0W3cC1+ELVxBWuIGO+w/tUAp0=
@@ -298,7 +316,11 @@ github.com/infisical/infisical-kmip v0.3.5 h1:QM3s0e18B+mYv3a9HQNjNAlbwZJBzXq5BA
github.com/infisical/infisical-kmip v0.3.5/go.mod h1:bO1M4YtKyutNg1bREPmlyZspC5duSR7hyQ3lPmLzrIs=
github.com/jedib0t/go-pretty v4.3.0+incompatible h1:CGs8AVhEKg/n9YbUenWmNStRW2PHJzaeDodcfvRAbIo=
github.com/jedib0t/go-pretty v4.3.0+incompatible/go.mod h1:XemHduiw8R651AF9Pt4FwCTKeG3oo7hrHJAoznj9nag=
github.com/josharian/intern v1.0.0 h1:vlS4z54oSdjm0bgjRigI+G1HpF+tI+9rE5LLzOg8HmY=
github.com/josharian/intern v1.0.0/go.mod h1:5DoeVV0s6jJacbCEi61lwdGj/aVlrQvzHFFd8Hwg//Y=
github.com/json-iterator/go v1.1.11/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.12 h1:PV8peI4a0ysnczrg+LtxykD8LfKY9ML6u2jnxaEnrnM=
github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo=
github.com/jstemmer/go-junit-report v0.0.0-20190106144839-af01ea7f8024/go.mod h1:6v2b51hI/fHJwM22ozAgKL4VKDeJcHhJFhtBdhmNjmU=
github.com/jstemmer/go-junit-report v0.9.1/go.mod h1:Brl9GWCQeLvo8nXZwPNNblvFj/XSXhF0NWZEnDohbsk=
github.com/jtolds/gls v4.20.0+incompatible h1:xdiiI2gbIgH/gLH7ADydsJ1uDOEzR8yvV7C0MuV77Wo=
@@ -308,6 +330,7 @@ github.com/kisielk/gotool v1.0.0/go.mod h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+o
github.com/klauspost/compress v1.13.6/go.mod h1:/3/Vjq9QcHkK5uEr5lBEmyoZ1iFhe47etQ6QUkpK6sk=
github.com/kr/fs v0.1.0/go.mod h1:FFnZGqtBN9Gxj7eW1uZ42v5BccTP0vu6NEaFoC2HwRg=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pretty v0.2.1/go.mod h1:ipq/a2n7PKx3OHsz4KJII5eveXtPO4qwEXGdVfWzfnI=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
@@ -318,6 +341,8 @@ github.com/lucasb-eyer/go-colorful v1.2.0 h1:1nnpGOrhyZZuNyfu1QjKiUICQ74+3FNCN69
github.com/lucasb-eyer/go-colorful v1.2.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=
github.com/magiconair/properties v1.8.5 h1:b6kJs+EmPFMYGkow9GiUyCyOvIwYetYJ3fSaWak/Gls=
github.com/magiconair/properties v1.8.5/go.mod h1:y3VJvCyxH9uVvJTWEGAELF3aiYNyPKd5NZ3oSwXrF60=
github.com/mailru/easyjson v0.7.7 h1:UGYAvKxe3sBsEDzO8ZeWOSlIQfWFlxbzLZe7hwFURr0=
github.com/mailru/easyjson v0.7.7/go.mod h1:xzfreul335JAWq5oZzymOObrkdz5UnU4kGfJJLY9Nlc=
github.com/manifoldco/promptui v0.9.0 h1:3V4HzJk1TtXW1MTZMP7mdlwbBpIinw3HztaIlYthEiA=
github.com/manifoldco/promptui v0.9.0/go.mod h1:ka04sppxSGFAtxX0qhlYQjISsg9mR4GWtQEhdbn6Pgg=
github.com/mattn/go-colorable v0.0.9/go.mod h1:9vuHe8Xs5qXnSaW/c/ABM9alt+Vo+STaOChaDxuIBZU=
@@ -346,8 +371,12 @@ github.com/mitchellh/mapstructure v1.4.1/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RR
github.com/mitchellh/reflectwalk v1.0.2 h1:G2LzWKi524PWgd3mLHV8Y5k7s6XUvT0Gef6zxSIeXaQ=
github.com/mitchellh/reflectwalk v1.0.2/go.mod h1:mSTlrgnPZtwu0c4WaC2kGObEpuNDbx0jmZXqmk4esnw=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v0.0.0-20180701023420-4b7aa43c6742/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/modern-go/reflect2 v1.0.1/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/montanaflynn/stats v0.0.0-20171201202039-1bf9dbcd8cbe/go.mod h1:wL8QJuTMNUDYhXwkmfOly8iTdp5TEcJFWZD2D7SIkUc=
github.com/mtibben/percent v0.2.1 h1:5gssi8Nqo8QU/r2pynCm+hBQHpkB/uNK7BJCFogWdzs=
github.com/mtibben/percent v0.2.1/go.mod h1:KG9uO+SZkUp+VkRHsCdYQV3XSZrrSpR3O9ibNBTZrns=
@@ -365,7 +394,8 @@ github.com/muesli/roff v0.1.0 h1:YD0lalCotmYuF5HhZliKWlIx7IEhiXeSfq7hNjFqGF8=
github.com/muesli/roff v0.1.0/go.mod h1:pjAHQM9hdUUwm/krAfrLGgJkXJ+YuhtsfZ42kieB2Ig=
github.com/muesli/termenv v0.15.2 h1:GohcuySI0QmI3wN8Ok9PtKGkgkFIk7y6Vpb5PvrY+Wo=
github.com/muesli/termenv v0.15.2/go.mod h1:Epx+iuz8sNs7mNKhxzH4fWXGNpZwUaJKRS1noLXviQ8=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e h1:fD57ERR4JtEqsWbfPhv4DMiApHyliiK5xCTNVSPiaAs=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno=
github.com/oklog/ulid v1.3.1 h1:EGfNDEx6MqHz8B3uNV6QAib1UR2Lm97sHi3ocA6ESJ4=
github.com/oklog/ulid v1.3.1/go.mod h1:CirwcVhetQ6Lv90oh/F+FBtV6XMibvdAFo93nm5qn4U=
@@ -406,8 +436,8 @@ github.com/rivo/uniseg v0.2.0 h1:S1pD9weZBuJdFmowNwbpi7BJ8TNftyUImj/0WQi72jY=
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rogpeppe/fastuuid v1.2.0/go.mod h1:jVj6XXZzXRy/MSR5jhDC/2q6DgLz+nrA6LYCDYWNEvQ=
github.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.9.0 h1:73kH8U+JUqXU8lRuOHeVHaa/SZPifC7BkcraZVejAe8=
github.com/rogpeppe/go-internal v1.9.0/go.mod h1:WtVeX8xhTBvf0smdhujwtBcq4Qrzq/fJaraNFVN+nFs=
github.com/rogpeppe/go-internal v1.12.0 h1:exVL4IDcn6na9z1rAb56Vxr+CgyK3nn3O+epU5NdKM8=
github.com/rogpeppe/go-internal v1.12.0/go.mod h1:E+RYuTGaKKdloAfM02xzb0FW3Paa99yedzYV+kq4uf4=
github.com/rs/cors v1.11.0 h1:0B9GE/r9Bc2UxRMMtymBkHTenPkHDv0CW4Y98GBY+po=
github.com/rs/cors v1.11.0/go.mod h1:XyqrcTp5zjWr1wsJ8PIRZssZ8b/WMcMf71DJnit4EMU=
github.com/rs/xid v1.3.0/go.mod h1:trrq9SKmegXys3aeAKXMUTdJsYXVwGY3RLcfgqegfbg=
@@ -467,6 +497,8 @@ github.com/wasilibs/wazero-helpers v0.0.0-20240620070341-3dff1577cd52 h1:OvLBa8S
github.com/wasilibs/wazero-helpers v0.0.0-20240620070341-3dff1577cd52/go.mod h1:jMeV4Vpbi8osrE/pKUxRZkVaA0EX7NZN0A9/oRzgpgY=
github.com/wlynxg/anet v0.0.5 h1:J3VJGi1gvo0JwZ/P1/Yc/8p63SoW98B5dHkYDmpgvvU=
github.com/wlynxg/anet v0.0.5/go.mod h1:eay5PRQr7fIVAMbTbchTnO9gG65Hg/uYGdc7mguHxoA=
github.com/x448/float16 v0.8.4 h1:qLwI1I70+NjRFUR3zs1JPUCgaCXSh3SW62uAKT1mSBM=
github.com/x448/float16 v0.8.4/go.mod h1:14CWIYCyZA/cWjXOioeEpHeN/83MdbZDRQHoFcYsOfg=
github.com/xdg-go/pbkdf2 v1.0.0/go.mod h1:jrpuAogTd400dnrH08LKmI/xc1MbPOebTwRqcT5RDeI=
github.com/xdg-go/scram v1.1.1/go.mod h1:RaEWvsqvNKKvBPvcKeFjrG2cJqOkHTiyTpzz23ni57g=
github.com/xdg-go/stringprep v1.0.3/go.mod h1:W3f5j4i+9rC0kuIEJL0ky1VpHXQU3ocBgklLGvcBnW8=
@@ -596,8 +628,8 @@ golang.org/x/net v0.0.0-20210316092652-d523dce5a7f4/go.mod h1:RBQZq4jEuRlivfhVLd
golang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=
golang.org/x/net v0.0.0-20210805182204-aaa1db679c0d/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.35.0 h1:T5GQRQb2y08kTAByq9L4/bz8cipCdA8FbRTXewonqY8=
golang.org/x/net v0.35.0/go.mod h1:EglIi67kWsHKlRzzVMUD93VMSWGFOMSZgxFjparz1Qk=
golang.org/x/net v0.38.0 h1:vRMAPTMaeGqVhG5QyLJHqNDwecKTomGeqbnfZyKlBI8=
golang.org/x/net v0.38.0/go.mod h1:ivrbrMbzFq5J41QOQh0siUuly180yBYtLp+CKbEaFx8=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20190226205417-e64efc72b421/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
@@ -610,8 +642,8 @@ golang.org/x/oauth2 v0.0.0-20210218202405-ba52d332ba99/go.mod h1:KelEdhl1UZF7XfJ
golang.org/x/oauth2 v0.0.0-20210220000619-9bb904979d93/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/oauth2 v0.0.0-20210313182246-cd4f82c27b84/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/oauth2 v0.0.0-20210402161424-2e8d93401602/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/oauth2 v0.21.0 h1:tsimM75w1tF/uws5rbeHzIWxEqElMehnc+iW793zsZs=
golang.org/x/oauth2 v0.21.0/go.mod h1:XYTD2NtWslqkgxebSiOHnXEap4TF09sJSc7H1sXbhtI=
golang.org/x/oauth2 v0.27.0 h1:da9Vo7/tDv5RH/7nZDz1eMGS/q1Vv1N/7FCrBhI9I3M=
golang.org/x/oauth2 v0.27.0/go.mod h1:onh5ek6nERTohokkhCD/y2cV4Do3fxFHFuAejCkRWT8=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@@ -693,8 +725,8 @@ golang.org/x/text v0.23.0/go.mod h1:/BLNzu4aZCJ1+kcD0DNRotWKage4q2rGVAg4o22unh4=
golang.org/x/time v0.0.0-20181108054448-85acf8d2951c/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.0.0-20190308202827-9d24e82272b4/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.0.0-20191024005414-555d28b269f0/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.6.0 h1:eTDhh4ZXt5Qf0augr54TN6suAUudPcawVZeIAPU7D4U=
golang.org/x/time v0.6.0/go.mod h1:3BpzKBy/shNhVucY/MWOyx10tF3SFh9QdLuxbVysPQM=
golang.org/x/time v0.9.0 h1:EsRrnYcQiGH+5FfbgvV4AP7qEZstoyrHB0DzarOQ4ZY=
golang.org/x/time v0.9.0/go.mod h1:3BpzKBy/shNhVucY/MWOyx10tF3SFh9QdLuxbVysPQM=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190226205152-f727befe758c/go.mod h1:9Yl7xja0Znq3iFh3HoIrodX9oNMXvdceNzlUR8zjMvY=
@@ -863,14 +895,17 @@ google.golang.org/protobuf v1.24.0/go.mod h1:r/3tXBNzIEhYS9I1OUVjXDlt8tc493IdKGj
google.golang.org/protobuf v1.25.0/go.mod h1:9JNX74DMeImyA3h4bdi1ymwjUzf21/xIlbajtzgsN7c=
google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
google.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
google.golang.org/protobuf v1.36.1 h1:yBPeRvTftaleIgM3PZ/WBIZ7XM/eEYAaEyCwvyjq/gk=
google.golang.org/protobuf v1.36.1/go.mod h1:9fA7Ob0pmnwhb644+1+CVWFRbNajQ6iRojtC/QF5bRE=
google.golang.org/protobuf v1.36.5 h1:tPhr+woSbjfYvY6/GPufUoYizxw1cF/yFoxJ2fmpwlM=
google.golang.org/protobuf v1.36.5/go.mod h1:9fA7Ob0pmnwhb644+1+CVWFRbNajQ6iRojtC/QF5bRE=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20200902074654-038fdea0a05b h1:QRR6H1YWRnHb4Y/HeNFCTJLFVxaq6wH4YuVdsUOr75U=
gopkg.in/check.v1 v1.0.0-20200902074654-038fdea0a05b/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=
gopkg.in/inf.v0 v0.9.1 h1:73M5CoZyi3ZLMOyDlQh031Cx6N9NDJ2Vvfl76EDAgDc=
gopkg.in/inf.v0 v0.9.1/go.mod h1:cWUDdTG/fYaXco+Dcufb5Vnc6Gp2YChqWtbxRZE0mXw=
gopkg.in/ini.v1 v1.62.0 h1:duBzk771uxoUuOlyRLkHsygud9+5lrlGjdFBb4mSKDU=
gopkg.in/ini.v1 v1.62.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
@@ -890,6 +925,27 @@ honnef.co/go/tools v0.0.0-20190523083050-ea95bdfd59fc/go.mod h1:rf3lG4BRIbNafJWh
honnef.co/go/tools v0.0.1-2019.2.3/go.mod h1:a3bituU0lyd329TUQxRnasdCoJDkEUEAqEt0JzvZhAg=
honnef.co/go/tools v0.0.1-2020.1.3/go.mod h1:X/FiERA/W4tHapMX5mGpAtMSVEeEUOyHaw9vFzvIQ3k=
honnef.co/go/tools v0.0.1-2020.1.4/go.mod h1:X/FiERA/W4tHapMX5mGpAtMSVEeEUOyHaw9vFzvIQ3k=
k8s.io/api v0.31.4 h1:I2QNzitPVsPeLQvexMEsj945QumYraqv9m74isPDKhM=
k8s.io/api v0.31.4/go.mod h1:d+7vgXLvmcdT1BCo79VEgJxHHryww3V5np2OYTr6jdw=
k8s.io/apimachinery v0.31.4 h1:8xjE2C4CzhYVm9DGf60yohpNUh5AEBnPxCryPBECmlM=
k8s.io/apimachinery v0.31.4/go.mod h1:rsPdaZJfTfLsNJSQzNHQvYoTmxhoOEofxtOsF3rtsMo=
k8s.io/client-go v0.31.4 h1:t4QEXt4jgHIkKKlx06+W3+1JOwAFU/2OPiOo7H92eRQ=
k8s.io/client-go v0.31.4/go.mod h1:kvuMro4sFYIa8sulL5Gi5GFqUPvfH2O/dXuKstbaaeg=
k8s.io/klog/v2 v2.130.1 h1:n9Xl7H1Xvksem4KFG4PYbdQCQxqc/tTUyrgXaOhHSzk=
k8s.io/klog/v2 v2.130.1/go.mod h1:3Jpz1GvMt720eyJH1ckRHK1EDfpxISzJ7I9OYgaDtPE=
k8s.io/kube-openapi v0.0.0-20250318190949-c8a335a9a2ff h1:/usPimJzUKKu+m+TE36gUyGcf03XZEP0ZIKgKj35LS4=
k8s.io/kube-openapi v0.0.0-20250318190949-c8a335a9a2ff/go.mod h1:5jIi+8yX4RIb8wk3XwBo5Pq2ccx4FP10ohkbSKCZoK8=
k8s.io/utils v0.0.0-20241104100929-3ea5e8cea738 h1:M3sRQVHv7vB20Xc2ybTt7ODCeFj6JSWYFzOFnYeS6Ro=
k8s.io/utils v0.0.0-20241104100929-3ea5e8cea738/go.mod h1:OLgZIPagt7ERELqWJFomSt595RzquPNLL48iOWgYOg0=
rsc.io/binaryregexp v0.2.0/go.mod h1:qTv7/COck+e2FymRvadv62gMdZztPaShugOCi3I+8D8=
rsc.io/quote/v3 v3.1.0/go.mod h1:yEA65RcK8LyAZtP9Kv3t0HmxON59tX3rD+tICJqUlj0=
rsc.io/sampler v1.3.0/go.mod h1:T1hPZKmBbMNahiBKFy5HrXp6adAjACjK9JXDnKaTXpA=
sigs.k8s.io/json v0.0.0-20241010143419-9aa6b5e7a4b3 h1:/Rv+M11QRah1itp8VhT6HoVx1Ray9eB4DBr+K+/sCJ8=
sigs.k8s.io/json v0.0.0-20241010143419-9aa6b5e7a4b3/go.mod h1:18nIHnGi6636UCz6m8i4DhaJ65T6EruyzmoQqI2BVDo=
sigs.k8s.io/randfill v0.0.0-20250304075658-069ef1bbf016/go.mod h1:XeLlZ/jmk4i1HRopwe7/aU3H5n1zNUcX6TM94b3QxOY=
sigs.k8s.io/randfill v1.0.0 h1:JfjMILfT8A6RbawdsK2JXGBR5AQVfd+9TbzrlneTyrU=
sigs.k8s.io/randfill v1.0.0/go.mod h1:XeLlZ/jmk4i1HRopwe7/aU3H5n1zNUcX6TM94b3QxOY=
sigs.k8s.io/structured-merge-diff/v4 v4.6.0 h1:IUA9nvMmnKWcj5jl84xn+T5MnlZKThmUW1TdblaLVAc=
sigs.k8s.io/structured-merge-diff/v4 v4.6.0/go.mod h1:dDy58f92j70zLsuZVuUX5Wp9vtxXpaZnkPGWeqDfCps=
sigs.k8s.io/yaml v1.4.0 h1:Mk1wCc2gy/F0THH0TAp1QYyJNzRm2KCLy3o5ASXVI5E=
sigs.k8s.io/yaml v1.4.0/go.mod h1:Ejl7/uTz7PSA4eKMyQCUTnhZYNmLIl+5c2lQPGR2BPY=

View File

@@ -631,8 +631,8 @@ func CallGatewayHeartBeatV1(httpClient *resty.Client) error {
return nil
}
func CallBootstrapInstance(httpClient *resty.Client, request BootstrapInstanceRequest) (map[string]interface{}, error) {
var resBody map[string]interface{}
func CallBootstrapInstance(httpClient *resty.Client, request BootstrapInstanceRequest) (BootstrapInstanceResponse, error) {
var resBody BootstrapInstanceResponse
response, err := httpClient.
R().
SetResult(&resBody).
@@ -641,11 +641,11 @@ func CallBootstrapInstance(httpClient *resty.Client, request BootstrapInstanceRe
Post(fmt.Sprintf("%v/v1/admin/bootstrap", request.Domain))
if err != nil {
return nil, NewGenericRequestError(operationCallBootstrapInstance, err)
return BootstrapInstanceResponse{}, NewGenericRequestError(operationCallBootstrapInstance, err)
}
if response.IsError() {
return nil, NewAPIErrorWithResponse(operationCallBootstrapInstance, response, nil)
return BootstrapInstanceResponse{}, NewAPIErrorWithResponse(operationCallBootstrapInstance, response, nil)
}
return resBody, nil

View File

@@ -655,3 +655,35 @@ type BootstrapInstanceRequest struct {
Organization string `json:"organization"`
Domain string `json:"domain"`
}
type BootstrapInstanceResponse struct {
Message string `json:"message"`
Identity BootstrapIdentity `json:"identity"`
Organization BootstrapOrganization `json:"organization"`
User BootstrapUser `json:"user"`
}
type BootstrapIdentity struct {
ID string `json:"id"`
Name string `json:"name"`
Credentials BootstrapIdentityCredentials `json:"credentials"`
}
type BootstrapIdentityCredentials struct {
Token string `json:"token"`
}
type BootstrapOrganization struct {
ID string `json:"id"`
Name string `json:"name"`
Slug string `json:"slug"`
}
type BootstrapUser struct {
ID string `json:"id"`
Email string `json:"email"`
FirstName string `json:"firstName"`
LastName string `json:"lastName"`
Username string `json:"username"`
SuperAdmin bool `json:"superAdmin"`
}
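
For readers skimming the typed-response change, here is a minimal sketch of how a payload with these JSON tags decodes into the new shape. The local struct only mirrors a subset of the fields above so the example compiles on its own, and every value in the sample payload is a placeholder rather than output from a real instance.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Local mirror of part of BootstrapInstanceResponse, defined here only so the
// sketch is self-contained; the real types live in the api package above.
type bootstrapInstanceResponse struct {
	Message  string `json:"message"`
	Identity struct {
		ID          string `json:"id"`
		Name        string `json:"name"`
		Credentials struct {
			Token string `json:"token"`
		} `json:"credentials"`
	} `json:"identity"`
	Organization struct {
		ID   string `json:"id"`
		Name string `json:"name"`
		Slug string `json:"slug"`
	} `json:"organization"`
}

func main() {
	// Hypothetical payload; all values are placeholders, not real credentials.
	payload := []byte(`{
		"message": "placeholder message",
		"identity": {
			"id": "identity-id",
			"name": "instance-admin",
			"credentials": {"token": "machine-identity-token"}
		},
		"organization": {"id": "org-id", "name": "Acme", "slug": "acme"}
	}`)

	var res bootstrapInstanceResponse
	if err := json.Unmarshal(payload, &res); err != nil {
		panic(err)
	}
	fmt.Println(res.Identity.Credentials.Token) // prints: machine-identity-token
}
```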

View File

@@ -4,16 +4,127 @@ Copyright (c) 2023 Infisical Inc.
package cmd
import (
"bytes"
"context"
"encoding/base64"
"encoding/json"
"fmt"
"os"
"text/template"
"github.com/Infisical/infisical-merge/packages/api"
"github.com/Infisical/infisical-merge/packages/util"
"github.com/rs/zerolog/log"
"github.com/spf13/cobra"
corev1 "k8s.io/api/core/v1"
"k8s.io/apimachinery/pkg/api/errors"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/client-go/kubernetes"
"k8s.io/client-go/rest"
)
// handleK8SecretOutput processes the k8-secret output type by creating a Kubernetes secret
func handleK8SecretOutput(bootstrapResponse api.BootstrapInstanceResponse, k8SecretTemplate, k8SecretName, k8SecretNamespace string) error {
// Create in-cluster config
config, err := rest.InClusterConfig()
if err != nil {
return fmt.Errorf("failed to create in-cluster config: %v", err)
}
// Create Kubernetes client
clientset, err := kubernetes.NewForConfig(config)
if err != nil {
return fmt.Errorf("failed to create Kubernetes client: %v", err)
}
// Parse and execute the template to render the data/stringData section
tmpl, err := template.New("k8-secret-template").Funcs(template.FuncMap{
"encodeBase64": func(s string) string {
return base64.StdEncoding.EncodeToString([]byte(s))
},
}).Parse(k8SecretTemplate)
if err != nil {
return fmt.Errorf("failed to parse output template: %v", err)
}
var renderedDataSection bytes.Buffer
err = tmpl.Execute(&renderedDataSection, bootstrapResponse)
if err != nil {
return fmt.Errorf("failed to execute output template: %v", err)
}
// Parse the rendered template as JSON to validate it's valid
var dataSection map[string]interface{}
if err := json.Unmarshal(renderedDataSection.Bytes(), &dataSection); err != nil {
return fmt.Errorf("template output is not valid JSON: %v", err)
}
// Prepare the secret data and stringData maps
secretData := make(map[string][]byte)
secretStringData := make(map[string]string)
// Process the dataSection to separate data and stringData
if data, exists := dataSection["data"]; exists {
if dataMap, ok := data.(map[string]interface{}); ok {
for key, value := range dataMap {
if strValue, ok := value.(string); ok {
secretData[key] = []byte(strValue)
}
}
}
}
if stringData, exists := dataSection["stringData"]; exists {
if stringDataMap, ok := stringData.(map[string]interface{}); ok {
for key, value := range stringDataMap {
if strValue, ok := value.(string); ok {
secretStringData[key] = strValue
}
}
}
}
// Create the Kubernetes secret object
k8sSecret := &corev1.Secret{
ObjectMeta: metav1.ObjectMeta{
Name: k8SecretName,
Namespace: k8SecretNamespace,
},
Type: corev1.SecretTypeOpaque,
Data: secretData,
StringData: secretStringData,
}
ctx := context.Background()
secretsClient := clientset.CoreV1().Secrets(k8SecretNamespace)
// Check if secret already exists
existingSecret, err := secretsClient.Get(ctx, k8SecretName, metav1.GetOptions{})
if err != nil {
if errors.IsNotFound(err) {
// Secret doesn't exist, create it
_, err = secretsClient.Create(ctx, k8sSecret, metav1.CreateOptions{})
if err != nil {
return fmt.Errorf("failed to create Kubernetes secret: %v", err)
}
log.Info().Msgf("Successfully created Kubernetes secret '%s' in namespace '%s'", k8SecretName, k8SecretNamespace)
} else {
return fmt.Errorf("failed to check if Kubernetes secret exists: %v", err)
}
} else {
// Secret exists, update it
k8sSecret.ObjectMeta.ResourceVersion = existingSecret.ObjectMeta.ResourceVersion
_, err = secretsClient.Update(ctx, k8sSecret, metav1.UpdateOptions{})
if err != nil {
return fmt.Errorf("failed to update Kubernetes secret: %v", err)
}
log.Info().Msgf("Successfully updated Kubernetes secret '%s' in namespace '%s'", k8SecretName, k8SecretNamespace)
}
return nil
}
var bootstrapCmd = &cobra.Command{
Use: "bootstrap",
Short: "Used to bootstrap your Infisical instance",
@@ -23,7 +134,7 @@ var bootstrapCmd = &cobra.Command{
Run: func(cmd *cobra.Command, args []string) {
email, _ := cmd.Flags().GetString("email")
if email == "" {
if envEmail, ok := os.LookupEnv("INFISICAL_ADMIN_EMAIL"); ok {
if envEmail, ok := os.LookupEnv(util.INFISICAL_BOOTSTRAP_EMAIL_NAME); ok {
email = envEmail
}
}
@@ -35,7 +146,7 @@ var bootstrapCmd = &cobra.Command{
password, _ := cmd.Flags().GetString("password")
if password == "" {
if envPassword, ok := os.LookupEnv("INFISICAL_ADMIN_PASSWORD"); ok {
if envPassword, ok := os.LookupEnv(util.INFISICAL_BOOTSTRAP_PASSWORD_NAME); ok {
password = envPassword
}
}
@@ -47,7 +158,7 @@ var bootstrapCmd = &cobra.Command{
organization, _ := cmd.Flags().GetString("organization")
if organization == "" {
if envOrganization, ok := os.LookupEnv("INFISICAL_ADMIN_ORGANIZATION"); ok {
if envOrganization, ok := os.LookupEnv(util.INFISICAL_BOOTSTRAP_ORGANIZATION_NAME); ok {
organization = envOrganization
}
}
@@ -69,11 +180,56 @@ var bootstrapCmd = &cobra.Command{
return
}
outputType, err := cmd.Flags().GetString("output")
if err != nil {
log.Error().Msgf("Failed to get output type: %v", err)
return
}
k8SecretTemplate, err := cmd.Flags().GetString("k8-secret-template")
if err != nil {
log.Error().Msgf("Failed to get k8-secret-template: %v", err)
}
k8SecretName, err := cmd.Flags().GetString("k8-secret-name")
if err != nil {
log.Error().Msgf("Failed to get k8-secret-name: %v", err)
}
k8SecretNamespace, err := cmd.Flags().GetString("k8-secret-namespace")
if err != nil {
log.Error().Msgf("Failed to get k8-secret-namespace: %v", err)
}
if outputType == "k8-secret" {
if k8SecretTemplate == "" {
log.Error().Msg("k8-secret-template is required when using k8-secret output type")
return
}
if k8SecretName == "" {
log.Error().Msg("k8-secret-name is required when using k8-secret output type")
return
}
if k8SecretNamespace == "" {
log.Error().Msg("k8-secret-namespace is required when using k8-secret output type")
return
}
}
httpClient, err := util.GetRestyClientWithCustomHeaders()
if err != nil {
log.Error().Msgf("Failed to get resty client with custom headers: %v", err)
return
}
ignoreIfBootstrapped, err := cmd.Flags().GetBool("ignore-if-bootstrapped")
if err != nil {
log.Error().Msgf("Failed to get ignore-if-bootstrapped flag: %v", err)
return
}
httpClient.SetHeader("Accept", "application/json")
bootstrapResponse, err := api.CallBootstrapInstance(httpClient, api.BootstrapInstanceRequest{
@@ -84,16 +240,26 @@ var bootstrapCmd = &cobra.Command{
})
if err != nil {
log.Error().Msgf("Failed to bootstrap instance: %v", err)
if !ignoreIfBootstrapped {
log.Error().Msgf("Failed to bootstrap instance: %v", err)
}
return
}
responseJSON, err := json.MarshalIndent(bootstrapResponse, "", " ")
if err != nil {
log.Fatal().Msgf("Failed to convert response to JSON: %v", err)
return
if outputType == "k8-secret" {
if err := handleK8SecretOutput(bootstrapResponse, k8SecretTemplate, k8SecretName, k8SecretNamespace); err != nil {
log.Error().Msgf("Failed to handle k8-secret output: %v", err)
return
}
} else {
responseJSON, err := json.MarshalIndent(bootstrapResponse, "", " ")
if err != nil {
log.Fatal().Msgf("Failed to convert response to JSON: %v", err)
return
}
fmt.Println(string(responseJSON))
}
fmt.Println(string(responseJSON))
},
}
@@ -102,6 +268,10 @@ func init() {
bootstrapCmd.Flags().String("email", "", "The desired email address of the instance admin")
bootstrapCmd.Flags().String("password", "", "The desired password of the instance admin")
bootstrapCmd.Flags().String("organization", "", "The name of the organization to create for the instance")
bootstrapCmd.Flags().String("output", "", "The type of output to use for the bootstrap command (json or k8-secret)")
bootstrapCmd.Flags().Bool("ignore-if-bootstrapped", false, "Whether to continue on error if the instance has already been bootstrapped")
bootstrapCmd.Flags().String("k8-secret-template", "{\"data\":{\"token\":\"{{.Identity.Credentials.Token}}\"}}", "The template to use for rendering the Kubernetes secret (entire secret JSON)")
bootstrapCmd.Flags().String("k8-secret-namespace", "", "The namespace to create the Kubernetes secret in")
bootstrapCmd.Flags().String("k8-secret-name", "", "The name of the Kubernetes secret to create")
rootCmd.AddCommand(bootstrapCmd)
}
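
A note on the template path in handleK8SecretOutput above, since it is the step most likely to trip people up: the value of `--k8-secret-template` is rendered against the bootstrap response, must produce valid JSON, and may populate a `data` and/or a `stringData` object; `data` values end up in the Secret as raw bytes, while `stringData` values are passed through as strings. The following standalone sketch reproduces just that rendering-and-validation step with the default template from the flag definition; the token value is a placeholder, and the snippet is an illustration of the flow rather than the CLI code itself.

```go
package main

import (
	"bytes"
	"encoding/base64"
	"encoding/json"
	"fmt"
	"text/template"
)

// Minimal stand-in for the fields the default template dereferences.
type bootstrapResult struct {
	Identity struct {
		Credentials struct {
			Token string
		}
	}
}

func main() {
	// Default value of --k8-secret-template from the flag definition above.
	const tpl = `{"data":{"token":"{{.Identity.Credentials.Token}}"}}`

	var res bootstrapResult
	res.Identity.Credentials.Token = "placeholder-token" // hypothetical value

	// encodeBase64 is registered the same way as in handleK8SecretOutput so
	// custom templates can opt into it; the default template does not use it.
	t := template.Must(template.New("k8-secret-template").Funcs(template.FuncMap{
		"encodeBase64": func(s string) string {
			return base64.StdEncoding.EncodeToString([]byte(s))
		},
	}).Parse(tpl))

	var rendered bytes.Buffer
	if err := t.Execute(&rendered, res); err != nil {
		panic(err)
	}

	// The CLI rejects templates whose output is not valid JSON; mimic that check.
	var dataSection map[string]interface{}
	if err := json.Unmarshal(rendered.Bytes(), &dataSection); err != nil {
		panic(err)
	}
	fmt.Println(rendered.String()) // {"data":{"token":"placeholder-token"}}
}
```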

View File

@@ -10,6 +10,10 @@ const (
INFISICAL_UNIVERSAL_AUTH_ACCESS_TOKEN_NAME = "INFISICAL_UNIVERSAL_AUTH_ACCESS_TOKEN"
INFISICAL_VAULT_FILE_PASSPHRASE_ENV_NAME = "INFISICAL_VAULT_FILE_PASSPHRASE" // This works because we've forked the keyring package and added support for this env variable. This explains why you won't find any occurrences of it in the CLI codebase.
INFISICAL_BOOTSTRAP_EMAIL_NAME = "INFISICAL_ADMIN_EMAIL"
INFISICAL_BOOTSTRAP_PASSWORD_NAME = "INFISICAL_ADMIN_PASSWORD"
INFISICAL_BOOTSTRAP_ORGANIZATION_NAME = "INFISICAL_ADMIN_ORGANIZATION"
VAULT_BACKEND_AUTO_MODE = "auto"
VAULT_BACKEND_FILE_MODE = "file"
@@ -47,6 +51,11 @@ const (
INFISICAL_BACKUP_SECRET = "infisical-backup-secrets" // akhilmhdh: @depreciated remove in version v0.30
INFISICAL_BACKUP_SECRET_ENCRYPTION_KEY = "infisical-backup-secret-encryption-key"
KUBERNETES_SERVICE_HOST_ENV_NAME = "KUBERNETES_SERVICE_HOST"
KUBERNETES_SERVICE_PORT_HTTPS_ENV_NAME = "KUBERNETES_SERVICE_PORT_HTTPS"
KUBERNETES_SERVICE_ACCOUNT_CA_CERT_PATH = "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
KUBERNETES_SERVICE_ACCOUNT_TOKEN_PATH = "/var/run/secrets/kubernetes.io/serviceaccount/token"
)
var (

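The four Kubernetes constants added here are the standard in-cluster environment variable names and service-account mount paths. Purely as an illustration (this diff does not show how the CLI consumes them), a helper along the following lines could use them to check for an in-cluster environment before calling rest.InClusterConfig. The function name and logic are hypothetical, and the sketch assumes it lives in the same util package as the constants above.

```go
package util

import "os"

// isLikelyInCluster is a hypothetical helper: it reports whether the standard
// in-cluster signals referenced by the constants above are present. This is an
// illustration, not code from the Infisical CLI.
func isLikelyInCluster() bool {
	if _, ok := os.LookupEnv(KUBERNETES_SERVICE_HOST_ENV_NAME); !ok {
		return false
	}
	if _, ok := os.LookupEnv(KUBERNETES_SERVICE_PORT_HTTPS_ENV_NAME); !ok {
		return false
	}
	_, err := os.Stat(KUBERNETES_SERVICE_ACCOUNT_TOKEN_PATH)
	return err == nil
}
```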
View File

@@ -11,9 +11,9 @@ Fairly frequently, you might run into situations when you need to spend company
As a perk of working at Infisical, we cover some of your meal expenses.
HQ team members: meals and unlimited snacks are provided on-site at no cost.
**HQ team members**: meals and unlimited snacks are provided **on-site** at no cost.
Remote team members: a food stipend is allocated based on location.
**Remote team members**: a food stipend is allocated based on location.
# Trivial expenses
@@ -27,21 +27,28 @@ This means expenses that are:
Please spend money in a way that you think is in the best interest of the company.
</Note>
## Saving receipts
Make sure you keep copies of all receipts. If you expense something on a company card and cannot provide a receipt, this may be deducted from your pay.
# Travel
You should default to using your company card in all cases - it has no transaction fees. If using your personal card is unavoidable, please reach out to Maidul to get it reimbursed manually.
If you need to travel on Infisical's behalf for in-person onboarding, meeting customers, and offsites, again please spend money in the best interests of the company.
## Training
We do not pre-approve your travel expenses, and we trust team members to make the right decisions here. Some guidance:
- Please find a flight ticket that is reasonably priced. We all travel economy by default; we cannot afford for folks to fly premium or business class. Feel free to upgrade using your personal money/airmiles if you'd like to.
- Feel free to pay for the Uber/subway/bus to and from the airport with your Brex card.
- For business travel, Infisical will cover reasonable expenses for breakfast, lunch, and dinner.
- When traveling internationally, Infisical does not cover roaming charges for your phone. You can expense a reasonable eSIM, which is usually no more than $20.
<Note>
Note that this only applies to business travel. It is not applicable to personal travel or day-to-day commuting.
</Note>
For engineers, you're welcome to take an approved Udemy course. Please reach out to Maidul. For the GTM team, you may buy a book a month if it's relevant to your work.
# Equipment
Infisical is a remote-first company, so we understand the importance of having a comfortable work setup. To support this, we provide allowances for essential office equipment.
### Desk & Chair
### 1. Desk & Chair
Most people already have a comfortable desk and chair, but if you need an upgrade, we offer the following allowances.
While we're not yet able to provide the latest and greatest, we strive to be reasonable given the stage of our company.
@@ -50,10 +57,10 @@ While we're not yet able to provide the latest and greatest, we strive to be rea
**Chair**: $150 USD
### Laptop
### 2. Laptop
Each team member will receive a company-issued MacBook Pro before they start their first day.
### Notes
### 3. Notes
1. All equipment purchased using company allowances remains the property of Infisical.
2. Keep all receipts for equipment purchases and submit them for reimbursement.
@@ -65,6 +72,28 @@ This is because we don't yet have a formal HR department to handle such logistic
For any equipment related questions, please reach out to Maidul.
## Brex
# Brex
We use Brex as our primary credit card provider. Don't have a company card yet? Reach out to Maidul.
### Budgets
You will generally have multiple budgets assigned to you. "General Company Expenses" primarily covers quick SaaS purchases (not food). Remote team members should have a "Lunch Stipend" budget that applies to food.
If your position involves a lot of travel, you may also have a "Travel" budget that applies to expenses related to business travel (note that it cannot be used for transportation or food during personal travel).
### Saving receipts
Make sure you keep copies of all receipts. If you expense something on a company card and cannot provide a receipt, this may be deducted from your pay.
You should default to using your company card in all cases - it has no transaction fees. If using your personal card is unavoidable, please reach out to Maidul to get it reimbursed manually.
### Need a one-off budget increase?
You can do this directly within Brex - just request the amount and duration for the relevant budget in the app, and your hiring manager will automatically be notified for approval.
# Training
For engineers, you're welcome to take an approved Udemy course. Please reach out to Maidul. For the GTM team, you may buy a book a month if it's relevant to your work.

View File

@@ -4,7 +4,7 @@ services:
nginx:
container_name: infisical-dev-nginx
image: nginx
restart: always
restart: "always"
ports:
- 8080:80
- 8443:443

View File

@@ -0,0 +1,4 @@
---
title: "Available"
openapi: "GET /api/v1/app-connections/cloudflare/available"
---

View File

@@ -0,0 +1,10 @@
---
title: "Create"
openapi: "POST /api/v1/app-connections/cloudflare"
---
<Note>
Check out the configuration docs for [Cloudflare
Connections](/integrations/app-connections/cloudflare) to learn how to obtain
the required credentials.
</Note>

View File

@@ -0,0 +1,4 @@
---
title: "Delete"
openapi: "DELETE /api/v1/app-connections/cloudflare/{connectionId}"
---

View File

@@ -0,0 +1,4 @@
---
title: "Get by ID"
openapi: "GET /api/v1/app-connections/cloudflare/{connectionId}"
---

View File

@@ -0,0 +1,4 @@
---
title: "Get by Name"
openapi: "GET /api/v1/app-connections/cloudflare/connection-name/{connectionName}"
---

Some files were not shown because too many files have changed in this diff Show More