mirror of https://github.com/Infisical/infisical.git synced 2025-03-21 17:02:49 +00:00

Compare commits


190 Commits

Author SHA1 Message Date
40e7ab33cb fix: resolved lint issue 2024-05-29 13:35:32 +00:00
75bb651b1d docs: updated the cause of option1 and option2 in role update commit 2024-05-22 00:19:04 +05:30
303edadb1e Merge pull request from Infisical/feat/add-integration-sync-status
feat: added integration sync status
2024-05-22 01:19:36 +08:00
50155a610d Merge pull request from Infisical/misc/address-digital-ocean-env-encryption
misc: made digital ocean envs encrypted by default
2024-05-21 13:15:09 -04:00
c2830a56b6 misc: made digital ocean envs encrypted by default 2024-05-22 01:12:28 +08:00
b9a9b6b4d9 misc: applied ui/ux changes 2024-05-22 00:06:06 +08:00
e7f7f271c8 Merge pull request from Infisical/misc/added-pino-logger-redaction
misc: added logger redaction
2024-05-21 11:49:36 -04:00
b26e96c5a2 misc: added logger redaction 2024-05-21 23:04:11 +08:00
9b404c215b adjustment: ui changes to sync button 2024-05-21 16:04:36 +08:00
d6dae04959 misc: removed unnecessary notification 2024-05-21 14:01:15 +08:00
629bd9b7c6 added support for manual syncing of integrations 2024-05-21 13:56:44 +08:00
3d4aa0fdc9 Merge pull request from Infisical/daniel/jenkins-docs
Docs: Jenkins Plugin
2024-05-20 20:13:42 -07:00
711e30a6be Docs: Plugin installation 2024-05-21 04:58:24 +02:00
7b1462fdee Docs: Updated Jenkins docs to reflect new plugin 2024-05-21 04:56:26 +02:00
50915833ff Images 2024-05-21 04:56:15 +02:00
44e37fd531 update distinct id for service tokens 2024-05-20 19:50:52 -04:00
fa3f957738 count for null actor 2024-05-20 19:26:03 -04:00
224b26ced6 Merge pull request from Infisical/rate-limit-based-on-identity
rate limit based on identity
2024-05-20 19:04:41 -04:00
e833d9e67c revert secret read limit 2024-05-20 19:01:01 -04:00
dc08edb7d2 rate limit based on identity 2024-05-20 18:52:23 -04:00
0b78e30848 Delete mongo infisical helm 2024-05-20 16:27:15 -04:00
9253c69325 misc: finalized ui design of integration sync status 2024-05-21 02:35:23 +08:00
7d3a62cc4c feat: added integration sync status 2024-05-20 20:56:29 +08:00
7e2147f14e Adjust aws iam auth docs 2024-05-19 22:05:38 -07:00
32f39c98a7 Merge pull request from akhilmhdh/feat/membership-by-id
Endpoints for retrieving membership details
2024-05-19 23:51:30 +05:30
ddf6db5a7e small rephrase 2024-05-19 14:19:42 -04:00
554dbf6c23 Merge pull request from Infisical/create-pull-request/patch-1716042374
GH Action: rename new migration file timestamp
2024-05-18 07:33:38 -07:00
d1997f04c0 chore: renamed new migration files to latest timestamp (gh-action) 2024-05-18 14:26:13 +00:00
deefaa0961 Merge pull request from Infisical/k8s-auth
Kubernetes Native Authentication Method
2024-05-18 07:25:52 -07:00
a392c9f022 Move k8s migration to front 2024-05-17 22:41:33 -07:00
34222b83ee review fixes for k8s auth 2024-05-17 21:44:02 -04:00
ef36852a47 Add access token trusted ip support to k8s auth 2024-05-17 15:41:32 -07:00
d79fd826a4 Merge remote-tracking branch 'origin' into k8s-auth 2024-05-17 15:39:52 -07:00
18aaa423a9 Merge pull request from Infisical/patch-gcp-id-token-auth
Patch Identity Access Token Trusted IPs validation for AWS/GCP Auth
2024-05-17 18:38:15 -04:00
32c33eaf6e Patch identity token trusted ips validation for aws/gcp auths 2024-05-17 11:58:08 -07:00
702699b4f0 Update faq.mdx 2024-05-17 12:13:11 -04:00
35ee03d347 Merge pull request from akhilmhdh/fix/validation-permission
feat: added validation for project permission body in identity specific privilege
2024-05-17 11:50:35 -04:00
9c5deee688 feat: added validation for project permission body in identity specific privilege 2024-05-17 21:09:50 +05:30
ce4cb39a2d docs: added doc for new endpoints of getting membership and some title change 2024-05-17 20:49:58 +05:30
84724e5f65 feat: added endpoints to fetch a particular project user membership and identity 2024-05-17 20:45:31 +05:30
56c2e12760 feat: added create identity project membership to api reference and support for roles 2024-05-17 17:09:35 +05:30
21656a7ab6 docs: separate project user and identities api into separate 2024-05-17 16:15:52 +05:30
2ccc77ef40 feat: split project api description for identities and users into separate 2024-05-17 16:15:05 +05:30
1438415d0c Merge pull request from Cristobal-M/feat-support-imports-in-cli-export
feat(cli): support of include-imports in export command
2024-05-17 14:21:34 +05:30
eca0e62764 Merge pull request from akhilmhdh/feat/revoke-access-token
Revoke access token endpoint
2024-05-16 23:41:38 +05:30
e4186f0317 Merge pull request from akhilmhdh/fix/aws-parameter-stoer
fix: get all secrets from aws ssm
2024-05-16 12:27:20 -04:00
704c630797 feat: added rate limit for sync secrets 2024-05-16 21:34:31 +05:30
f398fee2b8 make var readable 2024-05-16 11:43:32 -04:00
7fce51e8c1 fix: get all secrets from aws ssm 2024-05-16 20:51:07 +05:30
a6fe233122 Feat: missing documentation for include-imports in export and run command 2024-05-16 11:44:29 +02:00
5e678b1ad2 Merge pull request from akhilmhdh/fix/create-secret-fail-reference
fix: resolved create secret failing for reference
2024-05-15 22:34:37 -04:00
cf453e87d8 Merge pull request from Infisical/daniel/fix-expansion
Fix: Fix secret expansion II
2024-05-16 08:02:41 +05:30
4af703df5b fix: resolved create secret failing for reference 2024-05-16 07:35:05 +05:30
75b8b521b3 Update secret-service.ts 2024-05-16 03:31:01 +02:00
58c1d3b0ac Merge pull request from Infisical/daniel/fix-secret-expand-with-recursive
Fix: Secret expansion with recursive mode enabled
2024-05-16 02:33:28 +02:00
6b5cafa631 Merge pull request from Infisical/patch-update-project-identity
patch project identity update
2024-05-15 20:23:09 -04:00
4a35623956 remove for of with for await 2024-05-15 20:19:10 -04:00
74fe673724 patch project identity update 2024-05-15 20:12:45 -04:00
2f92719771 Fix: Secret expansion with recursive mode 2024-05-16 00:29:07 +02:00
399ca7a221 Merge pull request from justin1121/patch-1
Update secret-versioning.mdx
2024-05-15 15:34:03 +05:30
29f37295e1 docs: added revoke token api to api-reference 2024-05-15 15:27:26 +05:30
e3184a5f40 feat(api): added revoke access token endpoint 2024-05-15 15:26:38 +05:30
ace008f44e Make rejectUnauthorized true if ca cert is passed for k8s auth method 2024-05-14 22:49:37 -07:00
4afd95fe1a Merge pull request from akhilmhdh/feat/sync-integration-inline
Secret reference and integration sync support
2024-05-15 01:36:19 -04:00
3cd719f6b0 update index secret references button 2024-05-15 09:57:24 +05:30
c6352cc970 updated texts and comments 2024-05-15 09:57:24 +05:30
d4555f9698 feat: ui for reindex secret reference 2024-05-15 09:57:24 +05:30
393964c4ae feat: implemented inline secret reference integration sync 2024-05-15 09:57:23 +05:30
e4afbe8662 Update k8s auth docs 2024-05-14 20:44:09 -07:00
0d89aa8607 Add docs for K8s auth method 2024-05-14 18:02:05 -07:00
2b91ec5ae9 Fix merge conflicts 2024-05-14 13:37:39 -07:00
c438479246 update prod pipeline names 2024-05-14 16:14:42 -04:00
9828cbbfbe Update secret-versioning.mdx 2024-05-14 16:28:43 -03:00
cd910a2fac Update k8s auth impl to be able to test ca, tokenReviewerjwt locally 2024-05-14 11:42:26 -07:00
fc1dffd7e2 Merge pull request from Infisical/snyk-fix-a2a4b055e42c14d5cbdb505e7670d300
[Snyk] Security upgrade bullmq from 5.3.3 to 5.4.2
2024-05-14 12:02:13 -04:00
55f8198a2d Merge pull request from matthewaerose/patch-1
Fix: Correct typo from 'Halm' to 'Helm'
2024-05-14 11:46:49 -04:00
4d166402df Merge pull request from Infisical/create-pull-request/patch-1715660210
GH Action: rename new migration file timestamp
2024-05-14 00:17:34 -04:00
19edf83dbc chore: renamed new migration files to latest timestamp (gh-action) 2024-05-14 04:16:49 +00:00
13f6b238e7 fix: backend/package.json & backend/package-lock.json to reduce vulnerabilities
The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-JS-BRACES-6838727
- https://snyk.io/vuln/SNYK-JS-MICROMATCH-6838728
2024-05-14 04:16:40 +00:00
8dee1f8fc7 Merge pull request from Infisical/gcp-iam-auth
GCP Native Authentication Method
2024-05-14 00:16:28 -04:00
3b23035dfb disable secret scanning 2024-05-13 23:12:36 -04:00
0c8ef13d8d Fix: Correct typo from 'Halm' to 'Helm' 2024-05-13 13:38:09 -05:00
389d51fa5c Merge pull request from akhilmhdh/feat/hide-secret-scanner
feat: added secret-scanning disable option
2024-05-13 13:53:35 -04:00
638208e9fa update secret scanning text 2024-05-13 13:48:23 -04:00
c176d1e4f7 Merge pull request from akhilmhdh/fix/patches-v2
Improvised secret input component and fontawesome performance improvement
2024-05-13 13:42:30 -04:00
91a23a608e feat: added secret-scanning disable option 2024-05-13 21:55:37 +05:30
c6a25271dd fix: changed cross key to check for submission for save secret changes 2024-05-13 19:50:38 +05:30
0f5c1340d3 feat: dashboard optimized on font awesome levels using symbols technique 2024-05-13 13:40:59 +05:30
ecbdae110d feat: simplified secret input with auto completion 2024-05-13 13:40:59 +05:30
8ef727b4ec fix: resolved typo in dashboard nav header redirection 2024-05-13 13:40:59 +05:30
c6f24dbb5e fix: resolved unique key error secret input rendering 2024-05-13 13:40:59 +05:30
c45dae4137 Merge remote-tracking branch 'origin' into k8s-auth 2024-05-12 16:16:44 -07:00
18c0d2fd6f Merge pull request from Infisical/aws-integration-patch
Allow updating tags in AWS Secret Manager integration
2024-05-12 15:03:19 -07:00
c1fb8f47bf Add UntagResource IAM policy requirement for AWS SM integration docs 2024-05-12 08:57:41 -07:00
bd57a068d1 Fix merge conflicts 2024-05-12 08:43:29 -07:00
990eddeb32 Merge pull request from akhilmhdh/fix/remove-migration-notice
fix: removed migration notice
2024-05-11 13:43:04 -04:00
ce01f8d099 fix: removed migration notice 2024-05-11 23:04:43 +05:30
faf6708b00 Merge pull request from akhilmhdh/fix/migration-mode-patch-v1
feat: maintenance mode enable machine identity login and renew
2024-05-11 11:26:21 -04:00
a58d6ebdac feat: maintenance mode enable machine identity login and renew 2024-05-11 20:54:00 +05:30
818b136836 Make app and appId optional in update integration endpoint 2024-05-10 19:17:40 -07:00
0cdade6a2d Update AWS SM integration to allow updating tags 2024-05-10 19:07:44 -07:00
bcf9b68e2b Update GCP auth method description 2024-05-10 10:28:29 -07:00
6aa9fb6ecd Updated docs 2024-05-10 10:24:29 -07:00
38e7382d85 Remove GCP audit log space 2024-05-10 10:15:43 -07:00
95e12287c2 Minor edits to renaming GCE -> ID Token 2024-05-10 10:14:13 -07:00
c6d14a4bea Update 2024-05-10 10:10:51 -07:00
0a91586904 Remove service account JSON requirement from GCP Auth 2024-05-10 09:56:35 -07:00
6561a9c7be Merge pull request from Infisical/feat/add-support-for-secret-folder-rename-overview
Feature: add support for secret folder rename in the overview page
2024-05-10 23:07:14 +08:00
86aaa486b4 Update secret-folder-service.ts 2024-05-10 17:00:30 +02:00
9880977098 misc: addressed naming suggestion 2024-05-10 22:52:08 +08:00
b93aaffe77 adjustment: updated to use project slug 2024-05-10 22:34:16 +08:00
1ea0d55dd1 Merge pull request from Infisical/misc/update-documentation-for-github-integration
misc: updated documentation for github integration to include official action
2024-05-10 09:14:14 -04:00
0866a90c8e misc: updated documentation for github integration 2024-05-10 16:29:12 +08:00
3fff272cb3 feat: added snapshot for batch 2024-05-10 15:46:31 +08:00
2559809eac misc: addressed formatting issues 2024-05-10 14:41:35 +08:00
f32abbdc25 feat: integrate overview folder rename with new batch endpoint 2024-05-10 14:00:49 +08:00
a6f750fafb feat: added batch update endpoint for folders 2024-05-10 13:57:00 +08:00
610f474ecc Rename migration file 2024-05-09 16:58:39 -07:00
03f4a699e6 Improve GCP docs 2024-05-09 16:53:08 -07:00
533d49304a Update GCP documentation 2024-05-09 15:35:50 -07:00
184b59ad1d Resolve merge conflicts 2024-05-09 12:51:24 -07:00
b4a2123fa3 Merge pull request from Infisical/delete-pg-migrator
Delete PG migrator folder
2024-05-09 15:19:04 -04:00
79cacfa89c Delete PG migrator folder 2024-05-09 12:16:13 -07:00
44531487d6 Merge pull request from Infisical/maidul-pacth233
revert schema name for memberships-unique-constraint
2024-05-09 13:46:32 -04:00
7c77a4f049 revert schema name 2024-05-09 13:42:23 -04:00
9dfb587032 Merge pull request from Infisical/check-saml-email-verification
Update isEmailVerified field upon invite signups
2024-05-09 13:03:52 -04:00
3952ad9a2e Update isEmailVerified field upon invite signups 2024-05-09 09:51:53 -07:00
9c15cb407d Merge pull request from Infisical/aws-non-delete
Add option to not delete secrets in parameter store
2024-05-09 21:56:48 +05:30
cb17efa10b Merge pull request from akhilmhdh/fix/patches-v2
Workspace slug support in secret v3 Get Key
2024-05-09 12:17:14 -04:00
4adc2c4927 update api descriptions 2024-05-09 12:11:46 -04:00
1a26b34ad8 Merge pull request from Infisical/revise-aws-auth
Reframe AWS IAM auth to AWS Auth with IAM type
2024-05-09 12:06:31 -04:00
21c339d27a fix: better error message on ua based login error 2024-05-09 21:32:09 +05:30
1da4cf85f8 rename schema file 2024-05-09 11:59:47 -04:00
20f29c752d fix: added workspaceSlug support get secret by key 2024-05-09 21:23:57 +05:30
29ea12f8b1 Merge pull request from Infisical/mermaid-universal-auth
Add mermaid diagram for Universal Auth
2024-05-08 22:05:12 -07:00
b4f1cce587 Add mermaid diagram for universal auth 2024-05-08 22:03:57 -07:00
5a92520ca3 Update build-staging-and-deploy-aws.yml 2024-05-09 00:53:42 -04:00
42471b22bb Finish AWS Auth mermaid diagram 2024-05-08 21:52:56 -07:00
79704e9c98 add option to not delete secrets in parameter store 2024-05-08 21:49:09 -07:00
1165d11816 Update build-staging-and-deploy-aws.yml 2024-05-09 00:27:21 -04:00
15ea96815c Rename AWS IAM auth to AWS Auth with IAM type 2024-05-08 21:22:23 -07:00
86d4d88b58 package json lock 2024-05-09 00:19:44 -04:00
a12ad91e59 Update build-staging-and-deploy-aws.yml 2024-05-09 00:15:42 -04:00
3113e40d0b Add mermaid diagrams to gcp auth docs 2024-05-08 20:09:08 -07:00
2406d3d904 Update GCP auth docs 2024-05-08 17:03:26 -07:00
e99182c141 Complete adding GCP GCE auth 2024-05-08 15:51:09 -07:00
522dd0836e feat: added validation for folder name duplicates 2024-05-08 23:25:33 +08:00
e461787c78 feat: added support for renaming folders in the overview page 2024-05-08 23:24:33 +08:00
f74993e850 Merge pull request from Infisical/misc/improved-select-path-component-ux-1
misc: added handling of input focus to select path component
2024-05-08 22:00:02 +08:00
d0036a5656 Merge remote-tracking branch 'origin/main' into misc/improved-select-path-component-ux-1 2024-05-08 17:28:31 +08:00
e7f19421ef misc: resolved auto-popup of suggestions 2024-05-08 17:24:06 +08:00
e18d830fe8 Merge pull request from Infisical/daniel/k8-recursive
Feat: Recursive support for K8 operator
2024-05-08 00:44:07 +02:00
be2fc4fec4 Update Chart.yaml 2024-05-08 00:42:38 +02:00
829dbb9970 Update values.yaml 2024-05-08 00:41:53 +02:00
0b012c5dfb Chore: Helm 2024-05-08 00:23:50 +02:00
b0421ccad0 Docs: Add recursive to example 2024-05-08 00:21:08 +02:00
6b83326d00 Feat: Recursive mode support 2024-05-08 00:18:53 +02:00
1f6abc7f27 Feat: Recursive mode and fix error formatting 2024-05-08 00:18:40 +02:00
4a02520147 Update sample 2024-05-08 00:18:26 +02:00
14f38eb961 Feat: Recursive mode types 2024-05-08 00:16:51 +02:00
ac469dbe4f Update GCP auth docs 2024-05-07 14:58:14 -07:00
d98430fe07 Merge remote-tracking branch 'origin' into gcp-iam-auth 2024-05-07 14:29:08 -07:00
82bafd02bb Fix merge conflicts 2024-05-07 14:28:41 -07:00
37a59b2576 Merge pull request from Infisical/create-pull-request/patch-1715116016
GH Action: rename new migration file timestamp
2024-05-07 14:27:45 -07:00
cebd22da8e chore: renamed new migration files to latest timestamp (gh-action) 2024-05-07 21:06:55 +00:00
d200405c6e Merge pull request from Infisical/aws-iam-auth
AWS IAM Authentication Method
2024-05-07 14:06:30 -07:00
3a1cdc4f44 Delete backend/src/db/migrations/20240507162149_test.ts 2024-05-07 15:41:09 -04:00
1d40d9e448 Begin frontend for GCP IAM Auth 2024-05-07 12:40:19 -07:00
e96ca8d355 Draft GCP IAM Auth docs 2024-05-07 12:15:18 -07:00
2929d94f0a Merge pull request from Infisical/maidul98-patch-10
test
2024-05-07 14:28:03 -04:00
0383ae9e8b Create 20240507162149_test.ts 2024-05-07 14:27:44 -04:00
00faa6257f Delete backend/src/db/migrations/20240507162149_test.ts 2024-05-07 14:27:33 -04:00
183bde55ca correctly fetch merged by user login 2024-05-07 14:26:56 -04:00
c96fc1f798 Merge pull request from Infisical/maidul98-patch-9
test
2024-05-07 14:09:49 -04:00
80f7ff1ea8 Create 20240507162149_test.ts 2024-05-07 14:09:38 -04:00
c87620109b Rename 20240507162141_access to 20240507162141_access.ts 2024-05-07 13:58:10 -04:00
02c158b4ed Delete backend/src/db/migrations/20240507162180_test 2024-05-07 13:47:25 -04:00
588f4bdb09 Fix merge conflict 2024-05-07 10:45:07 -07:00
4d74d264dd Finish preliminary backend endpoints for GCP IAM Auth method 2024-05-07 10:42:39 -07:00
ddfa64eb33 Merge pull request from Infisical/maidul98-patch-8
testing-ignore
2024-05-07 13:27:19 -04:00
8bab14a672 misc: added handling of input focus 2024-05-08 00:43:14 +08:00
e3c80309c3 Move aws auth migration file to front 2024-05-06 23:03:45 -07:00
ec3d6c20e8 Merge remote-tracking branch 'origin' into aws-iam-auth 2024-05-06 22:58:47 -07:00
5d7c0f30c8 Fix typo universal auth 2024-05-06 22:58:35 -07:00
0b089e6fa6 Update aws iam auth fns filename 2024-05-06 18:35:34 -07:00
c276c44c08 Finish preliminary backend endpoints / db structure for k8s auth 2024-05-05 19:14:49 -07:00
cbf8e041e9 Finish docs for AWS IAM Auth, update ARN regex 2024-05-03 17:20:44 -07:00
5c4d35e30a Merge remote-tracking branch 'origin' into aws-iam-auth 2024-05-02 22:53:14 -07:00
d5c74d558a Start docs for AWS IAM auth 2024-05-02 22:52:37 -07:00
9c002ad645 Finish preliminary AWS IAM Auth method 2024-05-02 22:42:02 -07:00
345 changed files with 9168 additions and 15640 deletions
.github/workflows
backend
package-lock.json
package.json
src
@types
db
ee
lib
api-docs
config
logger
queue
server
services
docker-compose.dev.yml
docs
frontend/src
components
hooks/api
pages
integrations/aws-parameter-store
org/[id]
overview
secret-scanning
styles
views
IntegrationsPage/components/IntegrationsSection
Org/MembersPage/components/OrgIdentityTab/components/IdentitySection
SecretMainPage/components/SecretListView
SecretOverviewPage
SecretOverviewPage.tsx
components/SecretOverviewFolderRow
Settings/ProjectSettingsPage/components
helm-charts
k8-operator
pg-migrator
.gitignore
package-lock.json
package.json
src
@types
audit-log-migrator.ts
folder.ts
index.ts
migrations
models
rollback.ts
schemas
utils.ts
tsconfig.json

@ -74,21 +74,21 @@ jobs:
uses: pr-mpt/actions-commit-hash@v2
- name: Download task definition
run: |
aws ecs describe-task-definition --task-definition infisical-prod-platform --query taskDefinition > task-definition.json
aws ecs describe-task-definition --task-definition infisical-core-platform --query taskDefinition > task-definition.json
- name: Render Amazon ECS task definition
id: render-web-container
uses: aws-actions/amazon-ecs-render-task-definition@v1
with:
task-definition: task-definition.json
container-name: infisical-prod-platform
container-name: infisical-core-platform
image: infisical/staging_infisical:${{ steps.commit.outputs.short }}
environment-variables: "LOG_LEVEL=info"
- name: Deploy to Amazon ECS service
uses: aws-actions/amazon-ecs-deploy-task-definition@v1
with:
task-definition: ${{ steps.render-web-container.outputs.task-definition }}
service: infisical-prod-platform
cluster: infisical-prod-platform
service: infisical-core-platform
cluster: infisical-core-platform
wait-for-service-stability: true
production-postgres-deployment:
@ -122,19 +122,19 @@ jobs:
uses: pr-mpt/actions-commit-hash@v2
- name: Download task definition
run: |
aws ecs describe-task-definition --task-definition infisical-prod-platform --query taskDefinition > task-definition.json
aws ecs describe-task-definition --task-definition infisical-core-platform --query taskDefinition > task-definition.json
- name: Render Amazon ECS task definition
id: render-web-container
uses: aws-actions/amazon-ecs-render-task-definition@v1
with:
task-definition: task-definition.json
container-name: infisical-prod-platform
container-name: infisical-core-platform
image: infisical/staging_infisical:${{ steps.commit.outputs.short }}
environment-variables: "LOG_LEVEL=info"
- name: Deploy to Amazon ECS service
uses: aws-actions/amazon-ecs-deploy-task-definition@v1
with:
task-definition: ${{ steps.render-web-container.outputs.task-definition }}
service: infisical-prod-platform
cluster: infisical-prod-platform
service: infisical-core-platform
cluster: infisical-core-platform
wait-for-service-stability: true

@ -38,13 +38,15 @@ jobs:
rm added_files.txt
git commit -m "chore: renamed new migration files to latest timestamp (gh-action)"
- name: Get the username of the person who closed the PR
run: |
- name: Get PR details
id: pr_details
run: |
PR_NUMBER=${{ github.event.pull_request.number }}
PR_CLOSER=$(curl -s -H "Authorization: Bearer ${{ secrets.GITHUB_TOKEN }}" "https://api.github.com/repos/${{ github.repository }}/pulls/$PR_NUMBER" | jq -r '.closed_by.login')
PR_MERGER=$(curl -s "https://api.github.com/repos/${{ github.repository }}/pulls/$PR_NUMBER" | jq -r '.merged_by.login')
echo "PR Number: $PR_NUMBER"
echo "PR Closer: $PR_CLOSER"
echo "pr_closer=$PR_CLOSER" >> $GITHUB_OUTPUT
echo "PR Merger: $PR_MERGER"
echo "pr_merger=$PR_MERGER" >> $GITHUB_OUTPUT
- name: Create Pull Request
if: env.SKIP_RENAME != 'true'
@ -54,3 +56,4 @@ jobs:
commit-message: 'chore: renamed new migration files to latest UTC (gh-action)'
title: 'GH Action: rename new migration file timestamp'
branch-suffix: timestamp
reviewers: ${{ steps.pr_details.outputs.pr_merger }}

File diff suppressed because it is too large

@ -95,11 +95,13 @@
"axios": "^1.6.7",
"axios-retry": "^4.0.0",
"bcrypt": "^5.1.1",
"bullmq": "^5.3.3",
"bullmq": "^5.4.2",
"cassandra-driver": "^4.7.2",
"dotenv": "^16.4.1",
"fastify": "^4.26.0",
"fastify-plugin": "^4.5.1",
"google-auth-library": "^9.9.0",
"googleapis": "^137.1.0",
"handlebars": "^4.7.8",
"ioredis": "^5.3.2",
"jmespath": "^0.16.0",

@ -32,6 +32,9 @@ import { TAuthTokenServiceFactory } from "@app/services/auth-token/auth-token-se
import { TGroupProjectServiceFactory } from "@app/services/group-project/group-project-service";
import { TIdentityServiceFactory } from "@app/services/identity/identity-service";
import { TIdentityAccessTokenServiceFactory } from "@app/services/identity-access-token/identity-access-token-service";
import { TIdentityAwsAuthServiceFactory } from "@app/services/identity-aws-auth/identity-aws-auth-service";
import { TIdentityGcpAuthServiceFactory } from "@app/services/identity-gcp-auth/identity-gcp-auth-service";
import { TIdentityKubernetesAuthServiceFactory } from "@app/services/identity-kubernetes-auth/identity-kubernetes-auth-service";
import { TIdentityProjectServiceFactory } from "@app/services/identity-project/identity-project-service";
import { TIdentityUaServiceFactory } from "@app/services/identity-ua/identity-ua-service";
import { TIntegrationServiceFactory } from "@app/services/integration/integration-service";
@ -115,6 +118,9 @@ declare module "fastify" {
identityAccessToken: TIdentityAccessTokenServiceFactory;
identityProject: TIdentityProjectServiceFactory;
identityUa: TIdentityUaServiceFactory;
identityKubernetesAuth: TIdentityKubernetesAuthServiceFactory;
identityGcpAuth: TIdentityGcpAuthServiceFactory;
identityAwsAuth: TIdentityAwsAuthServiceFactory;
accessApprovalPolicy: TAccessApprovalPolicyServiceFactory;
accessApprovalRequest: TAccessApprovalRequestServiceFactory;
secretApprovalPolicy: TSecretApprovalPolicyServiceFactory;

@ -59,6 +59,15 @@ import {
TIdentityAccessTokens,
TIdentityAccessTokensInsert,
TIdentityAccessTokensUpdate,
TIdentityAwsAuths,
TIdentityAwsAuthsInsert,
TIdentityAwsAuthsUpdate,
TIdentityGcpAuths,
TIdentityGcpAuthsInsert,
TIdentityGcpAuthsUpdate,
TIdentityKubernetesAuths,
TIdentityKubernetesAuthsInsert,
TIdentityKubernetesAuthsUpdate,
TIdentityOrgMemberships,
TIdentityOrgMembershipsInsert,
TIdentityOrgMembershipsUpdate,
@ -225,6 +234,7 @@ import {
TWebhooksInsert,
TWebhooksUpdate
} from "@app/db/schemas";
import { TSecretReferences, TSecretReferencesInsert, TSecretReferencesUpdate } from "@app/db/schemas/secret-references";
declare module "knex/types/tables" {
interface Tables {
@ -298,6 +308,11 @@ declare module "knex/types/tables" {
>;
[TableName.ProjectKeys]: Knex.CompositeTableType<TProjectKeys, TProjectKeysInsert, TProjectKeysUpdate>;
[TableName.Secret]: Knex.CompositeTableType<TSecrets, TSecretsInsert, TSecretsUpdate>;
[TableName.SecretReference]: Knex.CompositeTableType<
TSecretReferences,
TSecretReferencesInsert,
TSecretReferencesUpdate
>;
[TableName.SecretBlindIndex]: Knex.CompositeTableType<
TSecretBlindIndexes,
TSecretBlindIndexesInsert,
@ -326,6 +341,21 @@ declare module "knex/types/tables" {
TIdentityUniversalAuthsInsert,
TIdentityUniversalAuthsUpdate
>;
[TableName.IdentityKubernetesAuth]: Knex.CompositeTableType<
TIdentityKubernetesAuths,
TIdentityKubernetesAuthsInsert,
TIdentityKubernetesAuthsUpdate
>;
[TableName.IdentityGcpAuth]: Knex.CompositeTableType<
TIdentityGcpAuths,
TIdentityGcpAuthsInsert,
TIdentityGcpAuthsUpdate
>;
[TableName.IdentityAwsAuth]: Knex.CompositeTableType<
TIdentityAwsAuths,
TIdentityAwsAuthsInsert,
TIdentityAwsAuthsUpdate
>;
[TableName.IdentityUaClientSecret]: Knex.CompositeTableType<
TIdentityUaClientSecrets,
TIdentityUaClientSecretsInsert,

@ -0,0 +1,30 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.IdentityAwsAuth))) {
await knex.schema.createTable(TableName.IdentityAwsAuth, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.bigInteger("accessTokenTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenMaxTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenNumUsesLimit").defaultTo(0).notNullable();
t.jsonb("accessTokenTrustedIps").notNullable();
t.timestamps(true, true, true);
t.uuid("identityId").notNullable().unique();
t.foreign("identityId").references("id").inTable(TableName.Identity).onDelete("CASCADE");
t.string("type").notNullable();
t.string("stsEndpoint").notNullable();
t.string("allowedPrincipalArns").notNullable();
t.string("allowedAccountIds").notNullable();
});
}
await createOnUpdateTrigger(knex, TableName.IdentityAwsAuth);
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.IdentityAwsAuth);
await dropOnUpdateTrigger(knex, TableName.IdentityAwsAuth);
}

@ -0,0 +1,30 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.IdentityGcpAuth))) {
await knex.schema.createTable(TableName.IdentityGcpAuth, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.bigInteger("accessTokenTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenMaxTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenNumUsesLimit").defaultTo(0).notNullable();
t.jsonb("accessTokenTrustedIps").notNullable();
t.timestamps(true, true, true);
t.uuid("identityId").notNullable().unique();
t.foreign("identityId").references("id").inTable(TableName.Identity).onDelete("CASCADE");
t.string("type").notNullable();
t.string("allowedServiceAccounts").notNullable();
t.string("allowedProjects").notNullable();
t.string("allowedZones").notNullable(); // GCE only (fully qualified zone names)
});
}
await createOnUpdateTrigger(knex, TableName.IdentityGcpAuth);
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.IdentityGcpAuth);
await dropOnUpdateTrigger(knex, TableName.IdentityGcpAuth);
}

@ -0,0 +1,24 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.SecretReference))) {
await knex.schema.createTable(TableName.SecretReference, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.string("environment").notNullable();
t.string("secretPath").notNullable();
t.uuid("secretId").notNullable();
t.foreign("secretId").references("id").inTable(TableName.Secret).onDelete("CASCADE");
t.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.SecretReference);
}
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.SecretReference);
await dropOnUpdateTrigger(knex, TableName.SecretReference);
}

@ -0,0 +1,36 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.IdentityKubernetesAuth))) {
await knex.schema.createTable(TableName.IdentityKubernetesAuth, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.bigInteger("accessTokenTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenMaxTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenNumUsesLimit").defaultTo(0).notNullable();
t.jsonb("accessTokenTrustedIps").notNullable();
t.timestamps(true, true, true);
t.uuid("identityId").notNullable().unique();
t.foreign("identityId").references("id").inTable(TableName.Identity).onDelete("CASCADE");
t.string("kubernetesHost").notNullable();
t.text("encryptedCaCert").notNullable();
t.string("caCertIV").notNullable();
t.string("caCertTag").notNullable();
t.text("encryptedTokenReviewerJwt").notNullable();
t.string("tokenReviewerJwtIV").notNullable();
t.string("tokenReviewerJwtTag").notNullable();
t.string("allowedNamespaces").notNullable();
t.string("allowedNames").notNullable();
t.string("allowedAudience").notNullable();
});
}
await createOnUpdateTrigger(knex, TableName.IdentityKubernetesAuth);
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.IdentityKubernetesAuth);
await dropOnUpdateTrigger(knex, TableName.IdentityKubernetesAuth);
}

@ -0,0 +1,43 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
export async function up(knex: Knex): Promise<void> {
const hasIsSyncedColumn = await knex.schema.hasColumn(TableName.Integration, "isSynced");
const hasSyncMessageColumn = await knex.schema.hasColumn(TableName.Integration, "syncMessage");
const hasLastSyncJobId = await knex.schema.hasColumn(TableName.Integration, "lastSyncJobId");
await knex.schema.alterTable(TableName.Integration, (t) => {
if (!hasIsSyncedColumn) {
t.boolean("isSynced").nullable();
}
if (!hasSyncMessageColumn) {
t.text("syncMessage").nullable();
}
if (!hasLastSyncJobId) {
t.string("lastSyncJobId").nullable();
}
});
}
export async function down(knex: Knex): Promise<void> {
const hasIsSyncedColumn = await knex.schema.hasColumn(TableName.Integration, "isSynced");
const hasSyncMessageColumn = await knex.schema.hasColumn(TableName.Integration, "syncMessage");
const hasLastSyncJobId = await knex.schema.hasColumn(TableName.Integration, "lastSyncJobId");
await knex.schema.alterTable(TableName.Integration, (t) => {
if (hasIsSyncedColumn) {
t.dropColumn("isSynced");
}
if (hasSyncMessageColumn) {
t.dropColumn("syncMessage");
}
if (hasLastSyncJobId) {
t.dropColumn("lastSyncJobId");
}
});
}

@ -11,8 +11,8 @@ export const AccessApprovalPoliciesSchema = z.object({
id: z.string().uuid(),
name: z.string(),
approvals: z.number().default(1),
envId: z.string().uuid(),
secretPath: z.string().nullable().optional(),
envId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
});

@ -0,0 +1,27 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const IdentityAwsAuthsSchema = z.object({
id: z.string().uuid(),
accessTokenTTL: z.coerce.number().default(7200),
accessTokenMaxTTL: z.coerce.number().default(7200),
accessTokenNumUsesLimit: z.coerce.number().default(0),
accessTokenTrustedIps: z.unknown(),
createdAt: z.date(),
updatedAt: z.date(),
identityId: z.string().uuid(),
type: z.string(),
stsEndpoint: z.string(),
allowedPrincipalArns: z.string(),
allowedAccountIds: z.string()
});
export type TIdentityAwsAuths = z.infer<typeof IdentityAwsAuthsSchema>;
export type TIdentityAwsAuthsInsert = Omit<z.input<typeof IdentityAwsAuthsSchema>, TImmutableDBKeys>;
export type TIdentityAwsAuthsUpdate = Partial<Omit<z.input<typeof IdentityAwsAuthsSchema>, TImmutableDBKeys>>;

@ -0,0 +1,27 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const IdentityGcpAuthsSchema = z.object({
id: z.string().uuid(),
accessTokenTTL: z.coerce.number().default(7200),
accessTokenMaxTTL: z.coerce.number().default(7200),
accessTokenNumUsesLimit: z.coerce.number().default(0),
accessTokenTrustedIps: z.unknown(),
createdAt: z.date(),
updatedAt: z.date(),
identityId: z.string().uuid(),
type: z.string(),
allowedServiceAccounts: z.string(),
allowedProjects: z.string(),
allowedZones: z.string()
});
export type TIdentityGcpAuths = z.infer<typeof IdentityGcpAuthsSchema>;
export type TIdentityGcpAuthsInsert = Omit<z.input<typeof IdentityGcpAuthsSchema>, TImmutableDBKeys>;
export type TIdentityGcpAuthsUpdate = Partial<Omit<z.input<typeof IdentityGcpAuthsSchema>, TImmutableDBKeys>>;

@ -0,0 +1,35 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const IdentityKubernetesAuthsSchema = z.object({
id: z.string().uuid(),
accessTokenTTL: z.coerce.number().default(7200),
accessTokenMaxTTL: z.coerce.number().default(7200),
accessTokenNumUsesLimit: z.coerce.number().default(0),
accessTokenTrustedIps: z.unknown(),
createdAt: z.date(),
updatedAt: z.date(),
identityId: z.string().uuid(),
kubernetesHost: z.string(),
encryptedCaCert: z.string(),
caCertIV: z.string(),
caCertTag: z.string(),
encryptedTokenReviewerJwt: z.string(),
tokenReviewerJwtIV: z.string(),
tokenReviewerJwtTag: z.string(),
allowedNamespaces: z.string(),
allowedNames: z.string(),
allowedAudience: z.string()
});
export type TIdentityKubernetesAuths = z.infer<typeof IdentityKubernetesAuthsSchema>;
export type TIdentityKubernetesAuthsInsert = Omit<z.input<typeof IdentityKubernetesAuthsSchema>, TImmutableDBKeys>;
export type TIdentityKubernetesAuthsUpdate = Partial<
Omit<z.input<typeof IdentityKubernetesAuthsSchema>, TImmutableDBKeys>
>;

@ -17,6 +17,9 @@ export * from "./group-project-memberships";
export * from "./groups";
export * from "./identities";
export * from "./identity-access-tokens";
export * from "./identity-aws-auths";
export * from "./identity-gcp-auths";
export * from "./identity-kubernetes-auths";
export * from "./identity-org-memberships";
export * from "./identity-project-additional-privilege";
export * from "./identity-project-membership-role";

@ -28,7 +28,10 @@ export const IntegrationsSchema = z.object({
secretPath: z.string().default("/"),
createdAt: z.date(),
updatedAt: z.date(),
lastUsed: z.date().nullable().optional()
lastUsed: z.date().nullable().optional(),
isSynced: z.boolean().nullable().optional(),
syncMessage: z.string().nullable().optional(),
lastSyncJobId: z.string().nullable().optional()
});
export type TIntegrations = z.infer<typeof IntegrationsSchema>;

@ -28,6 +28,7 @@ export enum TableName {
ProjectUserMembershipRole = "project_user_membership_roles",
ProjectKeys = "project_keys",
Secret = "secrets",
SecretReference = "secret_references",
SecretBlindIndex = "secret_blind_indexes",
SecretVersion = "secret_versions",
SecretFolder = "secret_folders",
@ -44,7 +45,10 @@ export enum TableName {
Identity = "identities",
IdentityAccessToken = "identity_access_tokens",
IdentityUniversalAuth = "identity_universal_auths",
IdentityKubernetesAuth = "identity_kubernetes_auths",
IdentityGcpAuth = "identity_gcp_auths",
IdentityUaClientSecret = "identity_ua_client_secrets",
IdentityAwsAuth = "identity_aws_auths",
IdentityOrgMembership = "identity_org_memberships",
IdentityProjectMembership = "identity_project_memberships",
IdentityProjectMembershipRole = "identity_project_membership_role",
@ -142,5 +146,8 @@ export enum ProjectUpgradeStatus {
}
export enum IdentityAuthMethod {
Univeral = "universal-auth"
Univeral = "universal-auth",
KUBERNETES_AUTH = "kubernetes-auth",
GCP_AUTH = "gcp-auth",
AWS_AUTH = "aws-auth"
}

@ -0,0 +1,21 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const SecretReferencesSchema = z.object({
id: z.string().uuid(),
environment: z.string(),
secretPath: z.string(),
secretId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TSecretReferences = z.infer<typeof SecretReferencesSchema>;
export type TSecretReferencesInsert = Omit<z.input<typeof SecretReferencesSchema>, TImmutableDBKeys>;
export type TSecretReferencesUpdate = Partial<Omit<z.input<typeof SecretReferencesSchema>, TImmutableDBKeys>>;

@ -22,7 +22,7 @@ export const UsersSchema = z.object({
updatedAt: z.date(),
isGhost: z.boolean().default(false),
username: z.string(),
isEmailVerified: z.boolean().nullable().optional()
isEmailVerified: z.boolean().default(false).nullable().optional()
});
export type TUsers = z.infer<typeof UsersSchema>;

@ -8,7 +8,7 @@ import { IDENTITY_ADDITIONAL_PRIVILEGE } from "@app/lib/api-docs";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { PermissionSchema, SanitizedIdentityPrivilegeSchema } from "@app/server/routes/sanitizedSchemas";
import { ProjectPermissionSchema, SanitizedIdentityPrivilegeSchema } from "@app/server/routes/sanitizedSchemas";
import { AuthMode } from "@app/services/auth/auth-type";
export const registerIdentityProjectAdditionalPrivilegeRouter = async (server: FastifyZodProvider) => {
@ -39,7 +39,7 @@ export const registerIdentityProjectAdditionalPrivilegeRouter = async (server: F
})
.optional()
.describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.slug),
permissions: PermissionSchema.array().describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.permissions)
permissions: ProjectPermissionSchema.array().describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.permissions)
}),
response: {
200: z.object({
@ -90,7 +90,7 @@ export const registerIdentityProjectAdditionalPrivilegeRouter = async (server: F
})
.optional()
.describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.slug),
permissions: PermissionSchema.array().describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.permissions),
permissions: ProjectPermissionSchema.array().describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.permissions),
temporaryMode: z
.nativeEnum(IdentityProjectAdditionalPrivilegeTemporaryMode)
.describe(IDENTITY_ADDITIONAL_PRIVILEGE.CREATE.temporaryMode),
@ -155,7 +155,7 @@ export const registerIdentityProjectAdditionalPrivilegeRouter = async (server: F
message: "Slug must be a valid slug"
})
.describe(IDENTITY_ADDITIONAL_PRIVILEGE.UPDATE.newSlug),
permissions: PermissionSchema.array().describe(IDENTITY_ADDITIONAL_PRIVILEGE.UPDATE.permissions),
permissions: ProjectPermissionSchema.array().describe(IDENTITY_ADDITIONAL_PRIVILEGE.UPDATE.permissions),
isTemporary: z.boolean().describe(IDENTITY_ADDITIONAL_PRIVILEGE.UPDATE.isTemporary),
temporaryMode: z
.nativeEnum(IdentityProjectAdditionalPrivilegeTemporaryMode)

@ -51,6 +51,7 @@ export enum EventType {
UNAUTHORIZE_INTEGRATION = "unauthorize-integration",
CREATE_INTEGRATION = "create-integration",
DELETE_INTEGRATION = "delete-integration",
MANUAL_SYNC_INTEGRATION = "manual-sync-integration",
ADD_TRUSTED_IP = "add-trusted-ip",
UPDATE_TRUSTED_IP = "update-trusted-ip",
DELETE_TRUSTED_IP = "delete-trusted-ip",
@ -63,9 +64,21 @@ export enum EventType {
ADD_IDENTITY_UNIVERSAL_AUTH = "add-identity-universal-auth",
UPDATE_IDENTITY_UNIVERSAL_AUTH = "update-identity-universal-auth",
GET_IDENTITY_UNIVERSAL_AUTH = "get-identity-universal-auth",
LOGIN_IDENTITY_KUBERNETES_AUTH = "login-identity-kubernetes-auth",
ADD_IDENTITY_KUBERNETES_AUTH = "add-identity-kubernetes-auth",
UPDATE_IDENTITY_KUBENETES_AUTH = "update-identity-kubernetes-auth",
GET_IDENTITY_KUBERNETES_AUTH = "get-identity-kubernetes-auth",
CREATE_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRET = "create-identity-universal-auth-client-secret",
REVOKE_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRET = "revoke-identity-universal-auth-client-secret",
GET_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRETS = "get-identity-universal-auth-client-secret",
LOGIN_IDENTITY_GCP_AUTH = "login-identity-gcp-auth",
ADD_IDENTITY_GCP_AUTH = "add-identity-gcp-auth",
UPDATE_IDENTITY_GCP_AUTH = "update-identity-gcp-auth",
GET_IDENTITY_GCP_AUTH = "get-identity-gcp-auth",
LOGIN_IDENTITY_AWS_AUTH = "login-identity-aws-auth",
ADD_IDENTITY_AWS_AUTH = "add-identity-aws-auth",
UPDATE_IDENTITY_AWS_AUTH = "update-identity-aws-auth",
GET_IDENTITY_AWS_AUTH = "get-identity-aws-auth",
CREATE_ENVIRONMENT = "create-environment",
UPDATE_ENVIRONMENT = "update-environment",
DELETE_ENVIRONMENT = "delete-environment",
@ -269,6 +282,25 @@ interface DeleteIntegrationEvent {
};
}
interface ManualSyncIntegrationEvent {
type: EventType.MANUAL_SYNC_INTEGRATION;
metadata: {
integrationId: string;
integration: string;
environment: string;
secretPath: string;
url?: string;
app?: string;
appId?: string;
targetEnvironment?: string;
targetEnvironmentId?: string;
targetService?: string;
targetServiceId?: string;
path?: string;
region?: string;
};
}
interface AddTrustedIPEvent {
type: EventType.ADD_TRUSTED_IP;
metadata: {
@ -383,6 +415,50 @@ interface GetIdentityUniversalAuthEvent {
};
}
interface LoginIdentityKubernetesAuthEvent {
type: EventType.LOGIN_IDENTITY_KUBERNETES_AUTH;
metadata: {
identityId: string;
identityKubernetesAuthId: string;
identityAccessTokenId: string;
};
}
interface AddIdentityKubernetesAuthEvent {
type: EventType.ADD_IDENTITY_KUBERNETES_AUTH;
metadata: {
identityId: string;
kubernetesHost: string;
allowedNamespaces: string;
allowedNames: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: Array<TIdentityTrustedIp>;
};
}
interface UpdateIdentityKubernetesAuthEvent {
type: EventType.UPDATE_IDENTITY_KUBENETES_AUTH;
metadata: {
identityId: string;
kubernetesHost?: string;
allowedNamespaces?: string;
allowedNames?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: Array<TIdentityTrustedIp>;
};
}
interface GetIdentityKubernetesAuthEvent {
type: EventType.GET_IDENTITY_KUBERNETES_AUTH;
metadata: {
identityId: string;
};
}
interface CreateIdentityUniversalAuthClientSecretEvent {
type: EventType.CREATE_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRET;
metadata: {
@ -406,6 +482,96 @@ interface RevokeIdentityUniversalAuthClientSecretEvent {
};
}
interface LoginIdentityGcpAuthEvent {
type: EventType.LOGIN_IDENTITY_GCP_AUTH;
metadata: {
identityId: string;
identityGcpAuthId: string;
identityAccessTokenId: string;
};
}
interface AddIdentityGcpAuthEvent {
type: EventType.ADD_IDENTITY_GCP_AUTH;
metadata: {
identityId: string;
type: string;
allowedServiceAccounts: string;
allowedProjects: string;
allowedZones: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: Array<TIdentityTrustedIp>;
};
}
interface UpdateIdentityGcpAuthEvent {
type: EventType.UPDATE_IDENTITY_GCP_AUTH;
metadata: {
identityId: string;
type?: string;
allowedServiceAccounts?: string;
allowedProjects?: string;
allowedZones?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: Array<TIdentityTrustedIp>;
};
}
interface GetIdentityGcpAuthEvent {
type: EventType.GET_IDENTITY_GCP_AUTH;
metadata: {
identityId: string;
};
}
interface LoginIdentityAwsAuthEvent {
type: EventType.LOGIN_IDENTITY_AWS_AUTH;
metadata: {
identityId: string;
identityAwsAuthId: string;
identityAccessTokenId: string;
};
}
interface AddIdentityAwsAuthEvent {
type: EventType.ADD_IDENTITY_AWS_AUTH;
metadata: {
identityId: string;
stsEndpoint: string;
allowedPrincipalArns: string;
allowedAccountIds: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: Array<TIdentityTrustedIp>;
};
}
interface UpdateIdentityAwsAuthEvent {
type: EventType.UPDATE_IDENTITY_AWS_AUTH;
metadata: {
identityId: string;
stsEndpoint?: string;
allowedPrincipalArns?: string;
allowedAccountIds?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: Array<TIdentityTrustedIp>;
};
}
interface GetIdentityAwsAuthEvent {
type: EventType.GET_IDENTITY_AWS_AUTH;
metadata: {
identityId: string;
};
}
interface CreateEnvironmentEvent {
type: EventType.CREATE_ENVIRONMENT;
metadata: {
@ -645,6 +811,7 @@ export type Event =
| UnauthorizeIntegrationEvent
| CreateIntegrationEvent
| DeleteIntegrationEvent
| ManualSyncIntegrationEvent
| AddTrustedIPEvent
| UpdateTrustedIPEvent
| DeleteTrustedIPEvent
@ -657,9 +824,21 @@ export type Event =
| AddIdentityUniversalAuthEvent
| UpdateIdentityUniversalAuthEvent
| GetIdentityUniversalAuthEvent
| LoginIdentityKubernetesAuthEvent
| AddIdentityKubernetesAuthEvent
| UpdateIdentityKubernetesAuthEvent
| GetIdentityKubernetesAuthEvent
| CreateIdentityUniversalAuthClientSecretEvent
| GetIdentityUniversalAuthClientSecretsEvent
| RevokeIdentityUniversalAuthClientSecretEvent
| LoginIdentityGcpAuthEvent
| AddIdentityGcpAuthEvent
| UpdateIdentityGcpAuthEvent
| GetIdentityGcpAuthEvent
| LoginIdentityAwsAuthEvent
| AddIdentityAwsAuthEvent
| UpdateIdentityAwsAuthEvent
| GetIdentityAwsAuthEvent
| CreateEnvironmentEvent
| UpdateEnvironmentEvent
| DeleteEnvironmentEvent

@ -7,12 +7,15 @@ import {
SecretType,
TSecretApprovalRequestsSecretsInsert
} from "@app/db/schemas";
import { decryptSymmetric128BitHexKeyUTF8 } from "@app/lib/crypto";
import { BadRequestError, UnauthorizedError } from "@app/lib/errors";
import { groupBy, pick, unique } from "@app/lib/fn";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { ActorType } from "@app/services/auth/auth-type";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotServiceFactory } from "@app/services/project-bot/project-bot-service";
import { TSecretDALFactory } from "@app/services/secret/secret-dal";
import { getAllNestedSecretReferences } from "@app/services/secret/secret-fns";
import { TSecretQueueFactory } from "@app/services/secret/secret-queue";
import { TSecretServiceFactory } from "@app/services/secret/secret-service";
import { TSecretVersionDALFactory } from "@app/services/secret/secret-version-dal";
@ -53,6 +56,7 @@ type TSecretApprovalRequestServiceFactoryDep = {
secretVersionDAL: Pick<TSecretVersionDALFactory, "findLatestVersionMany" | "insertMany">;
secretVersionTagDAL: Pick<TSecretVersionTagDALFactory, "insertMany">;
projectDAL: Pick<TProjectDALFactory, "checkProjectUpgradeStatus">;
projectBotService: Pick<TProjectBotServiceFactory, "getBotKey">;
secretService: Pick<
TSecretServiceFactory,
| "fnSecretBulkInsert"
@ -80,7 +84,8 @@ export const secretApprovalRequestServiceFactory = ({
snapshotService,
secretService,
secretVersionDAL,
secretQueueService
secretQueueService,
projectBotService
}: TSecretApprovalRequestServiceFactoryDep) => {
const requestCount = async ({ projectId, actor, actorId, actorOrgId, actorAuthMethod }: TApprovalRequestCountDTO) => {
if (actor === ActorType.SERVICE) throw new BadRequestError({ message: "Cannot use service token" });
@ -352,7 +357,7 @@ export const secretApprovalRequestServiceFactory = ({
}
const secretDeletionCommits = secretApprovalSecrets.filter(({ op }) => op === CommitType.Delete);
const botKey = await projectBotService.getBotKey(projectId).catch(() => null);
const mergeStatus = await secretApprovalRequestDAL.transaction(async (tx) => {
const newSecrets = secretCreationCommits.length
? await secretService.fnSecretBulkInsert({
@ -379,7 +384,17 @@ export const secretApprovalRequestServiceFactory = ({
]),
tags: el?.tags.map(({ id }) => id),
version: 1,
type: SecretType.Shared
type: SecretType.Shared,
references: botKey
? getAllNestedSecretReferences(
decryptSymmetric128BitHexKeyUTF8({
ciphertext: el.secretValueCiphertext,
iv: el.secretValueIV,
tag: el.secretValueTag,
key: botKey
})
)
: undefined
})),
secretDAL,
secretVersionDAL,
@ -414,7 +429,17 @@ export const secretApprovalRequestServiceFactory = ({
"secretReminderNote",
"secretReminderRepeatDays",
"secretBlindIndex"
])
]),
references: botKey
? getAllNestedSecretReferences(
decryptSymmetric128BitHexKeyUTF8({
ciphertext: el.secretValueCiphertext,
iv: el.secretValueIV,
tag: el.secretValueTag,
key: botKey
})
)
: undefined
}
})),
secretDAL,

@ -90,15 +90,17 @@ export const secretScanningServiceFactory = ({
const {
data: { repositories }
} = await octokit.apps.listReposAccessibleToInstallation();
await Promise.all(
repositories.map(({ id, full_name }) =>
secretScanningQueue.startFullRepoScan({
organizationId: session.orgId,
installationId,
repository: { id, fullName: full_name }
})
)
);
if (!appCfg.DISABLE_SECRET_SCANNING) {
await Promise.all(
repositories.map(({ id, full_name }) =>
secretScanningQueue.startFullRepoScan({
organizationId: session.orgId,
installationId,
repository: { id, fullName: full_name }
})
)
);
}
return { installatedApp };
};
@ -151,6 +153,7 @@ export const secretScanningServiceFactory = ({
};
const handleRepoPushEvent = async (payload: WebhookEventMap["push"]) => {
const appCfg = getConfig();
const { commits, repository, installation, pusher } = payload;
if (!commits || !repository || !installation || !pusher) {
return;
@ -161,13 +164,15 @@ export const secretScanningServiceFactory = ({
});
if (!installationLink) return;
await secretScanningQueue.startPushEventScan({
commits,
pusher: { name: pusher.name, email: pusher.email },
repository: { fullName: repository.full_name, id: repository.id },
organizationId: installationLink.orgId,
installationId: String(installation?.id)
});
if (!appCfg.DISABLE_SECRET_SCANNING) {
await secretScanningQueue.startPushEventScan({
commits,
pusher: { name: pusher.name, email: pusher.email },
repository: { fullName: repository.full_name, id: repository.id },
organizationId: installationLink.orgId,
installationId: String(installation?.id)
});
}
};
const handleRepoDeleteEvent = async (installationId: string, repositoryIds: string[]) => {

@ -89,6 +89,21 @@ export const UNIVERSAL_AUTH = {
},
RENEW_ACCESS_TOKEN: {
accessToken: "The access token to renew."
},
REVOKE_ACCESS_TOKEN: {
accessToken: "The access token to revoke."
}
} as const;
export const AWS_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login.",
iamHttpRequestMethod: "The HTTP request method used in the signed request.",
iamRequestUrl:
"The base64-encoded HTTP URL used in the signed request. Most likely, the base64-encoding of https://sts.amazonaws.com/",
iamRequestBody:
"The base64-encoded body of the signed request. Most likely, the base64-encoding of Action=GetCallerIdentity&Version=2011-06-15.",
iamRequestHeaders: "The base64-encoded headers of the sts:GetCallerIdentity signed request."
}
} as const;
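The AWS_AUTH.LOGIN descriptions above spell out how the login payload for the new AWS Auth method is encoded. A minimal client-side sketch of assembling those fields follows; it is not part of this diff, the helper name is hypothetical, and it assumes the caller already holds SigV4-signed headers for an sts:GetCallerIdentity request (e.g. produced by an AWS SDK) and that those headers are JSON-stringified before base64 encoding.

import { Buffer } from "node:buffer";

// Hypothetical helper illustrating the encoding described by AWS_AUTH.LOGIN.
const buildAwsAuthLoginPayload = (identityId: string, signedHeaders: Record<string, string>) => ({
  identityId,
  iamHttpRequestMethod: "POST",
  // base64-encoding of the STS endpoint used in the signed request
  iamRequestUrl: Buffer.from("https://sts.amazonaws.com/").toString("base64"),
  // base64-encoding of Action=GetCallerIdentity&Version=2011-06-15
  iamRequestBody: Buffer.from("Action=GetCallerIdentity&Version=2011-06-15").toString("base64"),
  // base64-encoded headers of the signed request (JSON-stringified here by assumption)
  iamRequestHeaders: Buffer.from(JSON.stringify(signedHeaders)).toString("base64")
});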
@ -133,36 +148,6 @@ export const PROJECTS = {
name: "The new name of the project.",
autoCapitalization: "Disable or enable auto-capitalization for the project."
},
INVITE_MEMBER: {
projectId: "The ID of the project to invite the member to.",
emails: "A list of organization member emails to invite to the project.",
usernames: "A list of usernames to invite to the project."
},
REMOVE_MEMBER: {
projectId: "The ID of the project to remove the member from.",
emails: "A list of organization member emails to remove from the project.",
usernames: "A list of usernames to remove from the project."
},
GET_USER_MEMBERSHIPS: {
workspaceId: "The ID of the project to get memberships from."
},
UPDATE_USER_MEMBERSHIP: {
workspaceId: "The ID of the project to update the membership for.",
membershipId: "The ID of the membership to update.",
roles: "A list of roles to update the membership to."
},
LIST_IDENTITY_MEMBERSHIPS: {
projectId: "The ID of the project to get identity memberships from."
},
UPDATE_IDENTITY_MEMBERSHIP: {
projectId: "The ID of the project to update the identity membership for.",
identityId: "The ID of the identity to update the membership for.",
roles: "A list of roles to update the membership to."
},
DELETE_IDENTITY_MEMBERSHIP: {
projectId: "The ID of the project to delete the identity membership from.",
identityId: "The ID of the identity to delete the membership from."
},
GET_KEY: {
workspaceId: "The ID of the project to get the key from."
},
@ -201,6 +186,72 @@ export const PROJECTS = {
}
} as const;
export const PROJECT_USERS = {
INVITE_MEMBER: {
projectId: "The ID of the project to invite the member to.",
emails: "A list of organization member emails to invite to the project.",
usernames: "A list of usernames to invite to the project."
},
REMOVE_MEMBER: {
projectId: "The ID of the project to remove the member from.",
emails: "A list of organization member emails to remove from the project.",
usernames: "A list of usernames to remove from the project."
},
GET_USER_MEMBERSHIPS: {
workspaceId: "The ID of the project to get memberships from."
},
GET_USER_MEMBERSHIP: {
workspaceId: "The ID of the project to get memberships from.",
username: "The username to get project membership of. Email is the default username."
},
UPDATE_USER_MEMBERSHIP: {
workspaceId: "The ID of the project to update the membership for.",
membershipId: "The ID of the membership to update.",
roles: "A list of roles to update the membership to."
}
};
export const PROJECT_IDENTITIES = {
LIST_IDENTITY_MEMBERSHIPS: {
projectId: "The ID of the project to get identity memberships from."
},
GET_IDENTITY_MEMBERSHIP_BY_ID: {
identityId: "The ID of the identity to get the membership for.",
projectId: "The ID of the project to get the identity membership for."
},
UPDATE_IDENTITY_MEMBERSHIP: {
projectId: "The ID of the project to update the identity membership for.",
identityId: "The ID of the identity to update the membership for.",
roles: {
description: "A list of role slugs to assign to the identity project membership.",
role: "The role slug to assign to the newly created identity project membership.",
isTemporary:
"Whether the assigned role is temporary. If isTemporary is set true, must provide temporaryMode, temporaryRange and temporaryAccessStartTime.",
temporaryMode: "Type of temporary expiry.",
temporaryRange: "Expiry time for temporary access. In relative mode it could be 1s,2m,3h",
temporaryAccessStartTime: "Time to which the temporary access starts"
}
},
DELETE_IDENTITY_MEMBERSHIP: {
projectId: "The ID of the project to delete the identity membership from.",
identityId: "The ID of the identity to delete the membership from."
},
CREATE_IDENTITY_MEMBERSHIP: {
projectId: "The ID of the project to create the identity membership from.",
identityId: "The ID of the identity to create the membership from.",
role: "The role slug to assign to the newly created identity project membership.",
roles: {
description: "A list of role slugs to assign to the newly created identity project membership.",
role: "The role slug to assign to the newly created identity project membership.",
isTemporary:
"Whether the assigned role is temporary. If isTemporary is set true, must provide temporaryMode, temporaryRange and temporaryAccessStartTime.",
temporaryMode: "Type of temporary expiry.",
temporaryRange: "Expiry time for temporary access. In relative mode it could be 1s,2m,3h",
temporaryAccessStartTime: "Time to which the temporary access starts"
}
}
};
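For reference, a payload matching these role descriptions might look like the following sketch; the "relative" mode value and the example role slugs are illustrative assumptions, not taken from this file.

// Hypothetical roles array for an identity project membership request.
const exampleRoles = [
  { role: "admin", isTemporary: false },
  {
    role: "viewer",
    isTemporary: true,
    temporaryMode: "relative", // assumed mode value; see ProjectUserMembershipTemporaryMode
    temporaryRange: "2h", // relative expiry such as 1s, 2m or 3h
    temporaryAccessStartTime: new Date().toISOString()
  }
];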
export const ENVIRONMENTS = {
CREATE: {
workspaceId: "The ID of the project to create the environment in.",
@ -240,6 +291,7 @@ export const FOLDERS = {
name: "The new name of the folder.",
path: "The path of the folder to update.",
directory: "The new directory of the folder to update. (Deprecated in favor of path)",
projectSlug: "The slug of the project where the folder is located.",
workspaceId: "The ID of the project where the folder is located."
},
DELETE: {
@ -276,7 +328,8 @@ export const RAW_SECRETS = {
recursive:
"Whether or not to fetch all secrets from the specified base path, and all of its subdirectories. Note, the max depth is 20 deep.",
workspaceId: "The ID of the project to list secrets from.",
workspaceSlug: "The slug of the project to list secrets from. This parameter is only usable by machine identities.",
workspaceSlug:
"The slug of the project to list secrets from. This parameter is only applicable by machine identities.",
environment: "The slug of the environment to list secrets from.",
secretPath: "The secret path to list secrets from.",
includeImports: "Weather to include imported secrets or not."
@ -295,6 +348,7 @@ export const RAW_SECRETS = {
GET: {
secretName: "The name of the secret to get.",
workspaceId: "The ID of the project to get the secret from.",
workspaceSlug: "The slug of the project to get the secret from.",
environment: "The slug of the environment to get the secret from.",
secretPath: "The path of the secret to get.",
version: "The version of the secret to get.",
@ -613,7 +667,8 @@ export const INTEGRATION = {
shouldAutoRedeploy: "Used by Render to trigger auto deploy.",
secretGCPLabel: "The label for GCP secrets.",
secretAWSTag: "The tags for AWS secrets.",
kmsKeyId: "The ID of the encryption key from AWS KMS."
kmsKeyId: "The ID of the encryption key from AWS KMS.",
shouldDisableDelete: "The flag to disable deletion of secrets in AWS Parameter Store."
}
},
UPDATE: {
@ -630,6 +685,9 @@ export const INTEGRATION = {
},
DELETE: {
integrationId: "The ID of the integration object."
},
SYNC: {
integrationId: "The ID of the integration object to manually sync"
}
};

@ -13,6 +13,10 @@ const zodStrBool = z
const envSchema = z
.object({
PORT: z.coerce.number().default(4000),
DISABLE_SECRET_SCANNING: z
.enum(["true", "false"])
.default("false")
.transform((el) => el === "true"),
REDIS_URL: zpStr(z.string()),
HOST: zpStr(z.string().default("localhost")),
DB_CONNECTION_URI: zpStr(z.string().describe("Postgres database connection string")).default(

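A standalone sketch of how this zod flag behaves; the schema below is trimmed to the new field only.

import { z } from "zod";

const flagSchema = z.object({
  DISABLE_SECRET_SCANNING: z
    .enum(["true", "false"])
    .default("false")
    .transform((el) => el === "true")
});

// The string "true" is coerced to the boolean true.
console.log(flagSchema.parse({ DISABLE_SECRET_SCANNING: "true" })); // { DISABLE_SECRET_SCANNING: true }
// A missing value falls back to the "false" default and becomes the boolean false.
console.log(flagSchema.parse({})); // { DISABLE_SECRET_SCANNING: false }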
@ -30,6 +30,37 @@ const loggerConfig = z.object({
NODE_ENV: z.enum(["development", "test", "production"]).default("production")
});
const redactedKeys = [
"accessToken",
"authToken",
"serviceToken",
"identityAccessToken",
"token",
"privateKey",
"serverPrivateKey",
"plainPrivateKey",
"plainProjectKey",
"encryptedPrivateKey",
"userPrivateKey",
"protectedKey",
"decryptKey",
"encryptedProjectKey",
"encryptedSymmetricKey",
"encryptedPrivateKey",
"backupPrivateKey",
"secretKey",
"SecretKey",
"botPrivateKey",
"encryptedKey",
"plaintextProjectKey",
"accessKey",
"botKey",
"decryptedSecret",
"secrets",
"key",
"password"
];
export const initLogger = async () => {
const cfg = loggerConfig.parse(process.env);
const targets: pino.TransportMultiOptions["targets"][number][] = [
@ -74,7 +105,9 @@ export const initLogger = async () => {
hostname: bindings.hostname
// node_version: process.version
})
}
},
// redact up to a depth of three
redact: [...redactedKeys, ...redactedKeys.map((key) => `*.${key}`), ...redactedKeys.map((key) => `*.*.${key}`)]
},
// eslint-disable-next-line @typescript-eslint/no-unsafe-argument
transport

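A minimal sketch of how pino applies these redaction paths; the keys and log shape below are illustrative, not taken from this file.

import pino from "pino";

const keys = ["accessToken", "password"];
const logger = pino({
  // top-level keys plus wildcard paths one and two levels deep
  redact: [...keys, ...keys.map((k) => `*.${k}`), ...keys.map((k) => `*.*.${k}`)]
});

// accessToken and user.password are replaced with "[Redacted]" in the output.
logger.info({ accessToken: "abc123", user: { password: "hunter2" } }, "login attempt");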
@ -65,7 +65,13 @@ export type TQueueJobTypes = {
};
[QueueName.IntegrationSync]: {
name: QueueJobs.IntegrationSync;
payload: { projectId: string; environment: string; secretPath: string; depth?: number };
payload: {
projectId: string;
environment: string;
secretPath: string;
depth?: number;
deDupeQueue?: Record<string, boolean>;
};
};
[QueueName.SecretFullRepoScan]: {
name: QueueJobs.SecretScan;

@ -8,8 +8,6 @@ import cors from "@fastify/cors";
import fastifyEtag from "@fastify/etag";
import fastifyFormBody from "@fastify/formbody";
import helmet from "@fastify/helmet";
import type { FastifyRateLimitOptions } from "@fastify/rate-limit";
import ratelimiter from "@fastify/rate-limit";
import fasitfy from "fastify";
import { Knex } from "knex";
import { Logger } from "pino";
@ -19,7 +17,6 @@ import { getConfig } from "@app/lib/config/env";
import { TQueueServiceFactory } from "@app/queue";
import { TSmtpService } from "@app/services/smtp/smtp-service";
import { globalRateLimiterCfg } from "./config/rateLimiter";
import { fastifyErrHandler } from "./plugins/error-handler";
import { registerExternalNextjs } from "./plugins/external-nextjs";
import { serializerCompiler, validatorCompiler, ZodTypeProvider } from "./plugins/fastify-zod";
@ -67,10 +64,6 @@ export const main = async ({ db, smtp, logger, queue, keyStore }: TMain) => {
await server.register(fastifyFormBody);
await server.register(fastifyErrHandler);
// Rate limiters and security headers
if (appCfg.isProductionMode) {
await server.register<FastifyRateLimitOptions>(ratelimiter, globalRateLimiterCfg());
}
await server.register(helmet, { contentSecurityPolicy: false });
await server.register(maintenanceMode);

@ -1,20 +1,34 @@
import type { RateLimitOptions, RateLimitPluginOptions } from "@fastify/rate-limit";
import { FastifyRequest } from "fastify";
import { Redis } from "ioredis";
import { getConfig } from "@app/lib/config/env";
import { ActorType } from "@app/services/auth/auth-type";
const getDistinctRequestActorId = (req: FastifyRequest) => {
if (req?.auth?.actor === ActorType.USER) {
return req.auth.user.username;
}
if (req?.auth?.actor === ActorType.IDENTITY) {
return `${req.auth.identityId}-machine-identity-`;
}
if (req?.auth?.actor === ActorType.SERVICE) {
return `${req.auth.serviceToken.id}-service-token`; // when user gets removed from system
}
return req.realIp;
};
export const globalRateLimiterCfg = (): RateLimitPluginOptions => {
const appCfg = getConfig();
const redis = appCfg.isRedisConfigured
? new Redis(appCfg.REDIS_URL, { connectTimeout: 500, maxRetriesPerRequest: 1 })
: null;
return {
timeWindow: 60 * 1000,
max: 600,
redis,
allowList: (req) => req.url === "/healthcheck" || req.url === "/api/status",
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
};
@ -22,14 +36,14 @@ export const globalRateLimiterCfg = (): RateLimitPluginOptions => {
export const readLimit: RateLimitOptions = {
timeWindow: 60 * 1000,
max: 600,
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
// POST, PATCH, PUT, DELETE endpoints
export const writeLimit: RateLimitOptions = {
timeWindow: 60 * 1000,
max: 50,
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
// special endpoints
@ -37,24 +51,24 @@ export const secretsLimit: RateLimitOptions = {
// secrets, folders, secret imports
timeWindow: 60 * 1000,
max: 1000,
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
export const authRateLimit: RateLimitOptions = {
timeWindow: 60 * 1000,
max: 60,
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
export const inviteUserRateLimit: RateLimitOptions = {
timeWindow: 60 * 1000,
max: 30,
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
export const creationLimit: RateLimitOptions = {
// identity, project, org
timeWindow: 60 * 1000,
max: 30,
keyGenerator: (req) => req.realIp
keyGenerator: (req) => getDistinctRequestActorId(req)
};
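A minimal wiring sketch for these options with @fastify/rate-limit, mirroring the registration that now happens in registerRoutes; the standalone server below is illustrative only.

import Fastify from "fastify";
import ratelimiter, { FastifyRateLimitOptions } from "@fastify/rate-limit";

import { globalRateLimiterCfg, writeLimit } from "@app/server/config/rateLimiter";

const main = async () => {
  const server = Fastify();

  // Global limiter keyed by getDistinctRequestActorId: users, machine identities and
  // service tokens each get their own bucket instead of sharing one per IP.
  await server.register<FastifyRateLimitOptions>(ratelimiter, globalRateLimiterCfg());

  // Per-route override, e.g. a write endpoint capped at 50 requests per minute.
  server.post("/example", { config: { rateLimit: writeLimit } }, async () => ({ ok: true }));

  await server.listen({ port: 4000 });
};

void main();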

@ -5,8 +5,13 @@ import { getConfig } from "@app/lib/config/env";
export const maintenanceMode = fp(async (fastify) => {
fastify.addHook("onRequest", async (req) => {
const serverEnvs = getConfig();
if (req.url !== "/api/v1/auth/checkAuth" && req.method !== "GET" && serverEnvs.MAINTENANCE_MODE) {
throw new Error("Infisical is in maintenance mode. Please try again later.");
if (serverEnvs.MAINTENANCE_MODE) {
// skip if it's a universal auth login or token renew
if (req.url === "/api/v1/auth/universal-auth/login" && req.method === "POST") return;
if (req.url === "/api/v1/auth/token/renew" && req.method === "POST") return;
if (req.url !== "/api/v1/auth/checkAuth" && req.method !== "GET") {
throw new Error("Infisical is in maintenance mode. Please try again later.");
}
}
});
});

@ -1,3 +1,4 @@
import ratelimiter, { FastifyRateLimitOptions } from "@fastify/rate-limit";
import { Knex } from "knex";
import { z } from "zod";
@ -61,7 +62,7 @@ import { trustedIpServiceFactory } from "@app/ee/services/trusted-ip/trusted-ip-
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { TQueueServiceFactory } from "@app/queue";
import { readLimit } from "@app/server/config/rateLimiter";
import { globalRateLimiterCfg, readLimit } from "@app/server/config/rateLimiter";
import { apiKeyDALFactory } from "@app/services/api-key/api-key-dal";
import { apiKeyServiceFactory } from "@app/services/api-key/api-key-service";
import { authDALFactory } from "@app/services/auth/auth-dal";
@ -78,6 +79,12 @@ import { identityOrgDALFactory } from "@app/services/identity/identity-org-dal";
import { identityServiceFactory } from "@app/services/identity/identity-service";
import { identityAccessTokenDALFactory } from "@app/services/identity-access-token/identity-access-token-dal";
import { identityAccessTokenServiceFactory } from "@app/services/identity-access-token/identity-access-token-service";
import { identityAwsAuthDALFactory } from "@app/services/identity-aws-auth/identity-aws-auth-dal";
import { identityAwsAuthServiceFactory } from "@app/services/identity-aws-auth/identity-aws-auth-service";
import { identityGcpAuthDALFactory } from "@app/services/identity-gcp-auth/identity-gcp-auth-dal";
import { identityGcpAuthServiceFactory } from "@app/services/identity-gcp-auth/identity-gcp-auth-service";
import { identityKubernetesAuthDALFactory } from "@app/services/identity-kubernetes-auth/identity-kubernetes-auth-dal";
import { identityKubernetesAuthServiceFactory } from "@app/services/identity-kubernetes-auth/identity-kubernetes-auth-service";
import { identityProjectDALFactory } from "@app/services/identity-project/identity-project-dal";
import { identityProjectMembershipRoleDALFactory } from "@app/services/identity-project/identity-project-membership-role-dal";
import { identityProjectServiceFactory } from "@app/services/identity-project/identity-project-service";
@ -154,7 +161,10 @@ export const registerRoutes = async (
keyStore
}: { db: Knex; smtp: TSmtpService; queue: TQueueServiceFactory; keyStore: TKeyStoreFactory }
) => {
await server.register(registerSecretScannerGhApp, { prefix: "/ss-webhook" });
const appCfg = getConfig();
if (!appCfg.DISABLE_SECRET_SCANNING) {
await server.register(registerSecretScannerGhApp, { prefix: "/ss-webhook" });
}
// db layers
const userDAL = userDALFactory(db);
@ -200,7 +210,11 @@ export const registerRoutes = async (
const identityProjectAdditionalPrivilegeDAL = identityProjectAdditionalPrivilegeDALFactory(db);
const identityUaDAL = identityUaDALFactory(db);
const identityKubernetesAuthDAL = identityKubernetesAuthDALFactory(db);
const identityUaClientSecretDAL = identityUaClientSecretDALFactory(db);
const identityAwsAuthDAL = identityAwsAuthDALFactory(db);
const identityGcpAuthDAL = identityGcpAuthDALFactory(db);
const auditLogDAL = auditLogDALFactory(db);
const auditLogStreamDAL = auditLogStreamDALFactory(db);
@ -535,8 +549,10 @@ export const registerRoutes = async (
folderDAL,
folderVersionDAL,
projectEnvDAL,
snapshotService
snapshotService,
projectDAL
});
const integrationAuthService = integrationAuthServiceFactory({
integrationAuthDAL,
integrationDAL,
@ -595,6 +611,7 @@ export const registerRoutes = async (
});
const sarService = secretApprovalRequestServiceFactory({
permissionService,
projectBotService,
folderDAL,
secretDAL,
secretTagDAL,
@ -699,6 +716,32 @@ export const registerRoutes = async (
identityUaDAL,
licenseService
});
const identityKubernetesAuthService = identityKubernetesAuthServiceFactory({
identityKubernetesAuthDAL,
identityOrgMembershipDAL,
identityAccessTokenDAL,
identityDAL,
orgBotDAL,
permissionService,
licenseService
});
const identityGcpAuthService = identityGcpAuthServiceFactory({
identityGcpAuthDAL,
identityOrgMembershipDAL,
identityAccessTokenDAL,
identityDAL,
permissionService,
licenseService
});
const identityAwsAuthService = identityAwsAuthServiceFactory({
identityAccessTokenDAL,
identityAwsAuthDAL,
identityOrgMembershipDAL,
identityDAL,
licenseService,
permissionService
});
const dynamicSecretProviders = buildDynamicSecretProviders();
const dynamicSecretQueueService = dynamicSecretLeaseQueueServiceFactory({
@ -768,6 +811,9 @@ export const registerRoutes = async (
identityAccessToken: identityAccessTokenService,
identityProject: identityProjectService,
identityUa: identityUaService,
identityKubernetesAuth: identityKubernetesAuthService,
identityGcpAuth: identityGcpAuthService,
identityAwsAuth: identityAwsAuthService,
secretApprovalPolicy: sapService,
accessApprovalPolicy: accessApprovalPolicyService,
accessApprovalRequest: accessApprovalRequestService,
@ -794,6 +840,11 @@ export const registerRoutes = async (
user: userDAL
});
// Rate limiters and security headers
if (appCfg.isProductionMode) {
await server.register<FastifyRateLimitOptions>(ratelimiter, globalRateLimiterCfg());
}
await server.register(injectIdentity, { userDAL, serviceTokenDAL });
await server.register(injectPermission);
await server.register(injectAuditLogInfo);

@ -8,6 +8,7 @@ import {
UsersSchema
} from "@app/db/schemas";
import { UnpackedPermissionSchema } from "@app/ee/services/identity-project-additional-privilege/identity-project-additional-privilege-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
// sometimes the return data must be sanitized to avoid leaking important values
// always prefer pick over omit in zod
@ -64,14 +65,12 @@ export const secretRawSchema = z.object({
secretComment: z.string().optional()
});
export const PermissionSchema = z.object({
export const ProjectPermissionSchema = z.object({
action: z
.string()
.min(1)
.nativeEnum(ProjectPermissionActions)
.describe("Describe what action an entity can take. Possible actions: create, edit, delete, and read"),
subject: z
.string()
.min(1)
.nativeEnum(ProjectPermissionSub)
.describe("The entity this permission pertains to. Possible options: secrets, environments"),
conditions: z
.object({

@ -20,16 +20,23 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
schema: {
response: {
200: z.object({
config: SuperAdminSchema.omit({ createdAt: true, updatedAt: true }).merge(
z.object({ isMigrationModeOn: z.boolean() })
)
config: SuperAdminSchema.omit({ createdAt: true, updatedAt: true }).extend({
isMigrationModeOn: z.boolean(),
isSecretScanningDisabled: z.boolean()
})
})
}
},
handler: async () => {
const config = await getServerCfg();
const serverEnvs = getConfig();
return { config: { ...config, isMigrationModeOn: serverEnvs.MAINTENANCE_MODE } };
return {
config: {
...config,
isMigrationModeOn: serverEnvs.MAINTENANCE_MODE,
isSecretScanningDisabled: serverEnvs.DISABLE_SECRET_SCANNING
}
};
}
});

@ -36,4 +36,29 @@ export const registerIdentityAccessTokenRouter = async (server: FastifyZodProvid
};
}
});
server.route({
url: "/token/revoke",
method: "POST",
config: {
rateLimit: writeLimit
},
schema: {
description: "Revoke access token",
body: z.object({
accessToken: z.string().trim().describe(UNIVERSAL_AUTH.REVOKE_ACCESS_TOKEN.accessToken)
}),
response: {
200: z.object({
message: z.string()
})
}
},
handler: async (req) => {
await server.services.identityAccessToken.revokeAccessToken(req.body.accessToken);
return {
message: "Successfully revoked access token"
};
}
});
};
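A usage sketch for the new revoke endpoint, assuming a runtime with a global fetch (Node 18+), that this router is mounted under /api/v1/auth, and a placeholder host.

const revokeToken = async (accessToken: string) => {
  const res = await fetch("https://app.infisical.com/api/v1/auth/token/revoke", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ accessToken })
  });
  const { message } = await res.json();
  return message; // "Successfully revoked access token"
};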

@ -0,0 +1,269 @@
import { z } from "zod";
import { IdentityAwsAuthsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { AWS_AUTH } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { TIdentityTrustedIp } from "@app/services/identity/identity-types";
import {
validateAccountIds,
validatePrincipalArns
} from "@app/services/identity-aws-auth/identity-aws-auth-validators";
export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider) => {
server.route({
method: "POST",
url: "/aws-auth/login",
config: {
rateLimit: writeLimit
},
schema: {
description: "Login with AWS Auth",
body: z.object({
identityId: z.string().describe(AWS_AUTH.LOGIN.identityId),
iamHttpRequestMethod: z.string().default("POST").describe(AWS_AUTH.LOGIN.iamHttpRequestMethod),
iamRequestBody: z.string().describe(AWS_AUTH.LOGIN.iamRequestBody),
iamRequestHeaders: z.string().describe(AWS_AUTH.LOGIN.iamRequestHeaders)
}),
response: {
200: z.object({
accessToken: z.string(),
expiresIn: z.coerce.number(),
accessTokenMaxTTL: z.coerce.number(),
tokenType: z.literal("Bearer")
})
}
},
handler: async (req) => {
const { identityAwsAuth, accessToken, identityAccessToken, identityMembershipOrg } =
await server.services.identityAwsAuth.login(req.body);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityMembershipOrg?.orgId,
event: {
type: EventType.LOGIN_IDENTITY_AWS_AUTH,
metadata: {
identityId: identityAwsAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
identityAwsAuthId: identityAwsAuth.id
}
}
});
return {
accessToken,
tokenType: "Bearer" as const,
expiresIn: identityAwsAuth.accessTokenTTL,
accessTokenMaxTTL: identityAwsAuth.accessTokenMaxTTL
};
}
});
server.route({
method: "POST",
url: "/aws-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Attach AWS Auth configuration onto identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim()
}),
body: z.object({
stsEndpoint: z.string().trim().min(1).default("https://sts.amazonaws.com/"),
allowedPrincipalArns: validatePrincipalArns,
allowedAccountIds: validateAccountIds,
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
accessTokenTTL: z
.number()
.int()
.min(1)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
accessTokenMaxTTL: z
.number()
.int()
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
}),
response: {
200: z.object({
identityAwsAuth: IdentityAwsAuthsSchema
})
}
},
handler: async (req) => {
const identityAwsAuth = await server.services.identityAwsAuth.attachAwsAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityAwsAuth.orgId,
event: {
type: EventType.ADD_IDENTITY_AWS_AUTH,
metadata: {
identityId: identityAwsAuth.identityId,
stsEndpoint: identityAwsAuth.stsEndpoint,
allowedPrincipalArns: identityAwsAuth.allowedPrincipalArns,
allowedAccountIds: identityAwsAuth.allowedAccountIds,
accessTokenTTL: identityAwsAuth.accessTokenTTL,
accessTokenMaxTTL: identityAwsAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityAwsAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityAwsAuth.accessTokenNumUsesLimit
}
}
});
return { identityAwsAuth };
}
});
server.route({
method: "PATCH",
url: "/aws-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Update AWS Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
}),
body: z.object({
stsEndpoint: z.string().trim().min(1).optional(),
allowedPrincipalArns: validatePrincipalArns,
allowedAccountIds: validateAccountIds,
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
accessTokenMaxTTL: z
.number()
.int()
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
}),
response: {
200: z.object({
identityAwsAuth: IdentityAwsAuthsSchema
})
}
},
handler: async (req) => {
const identityAwsAuth = await server.services.identityAwsAuth.updateAwsAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityAwsAuth.orgId,
event: {
type: EventType.UPDATE_IDENTITY_AWS_AUTH,
metadata: {
identityId: identityAwsAuth.identityId,
stsEndpoint: identityAwsAuth.stsEndpoint,
allowedPrincipalArns: identityAwsAuth.allowedPrincipalArns,
allowedAccountIds: identityAwsAuth.allowedAccountIds,
accessTokenTTL: identityAwsAuth.accessTokenTTL,
accessTokenMaxTTL: identityAwsAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityAwsAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityAwsAuth.accessTokenNumUsesLimit
}
}
});
return { identityAwsAuth };
}
});
server.route({
method: "GET",
url: "/aws-auth/identities/:identityId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Retrieve AWS Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
}),
response: {
200: z.object({
identityAwsAuth: IdentityAwsAuthsSchema
})
}
},
handler: async (req) => {
const identityAwsAuth = await server.services.identityAwsAuth.getAwsAuth({
identityId: req.params.identityId,
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityAwsAuth.orgId,
event: {
type: EventType.GET_IDENTITY_AWS_AUTH,
metadata: {
identityId: identityAwsAuth.identityId
}
}
});
return { identityAwsAuth };
}
});
};
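A hedged sketch of the AWS Auth login call. The /api/v1/auth prefix is assumed from how this router is registered, and the signed sts:GetCallerIdentity request components are placeholders a caller would produce with their AWS credentials; the exact encoding expected is not defined in this file.

const awsAuthLogin = async (identityId: string) => {
  const res = await fetch("https://app.infisical.com/api/v1/auth/aws-auth/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      identityId,
      iamHttpRequestMethod: "POST",
      iamRequestBody: "<encoded sts:GetCallerIdentity request body>", // placeholder
      iamRequestHeaders: "<encoded signed request headers>" // placeholder
    })
  });
  const { accessToken, expiresIn, accessTokenMaxTTL, tokenType } = await res.json();
  return { accessToken, expiresIn, accessTokenMaxTTL, tokenType };
};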

@ -0,0 +1,268 @@
import { z } from "zod";
import { IdentityGcpAuthsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { TIdentityTrustedIp } from "@app/services/identity/identity-types";
import { validateGcpAuthField } from "@app/services/identity-gcp-auth/identity-gcp-auth-validators";
export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider) => {
server.route({
method: "POST",
url: "/gcp-auth/login",
config: {
rateLimit: writeLimit
},
schema: {
description: "Login with GCP Auth",
body: z.object({
identityId: z.string(),
jwt: z.string()
}),
response: {
200: z.object({
accessToken: z.string(),
expiresIn: z.coerce.number(),
accessTokenMaxTTL: z.coerce.number(),
tokenType: z.literal("Bearer")
})
}
},
handler: async (req) => {
const { identityGcpAuth, accessToken, identityAccessToken, identityMembershipOrg } =
await server.services.identityGcpAuth.login(req.body);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityMembershipOrg?.orgId,
event: {
type: EventType.LOGIN_IDENTITY_GCP_AUTH,
metadata: {
identityId: identityGcpAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
identityGcpAuthId: identityGcpAuth.id
}
}
});
return {
accessToken,
tokenType: "Bearer" as const,
expiresIn: identityGcpAuth.accessTokenTTL,
accessTokenMaxTTL: identityGcpAuth.accessTokenMaxTTL
};
}
});
server.route({
method: "POST",
url: "/gcp-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Attach GCP Auth configuration onto identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim()
}),
body: z.object({
type: z.enum(["iam", "gce"]),
allowedServiceAccounts: validateGcpAuthField,
allowedProjects: validateGcpAuthField,
allowedZones: validateGcpAuthField,
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
accessTokenTTL: z
.number()
.int()
.min(1)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
accessTokenMaxTTL: z
.number()
.int()
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
}),
response: {
200: z.object({
identityGcpAuth: IdentityGcpAuthsSchema
})
}
},
handler: async (req) => {
const identityGcpAuth = await server.services.identityGcpAuth.attachGcpAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityGcpAuth.orgId,
event: {
type: EventType.ADD_IDENTITY_GCP_AUTH,
metadata: {
identityId: identityGcpAuth.identityId,
type: identityGcpAuth.type,
allowedServiceAccounts: identityGcpAuth.allowedServiceAccounts,
allowedProjects: identityGcpAuth.allowedProjects,
allowedZones: identityGcpAuth.allowedZones,
accessTokenTTL: identityGcpAuth.accessTokenTTL,
accessTokenMaxTTL: identityGcpAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityGcpAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityGcpAuth.accessTokenNumUsesLimit
}
}
});
return { identityGcpAuth };
}
});
server.route({
method: "PATCH",
url: "/gcp-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Update GCP Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim()
}),
body: z.object({
type: z.enum(["iam", "gce"]).optional(),
allowedServiceAccounts: validateGcpAuthField,
allowedProjects: validateGcpAuthField,
allowedZones: validateGcpAuthField,
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
accessTokenMaxTTL: z
.number()
.int()
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
}),
response: {
200: z.object({
identityGcpAuth: IdentityGcpAuthsSchema
})
}
},
handler: async (req) => {
const identityGcpAuth = await server.services.identityGcpAuth.updateGcpAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityGcpAuth.orgId,
event: {
type: EventType.UPDATE_IDENTITY_GCP_AUTH,
metadata: {
identityId: identityGcpAuth.identityId,
type: identityGcpAuth.type,
allowedServiceAccounts: identityGcpAuth.allowedServiceAccounts,
allowedProjects: identityGcpAuth.allowedProjects,
allowedZones: identityGcpAuth.allowedZones,
accessTokenTTL: identityGcpAuth.accessTokenTTL,
accessTokenMaxTTL: identityGcpAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityGcpAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityGcpAuth.accessTokenNumUsesLimit
}
}
});
return { identityGcpAuth };
}
});
server.route({
method: "GET",
url: "/gcp-auth/identities/:identityId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Retrieve GCP Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
}),
response: {
200: z.object({
identityGcpAuth: IdentityGcpAuthsSchema
})
}
},
handler: async (req) => {
const identityGcpAuth = await server.services.identityGcpAuth.getGcpAuth({
identityId: req.params.identityId,
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityGcpAuth.orgId,
event: {
type: EventType.GET_IDENTITY_GCP_AUTH,
metadata: {
identityId: identityGcpAuth.identityId
}
}
});
return { identityGcpAuth };
}
});
};
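A hedged sketch of GCP Auth login from a GCE instance. The metadata-server identity endpoint is standard GCP; the audience value and the /api/v1/auth prefix are assumptions.

const gcpAuthLogin = async (identityId: string) => {
  // Fetch a signed identity token for this instance from the GCE metadata server.
  // The audience value expected by the backend is an assumption here.
  const jwt = await fetch(
    `http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=${identityId}&format=full`,
    { headers: { "Metadata-Flavor": "Google" } }
  ).then((r) => r.text());

  const res = await fetch("https://app.infisical.com/api/v1/auth/gcp-auth/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ identityId, jwt })
  });
  return res.json(); // { accessToken, expiresIn, accessTokenMaxTTL, tokenType: "Bearer" }
};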

@ -0,0 +1,283 @@
import { z } from "zod";
import { IdentityKubernetesAuthsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { TIdentityTrustedIp } from "@app/services/identity/identity-types";
const IdentityKubernetesAuthResponseSchema = IdentityKubernetesAuthsSchema.omit({
encryptedCaCert: true,
caCertIV: true,
caCertTag: true,
encryptedTokenReviewerJwt: true,
tokenReviewerJwtIV: true,
tokenReviewerJwtTag: true
}).extend({
caCert: z.string(),
tokenReviewerJwt: z.string()
});
export const registerIdentityKubernetesRouter = async (server: FastifyZodProvider) => {
server.route({
method: "POST",
url: "/kubernetes-auth/login",
config: {
rateLimit: writeLimit
},
schema: {
description: "Login with Kubernetes Auth",
body: z.object({
identityId: z.string().trim(),
jwt: z.string().trim()
}),
response: {
200: z.object({
accessToken: z.string(),
expiresIn: z.coerce.number(),
accessTokenMaxTTL: z.coerce.number(),
tokenType: z.literal("Bearer")
})
}
},
handler: async (req) => {
const { identityKubernetesAuth, accessToken, identityAccessToken, identityMembershipOrg } =
await server.services.identityKubernetesAuth.login({
identityId: req.body.identityId,
jwt: req.body.jwt
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityMembershipOrg?.orgId,
event: {
type: EventType.LOGIN_IDENTITY_KUBERNETES_AUTH,
metadata: {
identityId: identityKubernetesAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
identityKubernetesAuthId: identityKubernetesAuth.id
}
}
});
return {
accessToken,
tokenType: "Bearer" as const,
expiresIn: identityKubernetesAuth.accessTokenTTL,
accessTokenMaxTTL: identityKubernetesAuth.accessTokenMaxTTL
};
}
});
server.route({
method: "POST",
url: "/kubernetes-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Attach Kubernetes Auth configuration onto identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim()
}),
body: z.object({
kubernetesHost: z.string().trim().min(1),
caCert: z.string().trim().default(""),
tokenReviewerJwt: z.string().trim().min(1),
allowedNamespaces: z.string(), // TODO: validation
allowedNames: z.string(),
allowedAudience: z.string(),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
accessTokenTTL: z
.number()
.int()
.min(1)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
accessTokenMaxTTL: z
.number()
.int()
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
}),
response: {
200: z.object({
identityKubernetesAuth: IdentityKubernetesAuthResponseSchema
})
}
},
handler: async (req) => {
const identityKubernetesAuth = await server.services.identityKubernetesAuth.attachKubernetesAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityKubernetesAuth.orgId,
event: {
type: EventType.ADD_IDENTITY_KUBERNETES_AUTH,
metadata: {
identityId: identityKubernetesAuth.identityId,
kubernetesHost: identityKubernetesAuth.kubernetesHost,
allowedNamespaces: identityKubernetesAuth.allowedNamespaces,
allowedNames: identityKubernetesAuth.allowedNames,
accessTokenTTL: identityKubernetesAuth.accessTokenTTL,
accessTokenMaxTTL: identityKubernetesAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityKubernetesAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityKubernetesAuth.accessTokenNumUsesLimit
}
}
});
return { identityKubernetesAuth: IdentityKubernetesAuthResponseSchema.parse(identityKubernetesAuth) };
}
});
server.route({
method: "PATCH",
url: "/kubernetes-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Update Kubernetes Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
}),
body: z.object({
kubernetesHost: z.string().trim().min(1).optional(),
caCert: z.string().trim().optional(),
tokenReviewerJwt: z.string().trim().min(1).optional(),
allowedNamespaces: z.string().optional(), // TODO: validation
allowedNames: z.string().optional(),
allowedAudience: z.string().optional(),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
accessTokenMaxTTL: z
.number()
.int()
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
}),
response: {
200: z.object({
identityKubernetesAuth: IdentityKubernetesAuthsSchema
})
}
},
handler: async (req) => {
const identityKubernetesAuth = await server.services.identityKubernetesAuth.updateKubernetesAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityKubernetesAuth.orgId,
event: {
type: EventType.UPDATE_IDENTITY_KUBENETES_AUTH,
metadata: {
identityId: identityKubernetesAuth.identityId,
kubernetesHost: identityKubernetesAuth.kubernetesHost,
allowedNamespaces: identityKubernetesAuth.allowedNamespaces,
allowedNames: identityKubernetesAuth.allowedNames,
accessTokenTTL: identityKubernetesAuth.accessTokenTTL,
accessTokenMaxTTL: identityKubernetesAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityKubernetesAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityKubernetesAuth.accessTokenNumUsesLimit
}
}
});
return { identityKubernetesAuth };
}
});
server.route({
method: "GET",
url: "/kubernetes-auth/identities/:identityId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Retrieve Kubernetes Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
}),
response: {
200: z.object({
identityKubernetesAuth: IdentityKubernetesAuthResponseSchema
})
}
},
handler: async (req) => {
const identityKubernetesAuth = await server.services.identityKubernetesAuth.getKubernetesAuth({
identityId: req.params.identityId,
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityKubernetesAuth.orgId,
event: {
type: EventType.GET_IDENTITY_KUBERNETES_AUTH,
metadata: {
identityId: identityKubernetesAuth.identityId
}
}
});
return { identityKubernetesAuth: IdentityKubernetesAuthResponseSchema.parse(identityKubernetesAuth) };
}
});
};
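A hedged sketch of Kubernetes Auth login from inside a pod; the token path is the standard service account mount, and the /api/v1/auth prefix is assumed.

import { readFileSync } from "fs";

const kubernetesAuthLogin = async (identityId: string) => {
  // Inside a pod, the (projected) service account token lives at this well-known path.
  const jwt = readFileSync("/var/run/secrets/kubernetes.io/serviceaccount/token", "utf8");

  const res = await fetch("https://app.infisical.com/api/v1/auth/kubernetes-auth/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ identityId, jwt })
  });
  return res.json(); // { accessToken, expiresIn, accessTokenMaxTTL, tokenType: "Bearer" }
};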

@ -2,6 +2,9 @@ import { registerAdminRouter } from "./admin-router";
import { registerAuthRoutes } from "./auth-router";
import { registerProjectBotRouter } from "./bot-router";
import { registerIdentityAccessTokenRouter } from "./identity-access-token-router";
import { registerIdentityAwsAuthRouter } from "./identity-aws-iam-auth-router";
import { registerIdentityGcpAuthRouter } from "./identity-gcp-auth-router";
import { registerIdentityKubernetesRouter } from "./identity-kubernetes-auth-router";
import { registerIdentityRouter } from "./identity-router";
import { registerIdentityUaRouter } from "./identity-ua";
import { registerIntegrationAuthRouter } from "./integration-auth-router";
@ -27,7 +30,10 @@ export const registerV1Routes = async (server: FastifyZodProvider) => {
async (authRouter) => {
await authRouter.register(registerAuthRoutes);
await authRouter.register(registerIdentityUaRouter);
await authRouter.register(registerIdentityKubernetesRouter);
await authRouter.register(registerIdentityGcpAuthRouter);
await authRouter.register(registerIdentityAccessTokenRouter);
await authRouter.register(registerIdentityAwsAuthRouter);
},
{ prefix: "/auth" }
);

@ -66,7 +66,8 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
)
.optional()
.describe(INTEGRATION.CREATE.metadata.secretAWSTag),
kmsKeyId: z.string().optional().describe(INTEGRATION.CREATE.metadata.kmsKeyId)
kmsKeyId: z.string().optional().describe(INTEGRATION.CREATE.metadata.kmsKeyId),
shouldDisableDelete: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldDisableDelete)
})
.default({})
}),
@ -142,8 +143,8 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
integrationId: z.string().trim().describe(INTEGRATION.UPDATE.integrationId)
}),
body: z.object({
app: z.string().trim().describe(INTEGRATION.UPDATE.app),
appId: z.string().trim().describe(INTEGRATION.UPDATE.appId),
app: z.string().trim().optional().describe(INTEGRATION.UPDATE.app),
appId: z.string().trim().optional().describe(INTEGRATION.UPDATE.appId),
isActive: z.boolean().describe(INTEGRATION.UPDATE.isActive),
secretPath: z
.string()
@ -153,7 +154,33 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
.describe(INTEGRATION.UPDATE.secretPath),
targetEnvironment: z.string().trim().describe(INTEGRATION.UPDATE.targetEnvironment),
owner: z.string().trim().describe(INTEGRATION.UPDATE.owner),
environment: z.string().trim().describe(INTEGRATION.UPDATE.environment)
environment: z.string().trim().describe(INTEGRATION.UPDATE.environment),
metadata: z
.object({
secretPrefix: z.string().optional().describe(INTEGRATION.CREATE.metadata.secretPrefix),
secretSuffix: z.string().optional().describe(INTEGRATION.CREATE.metadata.secretSuffix),
initialSyncBehavior: z.string().optional().describe(INTEGRATION.CREATE.metadata.initialSyncBehavoir),
shouldAutoRedeploy: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldAutoRedeploy),
secretGCPLabel: z
.object({
labelName: z.string(),
labelValue: z.string()
})
.optional()
.describe(INTEGRATION.CREATE.metadata.secretGCPLabel),
secretAWSTag: z
.array(
z.object({
key: z.string(),
value: z.string()
})
)
.optional()
.describe(INTEGRATION.CREATE.metadata.secretAWSTag),
kmsKeyId: z.string().optional().describe(INTEGRATION.CREATE.metadata.kmsKeyId),
shouldDisableDelete: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldDisableDelete)
})
.optional()
}),
response: {
200: z.object({
@ -235,5 +262,64 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
}
});
// TODO(akhilmhdh-pg): manual sync
server.route({
method: "POST",
url: "/:integrationId/sync",
config: {
rateLimit: writeLimit
},
schema: {
description: "Manually trigger sync of an integration by integration id",
security: [
{
bearerAuth: []
}
],
params: z.object({
integrationId: z.string().trim().describe(INTEGRATION.SYNC.integrationId)
}),
response: {
200: z.object({
integration: IntegrationsSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const integration = await server.services.integration.syncIntegration({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.integrationId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: integration.projectId,
event: {
type: EventType.MANUAL_SYNC_INTEGRATION,
// eslint-disable-next-line
metadata: shake({
integrationId: integration.id,
integration: integration.integration,
environment: integration.environment.slug,
secretPath: integration.secretPath,
url: integration.url,
app: integration.app,
appId: integration.appId,
targetEnvironment: integration.targetEnvironment,
targetEnvironmentId: integration.targetEnvironmentId,
targetService: integration.targetService,
targetServiceId: integration.targetServiceId,
path: integration.path,
region: integration.region
// eslint-disable-next-line
}) as any
}
});
return { integration };
}
});
};
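A usage sketch for the new manual sync endpoint; the /api/v1/integration mount path and the bearer token are assumptions/placeholders.

const syncIntegration = async (integrationId: string, token: string) => {
  const res = await fetch(`https://app.infisical.com/api/v1/integration/${integrationId}/sync`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}` }
  });
  const { integration } = await res.json();
  return integration;
};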

@ -9,7 +9,7 @@ import {
UsersSchema
} from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { PROJECTS } from "@app/lib/api-docs";
import { PROJECT_USERS } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
@ -30,7 +30,7 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
}
],
params: z.object({
workspaceId: z.string().trim().describe(PROJECTS.GET_USER_MEMBERSHIPS.workspaceId)
workspaceId: z.string().trim().describe(PROJECT_USERS.GET_USER_MEMBERSHIPS.workspaceId)
}),
response: {
200: z.object({
@ -74,6 +74,66 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
}
});
server.route({
method: "POST",
url: "/:workspaceId/memberships/details",
config: {
rateLimit: readLimit
},
schema: {
description: "Return project user memberships",
security: [
{
bearerAuth: []
}
],
params: z.object({
workspaceId: z.string().min(1).trim().describe(PROJECT_USERS.GET_USER_MEMBERSHIP.workspaceId)
}),
body: z.object({
username: z.string().min(1).trim().describe(PROJECT_USERS.GET_USER_MEMBERSHIP.username)
}),
response: {
200: z.object({
membership: ProjectMembershipsSchema.extend({
user: UsersSchema.pick({
email: true,
firstName: true,
lastName: true,
id: true
}).merge(UserEncryptionKeysSchema.pick({ publicKey: true })),
roles: z.array(
z.object({
id: z.string(),
role: z.string(),
customRoleId: z.string().optional().nullable(),
customRoleName: z.string().optional().nullable(),
customRoleSlug: z.string().optional().nullable(),
isTemporary: z.boolean(),
temporaryMode: z.string().optional().nullable(),
temporaryRange: z.string().nullable().optional(),
temporaryAccessStartTime: z.date().nullable().optional(),
temporaryAccessEndTime: z.date().nullable().optional()
})
)
}).omit({ createdAt: true, updatedAt: true })
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const membership = await server.services.projectMembership.getProjectMembershipByUsername({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
projectId: req.params.workspaceId,
username: req.body.username
});
return { membership };
}
});
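A usage sketch for the memberships/details lookup above; the /api/v1/workspace mount path is an assumption, and the username defaults to the member's email.

const getProjectMembership = async (workspaceId: string, username: string, token: string) => {
  const res = await fetch(`https://app.infisical.com/api/v1/workspace/${workspaceId}/memberships/details`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ username })
  });
  const { membership } = await res.json();
  return membership; // includes user info and the membership's roles
};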
server.route({
method: "POST",
url: "/:workspaceId/memberships",
@ -142,8 +202,8 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
}
],
params: z.object({
workspaceId: z.string().trim().describe(PROJECTS.UPDATE_USER_MEMBERSHIP.workspaceId),
membershipId: z.string().trim().describe(PROJECTS.UPDATE_USER_MEMBERSHIP.membershipId)
workspaceId: z.string().trim().describe(PROJECT_USERS.UPDATE_USER_MEMBERSHIP.workspaceId),
membershipId: z.string().trim().describe(PROJECT_USERS.UPDATE_USER_MEMBERSHIP.membershipId)
}),
body: z.object({
roles: z
@ -164,7 +224,7 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
)
.min(1)
.refine((data) => data.some(({ isTemporary }) => !isTemporary), "At least one long lived role is required")
.describe(PROJECTS.UPDATE_USER_MEMBERSHIP.roles)
.describe(PROJECT_USERS.UPDATE_USER_MEMBERSHIP.roles)
}),
response: {
200: z.object({

@ -127,6 +127,70 @@ export const registerSecretFolderRouter = async (server: FastifyZodProvider) =>
}
});
server.route({
url: "/batch",
method: "PATCH",
config: {
rateLimit: secretsLimit
},
schema: {
description: "Update folders by batch",
security: [
{
bearerAuth: []
}
],
body: z.object({
projectSlug: z.string().trim().describe(FOLDERS.UPDATE.projectSlug),
folders: z
.object({
id: z.string().describe(FOLDERS.UPDATE.folderId),
environment: z.string().trim().describe(FOLDERS.UPDATE.environment),
name: z.string().trim().describe(FOLDERS.UPDATE.name),
path: z.string().trim().default("/").transform(removeTrailingSlash).describe(FOLDERS.UPDATE.path)
})
.array()
.min(1)
}),
response: {
200: z.object({
folders: SecretFoldersSchema.array()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { newFolders, oldFolders, projectId } = await server.services.folder.updateManyFolders({
...req.body,
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await Promise.all(
req.body.folders.map(async (folder, index) => {
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId,
event: {
type: EventType.UPDATE_FOLDER,
metadata: {
environment: oldFolders[index].envId,
folderId: oldFolders[index].id,
folderPath: folder.path,
newFolderName: newFolders[index].name,
oldFolderName: oldFolders[index].name
}
}
});
})
);
return { folders: newFolders };
}
});
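A usage sketch for the batch folder update route above; the /api/v1/folders mount path, IDs and slugs are assumptions/placeholders.

const updateFoldersInBatch = async (token: string) => {
  const res = await fetch("https://app.infisical.com/api/v1/folders/batch", {
    method: "PATCH",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      projectSlug: "<project-slug>",
      folders: [{ id: "<folder-id>", environment: "dev", name: "renamed-folder", path: "/" }]
    })
  });
  const { folders } = await res.json();
  return folders;
};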
// TODO(daniel): Expose this route in api reference and write docs for it.
server.route({
method: "DELETE",

@ -7,7 +7,8 @@ import {
ProjectMembershipRole,
ProjectUserMembershipRolesSchema
} from "@app/db/schemas";
import { PROJECTS } from "@app/lib/api-docs";
import { PROJECT_IDENTITIES } from "@app/lib/api-docs";
import { BadRequestError } from "@app/lib/errors";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
@ -22,12 +23,48 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Create project identity membership",
security: [
{
bearerAuth: []
}
],
params: z.object({
projectId: z.string().trim(),
identityId: z.string().trim()
}),
body: z.object({
role: z.string().trim().min(1).default(ProjectMembershipRole.NoAccess)
// @deprecated
role: z.string().trim().optional().default(ProjectMembershipRole.NoAccess),
roles: z
.array(
z.union([
z.object({
role: z.string().describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.role),
isTemporary: z
.literal(false)
.default(false)
.describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.isTemporary)
}),
z.object({
role: z.string().describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.role),
isTemporary: z.literal(true).describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.isTemporary),
temporaryMode: z
.nativeEnum(ProjectUserMembershipTemporaryMode)
.describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.temporaryMode),
temporaryRange: z
.string()
.refine((val) => ms(val) > 0, "Temporary range must be a positive number")
.describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.temporaryRange),
temporaryAccessStartTime: z
.string()
.datetime()
.describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.temporaryAccessStartTime)
})
])
)
.describe(PROJECT_IDENTITIES.CREATE_IDENTITY_MEMBERSHIP.roles.description)
.optional()
}),
response: {
200: z.object({
@ -36,6 +73,9 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
}
},
handler: async (req) => {
const { role, roles } = req.body;
if (!role && !roles) throw new BadRequestError({ message: "You must provide either role or roles field" });
const identityMembership = await server.services.identityProject.createProjectIdentity({
actor: req.permission.type,
actorId: req.permission.id,
@ -43,7 +83,7 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
actorOrgId: req.permission.orgId,
identityId: req.params.identityId,
projectId: req.params.projectId,
role: req.body.role
roles: roles || [{ role }]
});
return { identityMembership };
}
@ -64,28 +104,39 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
projectId: z.string().trim().describe(PROJECTS.UPDATE_IDENTITY_MEMBERSHIP.projectId),
identityId: z.string().trim().describe(PROJECTS.UPDATE_IDENTITY_MEMBERSHIP.identityId)
projectId: z.string().trim().describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.projectId),
identityId: z.string().trim().describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.identityId)
}),
body: z.object({
roles: z
.array(
z.union([
z.object({
role: z.string(),
isTemporary: z.literal(false).default(false)
role: z.string().describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.role),
isTemporary: z
.literal(false)
.default(false)
.describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.isTemporary)
}),
z.object({
role: z.string(),
isTemporary: z.literal(true),
temporaryMode: z.nativeEnum(ProjectUserMembershipTemporaryMode),
temporaryRange: z.string().refine((val) => ms(val) > 0, "Temporary range must be a positive number"),
temporaryAccessStartTime: z.string().datetime()
role: z.string().describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.role),
isTemporary: z.literal(true).describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.isTemporary),
temporaryMode: z
.nativeEnum(ProjectUserMembershipTemporaryMode)
.describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.temporaryMode),
temporaryRange: z
.string()
.refine((val) => ms(val) > 0, "Temporary range must be a positive number")
.describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.temporaryRange),
temporaryAccessStartTime: z
.string()
.datetime()
.describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.temporaryAccessStartTime)
})
])
)
.min(1)
.describe(PROJECTS.UPDATE_IDENTITY_MEMBERSHIP.roles)
.describe(PROJECT_IDENTITIES.UPDATE_IDENTITY_MEMBERSHIP.roles.description)
}),
response: {
200: z.object({
@ -122,8 +173,8 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
projectId: z.string().trim().describe(PROJECTS.DELETE_IDENTITY_MEMBERSHIP.projectId),
identityId: z.string().trim().describe(PROJECTS.DELETE_IDENTITY_MEMBERSHIP.identityId)
projectId: z.string().trim().describe(PROJECT_IDENTITIES.DELETE_IDENTITY_MEMBERSHIP.projectId),
identityId: z.string().trim().describe(PROJECT_IDENTITIES.DELETE_IDENTITY_MEMBERSHIP.identityId)
}),
response: {
200: z.object({
@ -159,7 +210,7 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
projectId: z.string().trim().describe(PROJECTS.LIST_IDENTITY_MEMBERSHIPS.projectId)
projectId: z.string().trim().describe(PROJECT_IDENTITIES.LIST_IDENTITY_MEMBERSHIPS.projectId)
}),
response: {
200: z.object({
@ -200,4 +251,61 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
return { identityMemberships };
}
});
server.route({
method: "GET",
url: "/:projectId/identity-memberships/:identityId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Return project identity membership",
security: [
{
bearerAuth: []
}
],
params: z.object({
projectId: z.string().trim().describe(PROJECT_IDENTITIES.GET_IDENTITY_MEMBERSHIP_BY_ID.projectId),
identityId: z.string().trim().describe(PROJECT_IDENTITIES.GET_IDENTITY_MEMBERSHIP_BY_ID.identityId)
}),
response: {
200: z.object({
identityMembership: z.object({
id: z.string(),
identityId: z.string(),
createdAt: z.date(),
updatedAt: z.date(),
roles: z.array(
z.object({
id: z.string(),
role: z.string(),
customRoleId: z.string().optional().nullable(),
customRoleName: z.string().optional().nullable(),
customRoleSlug: z.string().optional().nullable(),
isTemporary: z.boolean(),
temporaryMode: z.string().optional().nullable(),
temporaryRange: z.string().nullable().optional(),
temporaryAccessStartTime: z.date().nullable().optional(),
temporaryAccessEndTime: z.date().nullable().optional()
})
),
identity: IdentitiesSchema.pick({ name: true, id: true, authMethod: true })
})
})
}
},
handler: async (req) => {
const identityMembership = await server.services.identityProject.getProjectIdentityByIdentityId({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
projectId: req.params.projectId,
identityId: req.params.identityId
});
return { identityMembership };
}
});
};
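A usage sketch for the new single identity membership endpoint; the /api/v2/workspace mount path is an assumption about where this router is registered.

const getIdentityMembership = async (projectId: string, identityId: string, token: string) => {
  const res = await fetch(
    `https://app.infisical.com/api/v2/workspace/${projectId}/identity-memberships/${identityId}`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  const { identityMembership } = await res.json();
  return identityMembership; // id, identity, timestamps and the assigned roles
};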

@ -2,7 +2,7 @@ import { z } from "zod";
import { ProjectMembershipsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { PROJECTS } from "@app/lib/api-docs";
import { PROJECT_USERS } from "@app/lib/api-docs";
import { writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
@ -22,11 +22,11 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
}
],
params: z.object({
projectId: z.string().describe(PROJECTS.INVITE_MEMBER.projectId)
projectId: z.string().describe(PROJECT_USERS.INVITE_MEMBER.projectId)
}),
body: z.object({
emails: z.string().email().array().default([]).describe(PROJECTS.INVITE_MEMBER.emails),
usernames: z.string().array().default([]).describe(PROJECTS.INVITE_MEMBER.usernames)
emails: z.string().email().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.emails),
usernames: z.string().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.usernames)
}),
response: {
200: z.object({
@ -77,11 +77,11 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
}
],
params: z.object({
projectId: z.string().describe(PROJECTS.REMOVE_MEMBER.projectId)
projectId: z.string().describe(PROJECT_USERS.REMOVE_MEMBER.projectId)
}),
body: z.object({
emails: z.string().email().array().default([]).describe(PROJECTS.REMOVE_MEMBER.emails),
usernames: z.string().array().default([]).describe(PROJECTS.REMOVE_MEMBER.usernames)
emails: z.string().email().array().default([]).describe(PROJECT_USERS.REMOVE_MEMBER.emails),
usernames: z.string().array().default([]).describe(PROJECT_USERS.REMOVE_MEMBER.usernames)
}),
response: {
200: z.object({

@ -293,6 +293,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}),
querystring: z.object({
workspaceId: z.string().trim().optional().describe(RAW_SECRETS.GET.workspaceId),
workspaceSlug: z.string().trim().optional().describe(RAW_SECRETS.GET.workspaceSlug),
environment: z.string().trim().optional().describe(RAW_SECRETS.GET.environment),
secretPath: z.string().trim().default("/").transform(removeTrailingSlash).describe(RAW_SECRETS.GET.secretPath),
version: z.coerce.number().optional().describe(RAW_SECRETS.GET.version),
@ -311,6 +312,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { workspaceSlug } = req.query;
let { secretPath, environment, workspaceId } = req.query;
if (req.auth.actor === ActorType.SERVICE) {
const scope = ServiceTokenScopes.parse(req.auth.serviceToken.scopes);
@ -322,7 +324,9 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
}
if (!workspaceId || !environment) throw new BadRequestError({ message: "Missing workspace id or environment" });
if (!environment) throw new BadRequestError({ message: "Missing environment" });
if (!workspaceId && !workspaceSlug)
throw new BadRequestError({ message: "You must provide workspaceSlug or workspaceId" });
const secret = await server.services.secret.getSecretByNameRaw({
actorId: req.permission.id,
@ -331,6 +335,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
actorOrgId: req.permission.orgId,
environment,
projectId: workspaceId,
projectSlug: workspaceSlug,
path: secretPath,
secretName: req.params.secretName,
type: req.query.type,
@ -339,7 +344,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
});
await server.services.auditLog.createAuditLog({
projectId: req.query.workspaceId,
projectId: secret.workspace,
...req.auditLogInfo,
event: {
type: EventType.GET_SECRET,
@ -358,7 +363,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
distinctId: getTelemetryDistinctId(req),
properties: {
numberOfSecrets: 1,
workspaceId,
workspaceId: secret.workspace,
environment,
secretPath: req.query.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
@ -1921,4 +1926,41 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
return { secrets };
}
});
server.route({
method: "POST",
url: "/backfill-secret-references",
config: {
rateLimit: secretsLimit
},
schema: {
description: "Backfill secret references",
security: [
{
bearerAuth: []
}
],
body: z.object({
projectId: z.string().trim().min(1)
}),
response: {
200: z.object({
message: z.string()
})
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const { projectId } = req.body;
const message = await server.services.secret.backfillSecretReferences({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
projectId
});
return message;
}
});
};
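
The new backfill route above only takes a project ID and returns a message, so a client call is a single authenticated POST. A minimal sketch, assuming the router is mounted under /api/v3/secrets; the base URL, path prefix, and token below are placeholders, not taken from this diff:

// Hypothetical client call for the backfill-secret-references endpoint (JWT auth only, per verifyAuth).
const backfillSecretReferences = async (projectId: string, jwtToken: string) => {
  const res = await fetch("https://app.infisical.com/api/v3/secrets/backfill-secret-references", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${jwtToken}`
    },
    body: JSON.stringify({ projectId })
  });
  if (!res.ok) throw new Error(`Backfill failed with status ${res.status}`);
  return (await res.json()) as { message: string }; // matches the 200 response schema
};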

@ -1,7 +1,7 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName, TIdentityAccessTokens } from "@app/db/schemas";
import { IdentityAuthMethod, TableName, TIdentityAccessTokens } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
@ -15,23 +15,56 @@ export const identityAccessTokenDALFactory = (db: TDbClient) => {
const doc = await (tx || db)(TableName.IdentityAccessToken)
.where(filter)
.join(TableName.Identity, `${TableName.Identity}.id`, `${TableName.IdentityAccessToken}.identityId`)
.leftJoin(
TableName.IdentityUaClientSecret,
`${TableName.IdentityAccessToken}.identityUAClientSecretId`,
`${TableName.IdentityUaClientSecret}.id`
)
.leftJoin(
TableName.IdentityUniversalAuth,
`${TableName.IdentityUaClientSecret}.identityUAId`,
`${TableName.IdentityUniversalAuth}.id`
)
.leftJoin(TableName.IdentityUaClientSecret, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.Univeral])).andOn(
`${TableName.IdentityAccessToken}.identityUAClientSecretId`,
`${TableName.IdentityUaClientSecret}.id`
);
})
.leftJoin(TableName.IdentityUniversalAuth, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.Univeral])).andOn(
`${TableName.IdentityUaClientSecret}.identityUAId`,
`${TableName.IdentityUniversalAuth}.id`
);
})
.leftJoin(TableName.IdentityGcpAuth, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.GCP_AUTH])).andOn(
`${TableName.Identity}.id`,
`${TableName.IdentityGcpAuth}.identityId`
);
})
.leftJoin(TableName.IdentityAwsAuth, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.AWS_AUTH])).andOn(
`${TableName.Identity}.id`,
`${TableName.IdentityAwsAuth}.identityId`
);
})
.leftJoin(TableName.IdentityKubernetesAuth, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.KUBERNETES_AUTH])).andOn(
`${TableName.Identity}.id`,
`${TableName.IdentityKubernetesAuth}.identityId`
);
})
.select(selectAllTableCols(TableName.IdentityAccessToken))
.select(
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityUniversalAuth),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityUniversalAuth).as("accessTokenTrustedIpsUa"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityGcpAuth).as("accessTokenTrustedIpsGcp"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityAwsAuth).as("accessTokenTrustedIpsAws"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityKubernetesAuth).as("accessTokenTrustedIpsK8s"),
db.ref("name").withSchema(TableName.Identity)
)
.first();
return doc;
if (!doc) return;
return {
...doc,
accessTokenTrustedIps:
doc.accessTokenTrustedIpsUa ||
doc.accessTokenTrustedIpsGcp ||
doc.accessTokenTrustedIpsAws ||
doc.accessTokenTrustedIpsK8s
};
} catch (error) {
throw new DatabaseError({ error, name: "IdAccessTokenFindOne" });
}

@ -106,6 +106,24 @@ export const identityAccessTokenServiceFactory = ({
return { accessToken, identityAccessToken: updatedIdentityAccessToken };
};
const revokeAccessToken = async (accessToken: string) => {
const appCfg = getConfig();
const decodedToken = jwt.verify(accessToken, appCfg.AUTH_SECRET) as JwtPayload & {
identityAccessTokenId: string;
};
if (decodedToken.authTokenType !== AuthTokenType.IDENTITY_ACCESS_TOKEN) throw new UnauthorizedError();
const identityAccessToken = await identityAccessTokenDAL.findOne({
[`${TableName.IdentityAccessToken}.id` as "id"]: decodedToken.identityAccessTokenId,
isAccessTokenRevoked: false
});
if (!identityAccessToken) throw new UnauthorizedError();
const revokedToken = await identityAccessTokenDAL.deleteById(identityAccessToken.id);
return { revokedToken };
};
const fnValidateIdentityAccessToken = async (token: TIdentityAccessTokenJwtPayload, ipAddress?: string) => {
const identityAccessToken = await identityAccessTokenDAL.findOne({
[`${TableName.IdentityAccessToken}.id` as "id"]: token.identityAccessTokenId,
@ -132,5 +150,5 @@ export const identityAccessTokenServiceFactory = ({
return { ...identityAccessToken, orgId: identityOrgMembership.orgId };
};
return { renewAccessToken, fnValidateIdentityAccessToken };
return { renewAccessToken, revokeAccessToken, fnValidateIdentityAccessToken };
};
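
The new revokeAccessToken flow decodes the bearer token, confirms it is an identity access token that has not already been revoked, and deletes the corresponding row. A sketch of how a route could expose it; the URL, schema, and service name here are illustrative assumptions, not part of this diff:

// Hypothetical route wiring for revokeAccessToken.
server.route({
  method: "POST",
  url: "/token/revoke",
  schema: {
    body: z.object({ accessToken: z.string().trim() }),
    response: { 200: z.object({ message: z.string() }) }
  },
  handler: async (req) => {
    await server.services.identityAccessToken.revokeAccessToken(req.body.accessToken);
    return { message: "Successfully revoked access token" };
  }
});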

@ -0,0 +1,11 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TIdentityAwsAuthDALFactory = ReturnType<typeof identityAwsAuthDALFactory>;
export const identityAwsAuthDALFactory = (db: TDbClient) => {
const awsAuthOrm = ormify(db, TableName.IdentityAwsAuth);
return awsAuthOrm;
};

@ -0,0 +1,67 @@
/**
 * Normalizes the identity ARN from the GetCallerIdentity response into one of the following formats:
* - arn:aws:iam::123456789012:user/MyUserName
* - arn:aws:iam::123456789012:role/MyRoleName
*/
export const extractPrincipalArn = (arn: string) => {
// split the ARN into parts using ":" as the delimiter
const fullParts = arn.split(":");
if (fullParts.length !== 6) {
throw new Error(`Unrecognized ARN: contains ${fullParts.length} colon-separated parts, expected 6`);
}
const [prefix, partition, service, , accountNumber, resource] = fullParts;
if (prefix !== "arn") {
throw new Error('Unrecognized ARN: does not begin with "arn:"');
}
// structure to hold the parsed data
const entity = {
Partition: partition,
Service: service,
AccountNumber: accountNumber,
Type: "",
Path: "",
FriendlyName: "",
SessionInfo: ""
};
// validate the service is either 'iam' or 'sts'
if (entity.Service !== "iam" && entity.Service !== "sts") {
throw new Error(`Unrecognized service: ${entity.Service}, not one of iam or sts`);
}
// parse the last part of the ARN which describes the resource
const parts = resource.split("/");
if (parts.length < 2) {
throw new Error(`Unrecognized ARN: "${resource}" contains fewer than 2 slash-separated parts`);
}
const [type, ...rest] = parts;
entity.Type = type;
entity.FriendlyName = parts[parts.length - 1];
// handle different types of resources
switch (entity.Type) {
case "assumed-role": {
if (rest.length < 2) {
throw new Error(`Unrecognized ARN: "${resource}" contains fewer than 3 slash-separated parts`);
}
// assumed roles use a special format where the friendly name is the role name
const [roleName, sessionId] = rest;
entity.Type = "role"; // treat assumed role case as role
entity.FriendlyName = roleName;
entity.SessionInfo = sessionId;
break;
}
case "user":
case "role":
case "instance-profile":
// standard cases: just join back the path if there's any
entity.Path = rest.slice(0, -1).join("/");
break;
default:
throw new Error(`Unrecognized principal type: "${entity.Type}"`);
}
return `arn:aws:iam::${entity.AccountNumber}:${entity.Type}/${entity.FriendlyName}`;
};
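
A few worked examples of the normalization above (account IDs, role names, and session names are placeholders):

// extractPrincipalArn worked examples.
const exampleArns = [
  "arn:aws:iam::123456789012:user/MyUserName",
  "arn:aws:sts::123456789012:assumed-role/MyRoleName/MySessionName",
  "arn:aws:iam::123456789012:role/path/to/MyRoleName"
];
// Prints:
//   arn:aws:iam::123456789012:user/MyUserName
//   arn:aws:iam::123456789012:role/MyRoleName   (assumed-role collapses to role)
//   arn:aws:iam::123456789012:role/MyRoleName   (the path is dropped from the output)
for (const arn of exampleArns) console.log(extractPrincipalArn(arn));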

@ -0,0 +1,310 @@
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
import { ForbiddenError } from "@casl/ability";
import axios from "axios";
import jwt from "jsonwebtoken";
import { IdentityAuthMethod } from "@app/db/schemas";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError, UnauthorizedError } from "@app/lib/errors";
import { extractIPDetails, isValidIpOrCidr } from "@app/lib/ip";
import { AuthTokenType } from "../auth/auth-type";
import { TIdentityDALFactory } from "../identity/identity-dal";
import { TIdentityOrgDALFactory } from "../identity/identity-org-dal";
import { TIdentityAccessTokenDALFactory } from "../identity-access-token/identity-access-token-dal";
import { TIdentityAccessTokenJwtPayload } from "../identity-access-token/identity-access-token-types";
import { TIdentityAwsAuthDALFactory } from "./identity-aws-auth-dal";
import { extractPrincipalArn } from "./identity-aws-auth-fns";
import {
TAttachAwsAuthDTO,
TAwsGetCallerIdentityHeaders,
TGetAwsAuthDTO,
TGetCallerIdentityResponse,
TLoginAwsAuthDTO,
TUpdateAwsAuthDTO
} from "./identity-aws-auth-types";
type TIdentityAwsAuthServiceFactoryDep = {
identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
identityAwsAuthDAL: Pick<TIdentityAwsAuthDALFactory, "findOne" | "transaction" | "create" | "updateById">;
identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
identityDAL: Pick<TIdentityDALFactory, "updateById">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
};
export type TIdentityAwsAuthServiceFactory = ReturnType<typeof identityAwsAuthServiceFactory>;
export const identityAwsAuthServiceFactory = ({
identityAccessTokenDAL,
identityAwsAuthDAL,
identityOrgMembershipDAL,
identityDAL,
licenseService,
permissionService
}: TIdentityAwsAuthServiceFactoryDep) => {
const login = async ({ identityId, iamHttpRequestMethod, iamRequestBody, iamRequestHeaders }: TLoginAwsAuthDTO) => {
const identityAwsAuth = await identityAwsAuthDAL.findOne({ identityId });
if (!identityAwsAuth) throw new UnauthorizedError();
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId: identityAwsAuth.identityId });
const headers: TAwsGetCallerIdentityHeaders = JSON.parse(Buffer.from(iamRequestHeaders, "base64").toString());
const body: string = Buffer.from(iamRequestBody, "base64").toString();
const {
data: {
GetCallerIdentityResponse: {
GetCallerIdentityResult: { Account, Arn }
}
}
}: { data: TGetCallerIdentityResponse } = await axios({
method: iamHttpRequestMethod,
url: identityAwsAuth.stsEndpoint,
headers,
data: body
});
if (identityAwsAuth.allowedAccountIds) {
// validate if Account is in the list of allowed Account IDs
const isAccountAllowed = identityAwsAuth.allowedAccountIds
.split(",")
.map((accountId) => accountId.trim())
.some((accountId) => accountId === Account);
if (!isAccountAllowed) throw new UnauthorizedError();
}
if (identityAwsAuth.allowedPrincipalArns) {
// validate if Arn is in the list of allowed Principal ARNs
const isArnAllowed = identityAwsAuth.allowedPrincipalArns
.split(",")
.map((principalArn) => principalArn.trim())
.some((principalArn) => {
// convert wildcard ARN to a regular expression: "arn:aws:iam::123456789012:*" -> "^arn:aws:iam::123456789012:.*$"
// considers exact matches + wildcard matches
const regex = new RegExp(`^${principalArn.replace(/\*/g, ".*")}$`);
return regex.test(extractPrincipalArn(Arn));
});
if (!isArnAllowed) throw new UnauthorizedError();
}
const identityAccessToken = await identityAwsAuthDAL.transaction(async (tx) => {
const newToken = await identityAccessTokenDAL.create(
{
identityId: identityAwsAuth.identityId,
isAccessTokenRevoked: false,
accessTokenTTL: identityAwsAuth.accessTokenTTL,
accessTokenMaxTTL: identityAwsAuth.accessTokenMaxTTL,
accessTokenNumUses: 0,
accessTokenNumUsesLimit: identityAwsAuth.accessTokenNumUsesLimit
},
tx
);
return newToken;
});
const appCfg = getConfig();
const accessToken = jwt.sign(
{
identityId: identityAwsAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
} as TIdentityAccessTokenJwtPayload,
appCfg.AUTH_SECRET,
{
expiresIn:
Number(identityAccessToken.accessTokenMaxTTL) === 0
? undefined
: Number(identityAccessToken.accessTokenMaxTTL)
}
);
return { accessToken, identityAwsAuth, identityAccessToken, identityMembershipOrg };
};
const attachAwsAuth = async ({
identityId,
stsEndpoint,
allowedPrincipalArns,
allowedAccountIds,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TAttachAwsAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity.authMethod)
throw new BadRequestError({
message: "Failed to add AWS Auth to already configured identity"
});
if (accessTokenMaxTTL > 0 && accessTokenTTL > accessTokenMaxTTL) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const identityAwsAuth = await identityAwsAuthDAL.transaction(async (tx) => {
const doc = await identityAwsAuthDAL.create(
{
identityId: identityMembershipOrg.identityId,
type: "iam",
stsEndpoint,
allowedPrincipalArns,
allowedAccountIds,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: JSON.stringify(reformattedAccessTokenTrustedIps)
},
tx
);
await identityDAL.updateById(
identityMembershipOrg.identityId,
{
authMethod: IdentityAuthMethod.AWS_AUTH
},
tx
);
return doc;
});
return { ...identityAwsAuth, orgId: identityMembershipOrg.orgId };
};
const updateAwsAuth = async ({
identityId,
stsEndpoint,
allowedPrincipalArns,
allowedAccountIds,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TUpdateAwsAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.AWS_AUTH)
throw new BadRequestError({
message: "Failed to update AWS Auth"
});
const identityAwsAuth = await identityAwsAuthDAL.findOne({ identityId });
if (
(accessTokenMaxTTL || identityAwsAuth.accessTokenMaxTTL) > 0 &&
(accessTokenTTL || identityAwsAuth.accessTokenTTL) > (accessTokenMaxTTL || identityAwsAuth.accessTokenMaxTTL)
) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps?.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const updatedAwsAuth = await identityAwsAuthDAL.updateById(identityAwsAuth.id, {
stsEndpoint,
allowedPrincipalArns,
allowedAccountIds,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: reformattedAccessTokenTrustedIps
? JSON.stringify(reformattedAccessTokenTrustedIps)
: undefined
});
return { ...updatedAwsAuth, orgId: identityMembershipOrg.orgId };
};
const getAwsAuth = async ({ identityId, actorId, actor, actorAuthMethod, actorOrgId }: TGetAwsAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.AWS_AUTH)
throw new BadRequestError({
message: "The identity does not have AWS Auth attached"
});
const awsIdentityAuth = await identityAwsAuthDAL.findOne({ identityId });
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Identity);
return { ...awsIdentityAuth, orgId: identityMembershipOrg.orgId };
};
return {
login,
attachAwsAuth,
updateAwsAuth,
getAwsAuth
};
};
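
The allowed-principal-ARN check inside login converts each configured ARN into a regular expression so that wildcards match. A standalone sketch of that comparison, assuming the same comma-separated storage format used by the service:

// Sketch of the wildcard allow-list check applied to the caller's normalized ARN.
const isPrincipalArnAllowed = (allowedPrincipalArns: string, callerArn: string) =>
  allowedPrincipalArns
    .split(",")
    .map((principalArn) => principalArn.trim())
    .some((principalArn) => {
      // "arn:aws:iam::123456789012:*" -> /^arn:aws:iam::123456789012:.*$/
      const regex = new RegExp(`^${principalArn.replace(/\*/g, ".*")}$`);
      return regex.test(callerArn);
    });

// isPrincipalArnAllowed("arn:aws:iam::123456789012:*", "arn:aws:iam::123456789012:role/MyRoleName") === true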

@ -0,0 +1,54 @@
import { TProjectPermission } from "@app/lib/types";
export type TLoginAwsAuthDTO = {
identityId: string;
iamHttpRequestMethod: string;
iamRequestBody: string;
iamRequestHeaders: string;
};
export type TAttachAwsAuthDTO = {
identityId: string;
stsEndpoint: string;
allowedPrincipalArns: string;
allowedAccountIds: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TUpdateAwsAuthDTO = {
identityId: string;
stsEndpoint?: string;
allowedPrincipalArns?: string;
allowedAccountIds?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TGetAwsAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;
export type TAwsGetCallerIdentityHeaders = {
"Content-Type": string;
Host: string;
"X-Amz-Date": string;
"Content-Length": number;
"x-amz-security-token": string;
Authorization: string;
};
export type TGetCallerIdentityResponse = {
GetCallerIdentityResponse: {
GetCallerIdentityResult: {
Account: string;
Arn: string;
UserId: string;
};
ResponseMetadata: { RequestId: string };
};
};
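
For context, the iamRequestHeaders and iamRequestBody consumed by login are a base64-encoded, SigV4-signed sts:GetCallerIdentity request produced by the client. A client-side sketch, assuming the third-party aws4 package for signing; this snippet is illustrative and not part of the diff:

import aws4 from "aws4";

// Sketch: build the base64-encoded payload expected by the AWS Auth login route.
const buildAwsAuthLoginPayload = (credentials: {
  accessKeyId: string;
  secretAccessKey: string;
  sessionToken?: string;
}) => {
  const body = "Action=GetCallerIdentity&Version=2011-06-15";
  const signed = aws4.sign(
    {
      host: "sts.amazonaws.com",
      path: "/",
      service: "sts",
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded; charset=utf-8" },
      body
    },
    credentials
  );
  return {
    iamHttpRequestMethod: "POST",
    iamRequestBody: Buffer.from(body).toString("base64"),
    iamRequestHeaders: Buffer.from(JSON.stringify(signed.headers)).toString("base64")
  };
};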

@ -0,0 +1,58 @@
import { z } from "zod";
const twelveDigitRegex = /^\d{12}$/;
const arnRegex = /^arn:aws:iam::\d{12}:(user\/[\w-]+|role\/[\w-]+|\*)$/;
export const validateAccountIds = z
.string()
.trim()
.default("")
// Custom validation to ensure each part is a 12-digit number
.refine(
(data) => {
if (data === "") return true;
// Split the string by commas to check each supposed number
const accountIds = data.split(",").map((id) => id.trim());
// Return true only if every item matches the 12-digit requirement
return accountIds.every((id) => twelveDigitRegex.test(id));
},
{
message: "Each account ID must be a 12-digit number."
}
)
// Transform the string to normalize space after commas
.transform((data) => {
if (data === "") return "";
// Trim each ID and join with ', ' to ensure formatting
return data
.split(",")
.map((id) => id.trim())
.join(", ");
});
export const validatePrincipalArns = z
.string()
.trim()
.default("")
// Custom validation for ARN format
.refine(
(data) => {
// Skip validation if the string is empty
if (data === "") return true;
// Split the string by commas to check each supposed ARN
const arns = data.split(",");
// Return true only if every item matches one of the allowed ARN formats
return arns.every((arn) => arnRegex.test(arn.trim()));
},
{
message:
"Each ARN must be in the format of 'arn:aws:iam::123456789012:user/UserName', 'arn:aws:iam::123456789012:role/RoleName', or 'arn:aws:iam::123456789012:*'."
}
)
// Transform to normalize the spaces around commas
.transform((data) =>
data
.split(",")
.map((arn) => arn.trim())
.join(", ")
);
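
A quick illustration of how the two validators behave on typical input (all IDs and ARNs below are placeholders):

// validateAccountIds: accepts comma-separated 12-digit IDs and normalizes spacing.
validateAccountIds.parse("123456789012,210987654321"); // -> "123456789012, 210987654321"
// validateAccountIds.parse("12345") throws: "Each account ID must be a 12-digit number."

// validatePrincipalArns: accepts user/role ARNs or an account-wide wildcard.
validatePrincipalArns.parse("arn:aws:iam::123456789012:role/MyRoleName,arn:aws:iam::123456789012:*");
// -> "arn:aws:iam::123456789012:role/MyRoleName, arn:aws:iam::123456789012:*"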

@ -0,0 +1,10 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TIdentityGcpAuthDALFactory = ReturnType<typeof identityGcpAuthDALFactory>;
export const identityGcpAuthDALFactory = (db: TDbClient) => {
const gcpAuthOrm = ormify(db, TableName.IdentityGcpAuth);
return gcpAuthOrm;
};

@ -0,0 +1,70 @@
import axios from "axios";
import { OAuth2Client } from "google-auth-library";
import jwt from "jsonwebtoken";
import { UnauthorizedError } from "@app/lib/errors";
import { TDecodedGcpIamAuthJwt, TGcpIdTokenPayload } from "./identity-gcp-auth-types";
/**
 * Validates the identity token [jwt] sent from a client GCE instance as part of GCP ID Token authentication.
 * @param {string} identityId - The ID of the identity in Infisical being authenticated against (used as the expected audience).
 * @param {string} jwt - The identity token to validate.
 * @returns The service account email and, when available, the Compute Engine details from the verified token.
*/
export const validateIdTokenIdentity = async ({
identityId,
jwt: identityToken
}: {
identityId: string;
jwt: string;
}) => {
const oAuth2Client = new OAuth2Client();
const response = await oAuth2Client.getFederatedSignonCerts();
const ticket = await oAuth2Client.verifySignedJwtWithCertsAsync(
identityToken,
response.certs,
identityId, // audience
["https://accounts.google.com"]
);
const payload = ticket.getPayload() as TGcpIdTokenPayload;
if (!payload || !payload.email) throw new UnauthorizedError();
return { email: payload.email, computeEngineDetails: payload.google?.compute_engine };
};
/**
 * Validates the signed JWT token for a GCP service account as part of GCP IAM authentication.
 * @param {string} identityId - The ID of the identity in Infisical being authenticated against (used as the expected audience).
 * @param {string} jwt - The signed JWT token to validate.
 * @returns The service account email (the token's `sub` claim) once the signature and audience have been verified.
*/
export const validateIamIdentity = async ({
identityId,
jwt: serviceAccountJwt
}: {
identityId: string;
jwt: string;
}) => {
const decodedJwt = jwt.decode(serviceAccountJwt, { complete: true }) as TDecodedGcpIamAuthJwt;
const { sub, aud } = decodedJwt.payload;
const {
data
}: {
data: {
[key: string]: string;
};
} = await axios.get(`https://www.googleapis.com/service_accounts/v1/metadata/x509/${sub}`);
const publicKey = data[decodedJwt.header.kid];
jwt.verify(serviceAccountJwt, publicKey, {
algorithms: ["RS256"]
});
if (aud !== identityId) throw new UnauthorizedError();
return { email: sub };
};
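
For reference, the [jwt] that validateIdTokenIdentity verifies is the instance identity token a GCE workload fetches from the metadata server, with the Infisical identity ID as the audience. A client-side sketch using the standard GCP metadata endpoint (illustrative, not part of this diff):

import axios from "axios";

// Sketch: fetch a GCE instance identity token to send to the GCP Auth login route.
const getGceIdentityToken = async (identityId: string) => {
  const { data } = await axios.get<string>(
    "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity",
    {
      params: { audience: identityId, format: "full" }, // "full" includes the compute_engine claims
      headers: { "Metadata-Flavor": "Google" }
    }
  );
  return data; // signed JWT; verified server-side by validateIdTokenIdentity
};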

@ -0,0 +1,324 @@
import { ForbiddenError } from "@casl/ability";
import jwt from "jsonwebtoken";
import { IdentityAuthMethod } from "@app/db/schemas";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError, UnauthorizedError } from "@app/lib/errors";
import { extractIPDetails, isValidIpOrCidr } from "@app/lib/ip";
import { AuthTokenType } from "../auth/auth-type";
import { TIdentityDALFactory } from "../identity/identity-dal";
import { TIdentityOrgDALFactory } from "../identity/identity-org-dal";
import { TIdentityAccessTokenDALFactory } from "../identity-access-token/identity-access-token-dal";
import { TIdentityAccessTokenJwtPayload } from "../identity-access-token/identity-access-token-types";
import { TIdentityGcpAuthDALFactory } from "./identity-gcp-auth-dal";
import { validateIamIdentity, validateIdTokenIdentity } from "./identity-gcp-auth-fns";
import {
TAttachGcpAuthDTO,
TGcpIdentityDetails,
TGetGcpAuthDTO,
TLoginGcpAuthDTO,
TUpdateGcpAuthDTO
} from "./identity-gcp-auth-types";
type TIdentityGcpAuthServiceFactoryDep = {
identityGcpAuthDAL: Pick<TIdentityGcpAuthDALFactory, "findOne" | "transaction" | "create" | "updateById">;
identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
identityDAL: Pick<TIdentityDALFactory, "updateById">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
};
export type TIdentityGcpAuthServiceFactory = ReturnType<typeof identityGcpAuthServiceFactory>;
export const identityGcpAuthServiceFactory = ({
identityGcpAuthDAL,
identityOrgMembershipDAL,
identityAccessTokenDAL,
identityDAL,
permissionService,
licenseService
}: TIdentityGcpAuthServiceFactoryDep) => {
const login = async ({ identityId, jwt: gcpJwt }: TLoginGcpAuthDTO) => {
const identityGcpAuth = await identityGcpAuthDAL.findOne({ identityId });
if (!identityGcpAuth) throw new UnauthorizedError();
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId: identityGcpAuth.identityId });
if (!identityMembershipOrg) throw new UnauthorizedError();
let gcpIdentityDetails: TGcpIdentityDetails;
switch (identityGcpAuth.type) {
case "gce": {
gcpIdentityDetails = await validateIdTokenIdentity({
identityId,
jwt: gcpJwt
});
break;
}
case "iam": {
gcpIdentityDetails = await validateIamIdentity({
identityId,
jwt: gcpJwt
});
break;
}
default: {
throw new BadRequestError({ message: "Invalid GCP Auth type" });
}
}
if (identityGcpAuth.allowedServiceAccounts) {
// validate if the service account is in the list of allowed service accounts
const isServiceAccountAllowed = identityGcpAuth.allowedServiceAccounts
.split(",")
.map((serviceAccount) => serviceAccount.trim())
.some((serviceAccount) => serviceAccount === gcpIdentityDetails.email);
if (!isServiceAccountAllowed) throw new UnauthorizedError();
}
if (identityGcpAuth.type === "gce" && identityGcpAuth.allowedProjects && gcpIdentityDetails.computeEngineDetails) {
// validate if the project that the service account belongs to is in the list of allowed projects
const isProjectAllowed = identityGcpAuth.allowedProjects
.split(",")
.map((project) => project.trim())
.some((project) => project === gcpIdentityDetails.computeEngineDetails?.project_id);
if (!isProjectAllowed) throw new UnauthorizedError();
}
if (identityGcpAuth.type === "gce" && identityGcpAuth.allowedZones && gcpIdentityDetails.computeEngineDetails) {
const isZoneAllowed = identityGcpAuth.allowedZones
.split(",")
.map((zone) => zone.trim())
.some((zone) => zone === gcpIdentityDetails.computeEngineDetails?.zone);
if (!isZoneAllowed) throw new UnauthorizedError();
}
const identityAccessToken = await identityGcpAuthDAL.transaction(async (tx) => {
const newToken = await identityAccessTokenDAL.create(
{
identityId: identityGcpAuth.identityId,
isAccessTokenRevoked: false,
accessTokenTTL: identityGcpAuth.accessTokenTTL,
accessTokenMaxTTL: identityGcpAuth.accessTokenMaxTTL,
accessTokenNumUses: 0,
accessTokenNumUsesLimit: identityGcpAuth.accessTokenNumUsesLimit
},
tx
);
return newToken;
});
const appCfg = getConfig();
const accessToken = jwt.sign(
{
identityId: identityGcpAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
} as TIdentityAccessTokenJwtPayload,
appCfg.AUTH_SECRET,
{
expiresIn:
Number(identityAccessToken.accessTokenMaxTTL) === 0
? undefined
: Number(identityAccessToken.accessTokenMaxTTL)
}
);
return { accessToken, identityGcpAuth, identityAccessToken, identityMembershipOrg };
};
const attachGcpAuth = async ({
identityId,
type,
allowedServiceAccounts,
allowedProjects,
allowedZones,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TAttachGcpAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity.authMethod)
throw new BadRequestError({
message: "Failed to add GCP Auth to already configured identity"
});
if (accessTokenMaxTTL > 0 && accessTokenTTL > accessTokenMaxTTL) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const identityGcpAuth = await identityGcpAuthDAL.transaction(async (tx) => {
const doc = await identityGcpAuthDAL.create(
{
identityId: identityMembershipOrg.identityId,
type,
allowedServiceAccounts,
allowedProjects,
allowedZones,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: JSON.stringify(reformattedAccessTokenTrustedIps)
},
tx
);
await identityDAL.updateById(
identityMembershipOrg.identityId,
{
authMethod: IdentityAuthMethod.GCP_AUTH
},
tx
);
return doc;
});
return { ...identityGcpAuth, orgId: identityMembershipOrg.orgId };
};
const updateGcpAuth = async ({
identityId,
type,
allowedServiceAccounts,
allowedProjects,
allowedZones,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TUpdateGcpAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.GCP_AUTH)
throw new BadRequestError({
message: "Failed to update GCP Auth"
});
const identityGcpAuth = await identityGcpAuthDAL.findOne({ identityId });
if (
(accessTokenMaxTTL || identityGcpAuth.accessTokenMaxTTL) > 0 &&
(accessTokenTTL || identityGcpAuth.accessTokenTTL) > (accessTokenMaxTTL || identityGcpAuth.accessTokenMaxTTL)
) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps?.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const updatedGcpAuth = await identityGcpAuthDAL.updateById(identityGcpAuth.id, {
type,
allowedServiceAccounts,
allowedProjects,
allowedZones,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: reformattedAccessTokenTrustedIps
? JSON.stringify(reformattedAccessTokenTrustedIps)
: undefined
});
return {
...updatedGcpAuth,
orgId: identityMembershipOrg.orgId
};
};
const getGcpAuth = async ({ identityId, actorId, actor, actorAuthMethod, actorOrgId }: TGetGcpAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.GCP_AUTH)
throw new BadRequestError({
message: "The identity does not have GCP Auth attached"
});
const identityGcpAuth = await identityGcpAuthDAL.findOne({ identityId });
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Identity);
return { ...identityGcpAuth, orgId: identityMembershipOrg.orgId };
};
return {
login,
attachGcpAuth,
updateGcpAuth,
getGcpAuth
};
};

@ -0,0 +1,78 @@
import { TProjectPermission } from "@app/lib/types";
export type TLoginGcpAuthDTO = {
identityId: string;
jwt: string;
};
export type TAttachGcpAuthDTO = {
identityId: string;
type: "iam" | "gce";
allowedServiceAccounts: string;
allowedProjects: string;
allowedZones: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TUpdateGcpAuthDTO = {
identityId: string;
type?: "iam" | "gce";
allowedServiceAccounts?: string;
allowedProjects?: string;
allowedZones?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TGetGcpAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;
type TComputeEngineDetails = {
instance_creation_timestamp: number;
instance_id: string;
instance_name: string;
project_id: string;
project_number: number;
zone: string;
};
export type TGcpIdentityDetails = {
email: string;
computeEngineDetails?: TComputeEngineDetails;
};
export type TGcpIdTokenPayload = {
aud: string;
azp: string;
email: string;
email_verified: boolean;
exp: number;
google?: {
compute_engine: TComputeEngineDetails;
};
iat: number;
iss: string;
sub: string;
};
export type TDecodedGcpIamAuthJwt = {
header: {
alg: string;
kid: string;
typ: string;
};
payload: {
sub: string;
aud: string;
};
signature: string;
metadata: {
[key: string]: string;
};
};

@ -0,0 +1,14 @@
import { z } from "zod";
export const validateGcpAuthField = z
.string()
.trim()
.default("")
.transform((data) => {
if (data === "") return "";
// Trim each ID and join with ', ' to ensure formatting
return data
.split(",")
.map((id) => id.trim())
.join(", ");
});

@ -0,0 +1,10 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TIdentityKubernetesAuthDALFactory = ReturnType<typeof identityKubernetesAuthDALFactory>;
export const identityKubernetesAuthDALFactory = (db: TDbClient) => {
const kubernetesAuthOrm = ormify(db, TableName.IdentityKubernetesAuth);
return kubernetesAuthOrm;
};

@ -0,0 +1,15 @@
/**
 * Extracts the K8s service account name and namespace from a TokenReview username
 * in the format system:serviceaccount:<namespace>:<name>, e.g. system:serviceaccount:default:infisical-auth
*/
export const extractK8sUsername = (username: string) => {
const parts = username.split(":");
// Ensure that the username format is correct
if (parts.length === 4 && parts[0] === "system" && parts[1] === "serviceaccount") {
return {
namespace: parts[2],
name: parts[3]
};
}
throw new Error("Invalid username format");
};
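
For example, the TokenReview username produced for a service account resolves as follows (sketch):

const { namespace, name } = extractK8sUsername("system:serviceaccount:default:infisical-auth");
// namespace === "default", name === "infisical-auth"
// Any other shape (e.g. "system:node:worker-1") throws "Invalid username format".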

@ -0,0 +1,515 @@
import { ForbiddenError } from "@casl/ability";
import axios from "axios";
import https from "https";
import jwt from "jsonwebtoken";
import { IdentityAuthMethod, SecretKeyEncoding, TIdentityKubernetesAuthsUpdate } from "@app/db/schemas";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { getConfig } from "@app/lib/config/env";
import {
decryptSymmetric,
encryptSymmetric,
generateAsymmetricKeyPair,
generateSymmetricKey,
infisicalSymmetricDecrypt,
infisicalSymmetricEncypt
} from "@app/lib/crypto/encryption";
import { BadRequestError, UnauthorizedError } from "@app/lib/errors";
import { extractIPDetails, isValidIpOrCidr } from "@app/lib/ip";
import { TOrgBotDALFactory } from "@app/services/org/org-bot-dal";
import { AuthTokenType } from "../auth/auth-type";
import { TIdentityDALFactory } from "../identity/identity-dal";
import { TIdentityOrgDALFactory } from "../identity/identity-org-dal";
import { TIdentityAccessTokenDALFactory } from "../identity-access-token/identity-access-token-dal";
import { TIdentityAccessTokenJwtPayload } from "../identity-access-token/identity-access-token-types";
import { TIdentityKubernetesAuthDALFactory } from "./identity-kubernetes-auth-dal";
import { extractK8sUsername } from "./identity-kubernetes-auth-fns";
import {
TAttachKubernetesAuthDTO,
TCreateTokenReviewResponse,
TGetKubernetesAuthDTO,
TLoginKubernetesAuthDTO,
TUpdateKubernetesAuthDTO
} from "./identity-kubernetes-auth-types";
type TIdentityKubernetesAuthServiceFactoryDep = {
identityKubernetesAuthDAL: Pick<
TIdentityKubernetesAuthDALFactory,
"create" | "findOne" | "transaction" | "updateById"
>;
identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne" | "findById">;
identityDAL: Pick<TIdentityDALFactory, "updateById">;
orgBotDAL: Pick<TOrgBotDALFactory, "findOne" | "transaction" | "create">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
};
export type TIdentityKubernetesAuthServiceFactory = ReturnType<typeof identityKubernetesAuthServiceFactory>;
export const identityKubernetesAuthServiceFactory = ({
identityKubernetesAuthDAL,
identityOrgMembershipDAL,
identityAccessTokenDAL,
identityDAL,
orgBotDAL,
permissionService,
licenseService
}: TIdentityKubernetesAuthServiceFactoryDep) => {
const login = async ({ identityId, jwt: serviceAccountJwt }: TLoginKubernetesAuthDTO) => {
const identityKubernetesAuth = await identityKubernetesAuthDAL.findOne({ identityId });
if (!identityKubernetesAuth) throw new UnauthorizedError();
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({
identityId: identityKubernetesAuth.identityId
});
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
const orgBot = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId });
if (!orgBot) throw new BadRequestError({ message: "Org bot not found", name: "OrgBotNotFound" });
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
const { encryptedCaCert, caCertIV, caCertTag, encryptedTokenReviewerJwt, tokenReviewerJwtIV, tokenReviewerJwtTag } =
identityKubernetesAuth;
let caCert = "";
if (encryptedCaCert && caCertIV && caCertTag) {
caCert = decryptSymmetric({
ciphertext: encryptedCaCert,
iv: caCertIV,
tag: caCertTag,
key
});
}
let tokenReviewerJwt = "";
if (encryptedTokenReviewerJwt && tokenReviewerJwtIV && tokenReviewerJwtTag) {
tokenReviewerJwt = decryptSymmetric({
ciphertext: encryptedTokenReviewerJwt,
iv: tokenReviewerJwtIV,
tag: tokenReviewerJwtTag,
key
});
}
const { data }: { data: TCreateTokenReviewResponse } = await axios.post(
`${identityKubernetesAuth.kubernetesHost}/apis/authentication.k8s.io/v1/tokenreviews`,
{
apiVersion: "authentication.k8s.io/v1",
kind: "TokenReview",
spec: {
token: serviceAccountJwt
}
},
{
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${tokenReviewerJwt}`
},
httpsAgent: new https.Agent({
ca: caCert,
rejectUnauthorized: !!caCert
})
}
);
if ("error" in data.status) throw new UnauthorizedError({ message: data.status.error });
// check the response to determine if the token is valid
if (!(data.status && data.status.authenticated)) throw new UnauthorizedError();
const { namespace: targetNamespace, name: targetName } = extractK8sUsername(data.status.user.username);
if (identityKubernetesAuth.allowedNamespaces) {
// validate if [targetNamespace] is in the list of allowed namespaces
const isNamespaceAllowed = identityKubernetesAuth.allowedNamespaces
.split(",")
.map((namespace) => namespace.trim())
.some((namespace) => namespace === targetNamespace);
if (!isNamespaceAllowed) throw new UnauthorizedError();
}
if (identityKubernetesAuth.allowedNames) {
// validate if [targetName] is in the list of allowed names
const isNameAllowed = identityKubernetesAuth.allowedNames
.split(",")
.map((name) => name.trim())
.some((name) => name === targetName);
if (!isNameAllowed) throw new UnauthorizedError();
}
if (identityKubernetesAuth.allowedAudience) {
// validate if [audience] is in the list of allowed audiences
const isAudienceAllowed = data.status.audiences.some(
(audience) => audience === identityKubernetesAuth.allowedAudience
);
if (!isAudienceAllowed) throw new UnauthorizedError();
}
const identityAccessToken = await identityKubernetesAuthDAL.transaction(async (tx) => {
const newToken = await identityAccessTokenDAL.create(
{
identityId: identityKubernetesAuth.identityId,
isAccessTokenRevoked: false,
accessTokenTTL: identityKubernetesAuth.accessTokenTTL,
accessTokenMaxTTL: identityKubernetesAuth.accessTokenMaxTTL,
accessTokenNumUses: 0,
accessTokenNumUsesLimit: identityKubernetesAuth.accessTokenNumUsesLimit
},
tx
);
return newToken;
});
const appCfg = getConfig();
const accessToken = jwt.sign(
{
identityId: identityKubernetesAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
} as TIdentityAccessTokenJwtPayload,
appCfg.AUTH_SECRET,
{
expiresIn:
Number(identityAccessToken.accessTokenMaxTTL) === 0
? undefined
: Number(identityAccessToken.accessTokenMaxTTL)
}
);
return { accessToken, identityKubernetesAuth, identityAccessToken, identityMembershipOrg };
};
const attachKubernetesAuth = async ({
identityId,
kubernetesHost,
caCert,
tokenReviewerJwt,
allowedNamespaces,
allowedNames,
allowedAudience,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TAttachKubernetesAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity.authMethod)
throw new BadRequestError({
message: "Failed to add Kubernetes Auth to already configured identity"
});
if (accessTokenMaxTTL > 0 && accessTokenTTL > accessTokenMaxTTL) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const orgBot = await orgBotDAL.transaction(async (tx) => {
const doc = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId }, tx);
if (doc) return doc;
const { privateKey, publicKey } = generateAsymmetricKeyPair();
const key = generateSymmetricKey();
const {
ciphertext: encryptedPrivateKey,
iv: privateKeyIV,
tag: privateKeyTag,
encoding: privateKeyKeyEncoding,
algorithm: privateKeyAlgorithm
} = infisicalSymmetricEncypt(privateKey);
const {
ciphertext: encryptedSymmetricKey,
iv: symmetricKeyIV,
tag: symmetricKeyTag,
encoding: symmetricKeyKeyEncoding,
algorithm: symmetricKeyAlgorithm
} = infisicalSymmetricEncypt(key);
return orgBotDAL.create(
{
name: "Infisical org bot",
publicKey,
privateKeyIV,
encryptedPrivateKey,
symmetricKeyIV,
symmetricKeyTag,
encryptedSymmetricKey,
symmetricKeyAlgorithm,
orgId: identityMembershipOrg.orgId,
privateKeyTag,
privateKeyAlgorithm,
privateKeyKeyEncoding,
symmetricKeyKeyEncoding
},
tx
);
});
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
const { ciphertext: encryptedCaCert, iv: caCertIV, tag: caCertTag } = encryptSymmetric(caCert, key);
const {
ciphertext: encryptedTokenReviewerJwt,
iv: tokenReviewerJwtIV,
tag: tokenReviewerJwtTag
} = encryptSymmetric(tokenReviewerJwt, key);
const identityKubernetesAuth = await identityKubernetesAuthDAL.transaction(async (tx) => {
const doc = await identityKubernetesAuthDAL.create(
{
identityId: identityMembershipOrg.identityId,
kubernetesHost,
encryptedCaCert,
caCertIV,
caCertTag,
encryptedTokenReviewerJwt,
tokenReviewerJwtIV,
tokenReviewerJwtTag,
allowedNamespaces,
allowedNames,
allowedAudience,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: JSON.stringify(reformattedAccessTokenTrustedIps)
},
tx
);
await identityDAL.updateById(
identityMembershipOrg.identityId,
{
authMethod: IdentityAuthMethod.KUBERNETES_AUTH
},
tx
);
return doc;
});
return { ...identityKubernetesAuth, caCert, tokenReviewerJwt, orgId: identityMembershipOrg.orgId };
};
const updateKubernetesAuth = async ({
identityId,
kubernetesHost,
caCert,
tokenReviewerJwt,
allowedNamespaces,
allowedNames,
allowedAudience,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TUpdateKubernetesAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.KUBERNETES_AUTH)
throw new BadRequestError({
message: "Failed to update Kubernetes Auth"
});
const identityKubernetesAuth = await identityKubernetesAuthDAL.findOne({ identityId });
if (
(accessTokenMaxTTL || identityKubernetesAuth.accessTokenMaxTTL) > 0 &&
(accessTokenTTL || identityKubernetesAuth.accessTokenTTL) >
(accessTokenMaxTTL || identityKubernetesAuth.accessTokenMaxTTL)
) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps?.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const updateQuery: TIdentityKubernetesAuthsUpdate = {
kubernetesHost,
allowedNamespaces,
allowedNames,
allowedAudience,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: reformattedAccessTokenTrustedIps
? JSON.stringify(reformattedAccessTokenTrustedIps)
: undefined
};
const orgBot = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId });
if (!orgBot) throw new BadRequestError({ message: "Org bot not found", name: "OrgBotNotFound" });
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
if (caCert !== undefined) {
const { ciphertext: encryptedCACert, iv: caCertIV, tag: caCertTag } = encryptSymmetric(caCert, key);
updateQuery.encryptedCaCert = encryptedCACert;
updateQuery.caCertIV = caCertIV;
updateQuery.caCertTag = caCertTag;
}
if (tokenReviewerJwt !== undefined) {
const {
ciphertext: encryptedTokenReviewerJwt,
iv: tokenReviewerJwtIV,
tag: tokenReviewerJwtTag
} = encryptSymmetric(tokenReviewerJwt, key);
updateQuery.encryptedTokenReviewerJwt = encryptedTokenReviewerJwt;
updateQuery.tokenReviewerJwtIV = tokenReviewerJwtIV;
updateQuery.tokenReviewerJwtTag = tokenReviewerJwtTag;
}
const updatedKubernetesAuth = await identityKubernetesAuthDAL.updateById(identityKubernetesAuth.id, updateQuery);
return { ...updatedKubernetesAuth, orgId: identityMembershipOrg.orgId };
};
const getKubernetesAuth = async ({
identityId,
actorId,
actor,
actorAuthMethod,
actorOrgId
}: TGetKubernetesAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) throw new BadRequestError({ message: "Failed to find identity" });
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.KUBERNETES_AUTH)
throw new BadRequestError({
message: "The identity does not have Kubernetes Auth attached"
});
const identityKubernetesAuth = await identityKubernetesAuthDAL.findOne({ identityId });
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Identity);
const orgBot = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId });
if (!orgBot) throw new BadRequestError({ message: "Org bot not found", name: "OrgBotNotFound" });
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
const { encryptedCaCert, caCertIV, caCertTag, encryptedTokenReviewerJwt, tokenReviewerJwtIV, tokenReviewerJwtTag } =
identityKubernetesAuth;
let caCert = "";
if (encryptedCaCert && caCertIV && caCertTag) {
caCert = decryptSymmetric({
ciphertext: encryptedCaCert,
iv: caCertIV,
tag: caCertTag,
key
});
}
let tokenReviewerJwt = "";
if (encryptedTokenReviewerJwt && tokenReviewerJwtIV && tokenReviewerJwtTag) {
tokenReviewerJwt = decryptSymmetric({
ciphertext: encryptedTokenReviewerJwt,
iv: tokenReviewerJwtIV,
tag: tokenReviewerJwtTag,
key
});
}
return { ...identityKubernetesAuth, caCert, tokenReviewerJwt, orgId: identityMembershipOrg.orgId };
};
return {
login,
attachKubernetesAuth,
updateKubernetesAuth,
getKubernetesAuth
};
};

@ -0,0 +1,61 @@
import { TProjectPermission } from "@app/lib/types";
export type TLoginKubernetesAuthDTO = {
identityId: string;
jwt: string;
};
export type TAttachKubernetesAuthDTO = {
identityId: string;
kubernetesHost: string;
caCert: string;
tokenReviewerJwt: string;
allowedNamespaces: string;
allowedNames: string;
allowedAudience: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TUpdateKubernetesAuthDTO = {
identityId: string;
kubernetesHost?: string;
caCert?: string;
tokenReviewerJwt?: string;
allowedNamespaces?: string;
allowedNames?: string;
allowedAudience?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TGetKubernetesAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;
type TCreateTokenReviewSuccessResponse = {
authenticated: true;
user: {
username: string;
uid: string;
groups: string[];
};
audiences: string[];
};
type TCreateTokenReviewErrorResponse = {
error: string;
};
export type TCreateTokenReviewResponse = {
apiVersion: "authentication.k8s.io/v1";
kind: "TokenReview";
spec: {
token: string;
};
status: TCreateTokenReviewSuccessResponse | TCreateTokenReviewErrorResponse;
};
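
A sample of the two TokenReview outcomes the union above models (all field values are illustrative):

// Successful review: login reads user.username and audiences from status.
const authenticatedReview: TCreateTokenReviewResponse = {
  apiVersion: "authentication.k8s.io/v1",
  kind: "TokenReview",
  spec: { token: "<service-account-jwt>" },
  status: {
    authenticated: true,
    user: {
      username: "system:serviceaccount:default:infisical-auth",
      uid: "b8f3a1c2-0000-0000-0000-000000000000",
      groups: ["system:serviceaccounts", "system:serviceaccounts:default"]
    },
    audiences: ["https://kubernetes.default.svc"]
  }
};

// Failed review: login surfaces status.error as an UnauthorizedError.
const rejectedReview: TCreateTokenReviewResponse = {
  apiVersion: "authentication.k8s.io/v1",
  kind: "TokenReview",
  spec: { token: "<expired-jwt>" },
  status: { error: "token has expired" }
};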

@ -10,11 +10,16 @@ export type TIdentityProjectDALFactory = ReturnType<typeof identityProjectDALFac
export const identityProjectDALFactory = (db: TDbClient) => {
const identityProjectOrm = ormify(db, TableName.IdentityProjectMembership);
const findByProjectId = async (projectId: string, tx?: Knex) => {
const findByProjectId = async (projectId: string, filter: { identityId?: string } = {}, tx?: Knex) => {
try {
const docs = await (tx || db)(TableName.IdentityProjectMembership)
.where(`${TableName.IdentityProjectMembership}.projectId`, projectId)
.join(TableName.Identity, `${TableName.IdentityProjectMembership}.identityId`, `${TableName.Identity}.id`)
.where((qb) => {
if (filter.identityId) {
void qb.where("identityId", filter.identityId);
}
})
.join(
TableName.IdentityProjectMembershipRole,
`${TableName.IdentityProjectMembershipRole}.projectMembershipId`,

@ -18,6 +18,7 @@ import { TIdentityProjectMembershipRoleDALFactory } from "./identity-project-mem
import {
TCreateProjectIdentityDTO,
TDeleteProjectIdentityDTO,
TGetProjectIdentityByIdentityIdDTO,
TListProjectIdentityDTO,
TUpdateProjectIdentityDTO
} from "./identity-project-types";
@ -51,7 +52,7 @@ export const identityProjectServiceFactory = ({
actorOrgId,
actorAuthMethod,
projectId,
role
roles
}: TCreateProjectIdentityDTO) => {
const { permission } = await permissionService.getProjectPermission(
actor,
@ -78,17 +79,33 @@ export const identityProjectServiceFactory = ({
message: `Failed to find identity with id ${identityId}`
});
const { permission: rolePermission, role: customRole } = await permissionService.getProjectPermissionByRole(
role,
project.id
);
const hasPriviledge = isAtLeastAsPrivileged(permission, rolePermission);
if (!hasPriviledge)
throw new ForbiddenRequestError({
message: "Failed to add identity to project with more privileged role"
});
const isCustomRole = Boolean(customRole);
for await (const { role: requestedRoleChange } of roles) {
const { permission: rolePermission } = await permissionService.getProjectPermissionByRole(
requestedRoleChange,
projectId
);
const hasRequiredPriviledges = isAtLeastAsPrivileged(permission, rolePermission);
if (!hasRequiredPriviledges) {
throw new ForbiddenRequestError({ message: "Failed to change to a more privileged role" });
}
}
// validate custom roles input
const customInputRoles = roles.filter(
({ role }) => !Object.values(ProjectMembershipRole).includes(role as ProjectMembershipRole)
);
const hasCustomRole = Boolean(customInputRoles.length);
const customRoles = hasCustomRole
? await projectRoleDAL.find({
projectId,
$in: { slug: customInputRoles.map(({ role }) => role) }
})
: [];
if (customRoles.length !== customInputRoles.length) throw new BadRequestError({ message: "Custom role not found" });
const customRolesGroupBySlug = groupBy(customRoles, ({ slug }) => slug);
const projectIdentity = await identityProjectDAL.transaction(async (tx) => {
const identityProjectMembership = await identityProjectDAL.create(
{
@ -97,16 +114,32 @@ export const identityProjectServiceFactory = ({
},
tx
);
const sanitizedProjectMembershipRoles = roles.map((inputRole) => {
const isCustomRole = Boolean(customRolesGroupBySlug?.[inputRole.role]?.[0]);
if (!inputRole.isTemporary) {
return {
projectMembershipId: identityProjectMembership.id,
role: isCustomRole ? ProjectMembershipRole.Custom : inputRole.role,
customRoleId: customRolesGroupBySlug[inputRole.role] ? customRolesGroupBySlug[inputRole.role][0].id : null
};
}
// check cron or relative here later; for now it's just relative
const relativeTimeInMs = ms(inputRole.temporaryRange);
return {
projectMembershipId: identityProjectMembership.id,
role: isCustomRole ? ProjectMembershipRole.Custom : inputRole.role,
customRoleId: customRolesGroupBySlug[inputRole.role] ? customRolesGroupBySlug[inputRole.role][0].id : null,
isTemporary: true,
temporaryMode: ProjectUserMembershipTemporaryMode.Relative,
temporaryRange: inputRole.temporaryRange,
temporaryAccessStartTime: new Date(inputRole.temporaryAccessStartTime),
temporaryAccessEndTime: new Date(new Date(inputRole.temporaryAccessStartTime).getTime() + relativeTimeInMs)
};
});
const identityRoles = await identityProjectMembershipRoleDAL.insertMany(sanitizedProjectMembershipRoles, tx);
return { ...identityProjectMembership, roles: identityRoles };
});
return projectIdentity;
};
@ -135,16 +168,18 @@ export const identityProjectServiceFactory = ({
message: `Identity with id ${identityId} doesn't exist in project with id ${projectId}`
});
const { permission: identityRolePermission } = await permissionService.getProjectPermission(
ActorType.IDENTITY,
projectIdentity.identityId,
projectIdentity.projectId,
actorAuthMethod,
actorOrgId
);
const hasRequiredPriviledges = isAtLeastAsPrivileged(permission, identityRolePermission);
if (!hasRequiredPriviledges)
throw new ForbiddenRequestError({ message: "Failed to delete more privileged identity" });
for await (const { role: requestedRoleChange } of roles) {
const { permission: rolePermission } = await permissionService.getProjectPermissionByRole(
requestedRoleChange,
projectId
);
const hasRequiredPriviledges = isAtLeastAsPrivileged(permission, rolePermission);
if (!hasRequiredPriviledges) {
throw new ForbiddenRequestError({ message: "Failed to change to a more privileged role" });
}
}
// validate custom roles input
const customInputRoles = roles.filter(
@ -248,10 +283,33 @@ export const identityProjectServiceFactory = ({
return identityMemberships;
};
const getProjectIdentityByIdentityId = async ({
projectId,
actor,
actorId,
actorAuthMethod,
actorOrgId,
identityId
}: TGetProjectIdentityByIdentityIdDTO) => {
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Identity);
const [identityMembership] = await identityProjectDAL.findByProjectId(projectId, { identityId });
if (!identityMembership) throw new BadRequestError({ message: `Membership not found for identity ${identityId}` });
return identityMembership;
};
return {
createProjectIdentity,
updateProjectIdentity,
deleteProjectIdentity,
listProjectIdentities
listProjectIdentities,
getProjectIdentityByIdentityId
};
};

@ -4,7 +4,19 @@ import { ProjectUserMembershipTemporaryMode } from "../project-membership/projec
export type TCreateProjectIdentityDTO = {
identityId: string;
role: string;
roles: (
| {
role: string;
isTemporary?: false;
}
| {
role: string;
isTemporary: true;
temporaryMode: ProjectUserMembershipTemporaryMode.Relative;
temporaryRange: string;
temporaryAccessStartTime: string;
}
)[];
} & TProjectPermission;
export type TUpdateProjectIdentityDTO = {
@ -29,3 +41,7 @@ export type TDeleteProjectIdentityDTO = {
} & TProjectPermission;
export type TListProjectIdentityDTO = TProjectPermission;
export type TGetProjectIdentityByIdentityIdDTO = {
identityId: string;
} & TProjectPermission;

@ -52,7 +52,7 @@ export const identityUaServiceFactory = ({
}: TIdentityUaServiceFactoryDep) => {
const login = async (clientId: string, clientSecret: string, ip: string) => {
const identityUa = await identityUaDAL.findOne({ clientId });
if (!identityUa) throw new UnauthorizedError();
if (!identityUa) throw new UnauthorizedError({ message: "Invalid credentials" });
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId: identityUa.identityId });
@ -68,7 +68,7 @@ export const identityUaServiceFactory = ({
const validClientSecretInfo = clientSecrtInfo.find(({ clientSecretHash }) =>
bcrypt.compareSync(clientSecret, clientSecretHash)
);
if (!validClientSecretInfo) throw new UnauthorizedError();
if (!validClientSecretInfo) throw new UnauthorizedError({ message: "Invalid credentials" });
const { clientSecretTTL, clientSecretNumUses, clientSecretNumUsesLimit } = validClientSecretInfo;
if (Number(clientSecretTTL) > 0) {

@ -9,9 +9,12 @@
import {
CreateSecretCommand,
DescribeSecretCommand,
GetSecretValueCommand,
ResourceNotFoundException,
SecretsManagerClient,
TagResourceCommand,
UntagResourceCommand,
UpdateSecretCommand
} from "@aws-sdk/client-secrets-manager";
import { Octokit } from "@octokit/rest";
@ -459,27 +462,39 @@ const syncSecretsAWSParameterStore = async ({
ssm.config.update(config);
const metadata = z.record(z.any()).parse(integration.metadata || {});
const awsParameterStoreSecretsObj: Record<string, AWS.SSM.Parameter> = {};
// now fetch all aws parameter store secrets
let hasNext = true;
let nextToken: string | undefined;
while (hasNext) {
const parameters = await ssm
.getParametersByPath({
Path: integration.path as string,
Recursive: false,
WithDecryption: true,
MaxResults: 10,
NextToken: nextToken
})
.promise();
if (parameters.Parameters) {
parameters.Parameters.forEach((parameter) => {
if (parameter.Name) {
const secKey = parameter.Name.substring((integration.path as string).length);
awsParameterStoreSecretsObj[secKey] = parameter;
}
});
}
hasNext = Boolean(parameters.NextToken);
nextToken = parameters.NextToken;
}
// Identify secrets to create
// don't use Promise.all() and promise map here
// it will cause rate limit
for (const key in secrets) {
if (Object.hasOwn(secrets, key)) {
if (!(key in awsParameterStoreSecretsObj)) {
// case: secret does not exist in AWS parameter store
// -> create secret
@ -514,23 +529,31 @@ const syncSecretsAWSParameterStore = async ({
})
.promise();
}
}
}
// Identify secrets to delete
if (!metadata.shouldDisableDelete) {
for (const key in awsParameterStoreSecretsObj) {
if (Object.hasOwn(awsParameterStoreSecretsObj, key)) {
if (!(key in secrets)) {
// case:
// -> delete secret
await ssm
.deleteParameter({
Name: awsParameterStoreSecretsObj[key].Name as string
})
.promise();
}
await new Promise((resolve) => {
setTimeout(resolve, 50);
});
}
}
}
};
/**
@ -572,6 +595,7 @@ const syncSecretsAWSSecretManager = async ({
if (awsSecretManagerSecret?.SecretString) {
awsSecretManagerSecretObj = JSON.parse(awsSecretManagerSecret.SecretString);
}
if (!isEqual(awsSecretManagerSecretObj, secKeyVal)) {
await secretsManager.send(
new UpdateSecretCommand({
@ -580,7 +604,88 @@ const syncSecretsAWSSecretManager = async ({
})
);
}
const secretAWSTag = metadata.secretAWSTag as { key: string; value: string }[] | undefined;
if (secretAWSTag && secretAWSTag.length) {
const describedSecret = await secretsManager.send(
// requires secretsmanager:DescribeSecret policy
new DescribeSecretCommand({
SecretId: integration.app as string
})
);
if (!describedSecret.Tags) return;
const integrationTagObj = secretAWSTag.reduce(
(acc, item) => {
acc[item.key] = item.value;
return acc;
},
{} as Record<string, string>
);
const awsTagObj = (describedSecret.Tags || []).reduce(
(acc, item) => {
if (item.Key && item.Value) {
acc[item.Key] = item.Value;
}
return acc;
},
{} as Record<string, string>
);
const tagsToUpdate: { Key: string; Value: string }[] = [];
const tagsToDelete: { Key: string; Value: string }[] = [];
describedSecret.Tags?.forEach((tag) => {
if (tag.Key && tag.Value) {
if (!(tag.Key in integrationTagObj)) {
// delete tag from AWS secret manager
tagsToDelete.push({
Key: tag.Key,
Value: tag.Value
});
} else if (tag.Value !== integrationTagObj[tag.Key]) {
// update tag in AWS secret manager
tagsToUpdate.push({
Key: tag.Key,
Value: integrationTagObj[tag.Key]
});
}
}
});
secretAWSTag?.forEach((tag) => {
if (!(tag.key in awsTagObj)) {
// create tag in AWS secret manager
tagsToUpdate.push({
Key: tag.key,
Value: tag.value
});
}
});
if (tagsToUpdate.length) {
await secretsManager.send(
new TagResourceCommand({
SecretId: integration.app as string,
Tags: tagsToUpdate
})
);
}
if (tagsToDelete.length) {
await secretsManager.send(
new UntagResourceCommand({
SecretId: integration.app as string,
TagKeys: tagsToDelete.map((tag) => tag.Key)
})
);
}
}
} catch (err) {
// case when AWS manager can't find the specified secret
if (err instanceof ResourceNotFoundException && secretsManager) {
await secretsManager.send(
new CreateSecretCommand({
@ -2860,7 +2965,7 @@ const syncSecretsDigitalOceanAppPlatform = async ({
spec: {
name: integration.app,
...appSettings,
envs: Object.entries(secrets).map(([key, data]) => ({ key, value: data.value }))
envs: Object.entries(secrets).map(([key, data]) => ({ key, value: data.value, type: "SECRET" }))
}
},
{

@ -9,7 +9,12 @@ import { TIntegrationAuthDALFactory } from "../integration-auth/integration-auth
import { TSecretQueueFactory } from "../secret/secret-queue";
import { TSecretFolderDALFactory } from "../secret-folder/secret-folder-dal";
import { TIntegrationDALFactory } from "./integration-dal";
import { TCreateIntegrationDTO, TDeleteIntegrationDTO, TUpdateIntegrationDTO } from "./integration-types";
import {
TCreateIntegrationDTO,
TDeleteIntegrationDTO,
TSyncIntegrationDTO,
TUpdateIntegrationDTO
} from "./integration-types";
type TIntegrationServiceFactoryDep = {
integrationDAL: TIntegrationDALFactory;
@ -103,7 +108,8 @@ export const integrationServiceFactory = ({
owner,
isActive,
environment,
secretPath
secretPath,
metadata
}: TUpdateIntegrationDTO) => {
const integration = await integrationDAL.findById(id);
if (!integration) throw new BadRequestError({ message: "Integration auth not found" });
@ -127,7 +133,17 @@ export const integrationServiceFactory = ({
appId,
targetEnvironment,
owner,
secretPath
secretPath,
metadata: {
...(integration.metadata as object),
...metadata
}
});
await secretQueueService.syncIntegrations({
environment: folder.environment.slug,
secretPath,
projectId: folder.projectId
});
return updatedIntegration;
@ -190,10 +206,35 @@ export const integrationServiceFactory = ({
return integrations;
};
const syncIntegration = async ({ id, actorId, actor, actorOrgId, actorAuthMethod }: TSyncIntegrationDTO) => {
const integration = await integrationDAL.findById(id);
if (!integration) {
throw new BadRequestError({ message: "Integration not found" });
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
integration.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
await secretQueueService.syncIntegrations({
environment: integration.environment.slug,
secretPath: integration.secretPath,
projectId: integration.projectId
});
return { ...integration, envId: integration.environment.id };
};
return {
createIntegration,
updateIntegration,
deleteIntegration,
listIntegrationByProject
listIntegrationByProject,
syncIntegration
};
};

@ -27,20 +27,39 @@ export type TCreateIntegrationDTO = {
value: string;
}[];
kmsKeyId?: string;
shouldDisableDelete?: boolean;
};
} & Omit<TProjectPermission, "projectId">;
export type TUpdateIntegrationDTO = {
id: string;
app: string;
appId: string;
app?: string;
appId?: string;
isActive?: boolean;
secretPath: string;
targetEnvironment: string;
owner: string;
environment: string;
metadata?: {
secretPrefix?: string;
secretSuffix?: string;
secretGCPLabel?: {
labelName: string;
labelValue: string;
};
secretAWSTag?: {
key: string;
value: string;
}[];
kmsKeyId?: string;
shouldDisableDelete?: boolean;
};
} & Omit<TProjectPermission, "projectId">;
export type TDeleteIntegrationDTO = {
id: string;
} & Omit<TProjectPermission, "projectId">;
export type TSyncIntegrationDTO = {
id: string;
} & Omit<TProjectPermission, "projectId">;

@ -546,6 +546,10 @@ export const orgServiceFactory = ({
code
});
await userDAL.updateById(user.id, {
isEmailVerified: true
});
if (user.isAccepted) {
// this means user has already completed signup process
// isAccepted is set true when keys are exchanged

@ -9,11 +9,19 @@ export const projectMembershipDALFactory = (db: TDbClient) => {
const projectMemberOrm = ormify(db, TableName.ProjectMembership);
// special query
const findAllProjectMembers = async (projectId: string) => {
const findAllProjectMembers = async (projectId: string, filter: { usernames?: string[]; username?: string } = {}) => {
try {
const docs = await db(TableName.ProjectMembership)
.where({ [`${TableName.ProjectMembership}.projectId` as "projectId"]: projectId })
.join(TableName.Users, `${TableName.ProjectMembership}.userId`, `${TableName.Users}.id`)
.where((qb) => {
if (filter.usernames) {
void qb.whereIn("username", filter.usernames);
}
if (filter.username) {
void qb.where("username", filter.username);
}
})
.join<TUserEncryptionKeys>(
TableName.UserEncryptionKey,
`${TableName.UserEncryptionKey}.userId`,

@ -34,6 +34,7 @@ import {
TAddUsersToWorkspaceNonE2EEDTO,
TDeleteProjectMembershipOldDTO,
TDeleteProjectMembershipsDTO,
TGetProjectMembershipByUsernameDTO,
TGetProjectMembershipDTO,
TUpdateProjectMembershipDTO
} from "./project-membership-types";
@ -89,6 +90,28 @@ export const projectMembershipServiceFactory = ({
return projectMembershipDAL.findAllProjectMembers(projectId);
};
const getProjectMembershipByUsername = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
projectId,
username
}: TGetProjectMembershipByUsernameDTO) => {
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Member);
const [membership] = await projectMembershipDAL.findAllProjectMembers(projectId, { username });
if (!membership) throw new BadRequestError({ message: `Project membership not found for user ${username}` });
return membership;
};
const addUsersToProject = async ({
projectId,
actorId,
@ -510,6 +533,7 @@ export const projectMembershipServiceFactory = ({
return {
getProjectMemberships,
getProjectMembershipByUsername,
updateProjectMembership,
addUsersToProjectNonE2EE,
deleteProjectMemberships,

@ -9,6 +9,10 @@ export type TInviteUserToProjectDTO = {
emails: string[];
} & TProjectPermission;
export type TGetProjectMembershipByUsernameDTO = {
username: string;
} & TProjectPermission;
export type TUpdateProjectMembershipDTO = {
membershipId: string;
roles: (

@ -8,9 +8,16 @@ import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services
import { TSecretSnapshotServiceFactory } from "@app/ee/services/secret-snapshot/secret-snapshot-service";
import { BadRequestError } from "@app/lib/errors";
import { TProjectDALFactory } from "../project/project-dal";
import { TProjectEnvDALFactory } from "../project-env/project-env-dal";
import { TSecretFolderDALFactory } from "./secret-folder-dal";
import { TCreateFolderDTO, TDeleteFolderDTO, TGetFolderDTO, TUpdateFolderDTO } from "./secret-folder-types";
import {
TCreateFolderDTO,
TDeleteFolderDTO,
TGetFolderDTO,
TUpdateFolderDTO,
TUpdateManyFoldersDTO
} from "./secret-folder-types";
import { TSecretFolderVersionDALFactory } from "./secret-folder-version-dal";
type TSecretFolderServiceFactoryDep = {
@ -19,6 +26,7 @@ type TSecretFolderServiceFactoryDep = {
folderDAL: TSecretFolderDALFactory;
projectEnvDAL: Pick<TProjectEnvDALFactory, "findOne">;
folderVersionDAL: TSecretFolderVersionDALFactory;
projectDAL: Pick<TProjectDALFactory, "findProjectBySlug">;
};
export type TSecretFolderServiceFactory = ReturnType<typeof secretFolderServiceFactory>;
@ -28,7 +36,8 @@ export const secretFolderServiceFactory = ({
snapshotService,
permissionService,
projectEnvDAL,
folderVersionDAL
folderVersionDAL,
projectDAL
}: TSecretFolderServiceFactoryDep) => {
const createFolder = async ({
projectId,
@ -116,6 +125,105 @@ export const secretFolderServiceFactory = ({
return folder;
};
const updateManyFolders = async ({
actor,
actorId,
projectSlug,
actorAuthMethod,
actorOrgId,
folders
}: TUpdateManyFoldersDTO) => {
const project = await projectDAL.findProjectBySlug(projectSlug, actorOrgId);
if (!project) {
throw new BadRequestError({ message: "Project not found" });
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
project.id,
actorAuthMethod,
actorOrgId
);
folders.forEach(({ environment, path: secretPath }) => {
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Edit,
subject(ProjectPermissionSub.Secrets, { environment, secretPath })
);
});
const result = await folderDAL.transaction(async (tx) =>
Promise.all(
folders.map(async (newFolder) => {
const { environment, path: secretPath, id, name } = newFolder;
const parentFolder = await folderDAL.findBySecretPath(project.id, environment, secretPath);
if (!parentFolder) {
throw new BadRequestError({ message: "Secret path not found", name: "Batch update folder" });
}
const env = await projectEnvDAL.findOne({ projectId: project.id, slug: environment });
if (!env) {
throw new BadRequestError({ message: "Environment not found", name: "Batch update folder" });
}
const folder = await folderDAL
.findOne({ envId: env.id, id, parentId: parentFolder.id })
// the folder API now accepts id-based changes
// this is kept for CLI backward compatibility; once the CLI drops it, we will remove this logic
.catch(() => folderDAL.findOne({ envId: env.id, name: id, parentId: parentFolder.id }));
if (!folder) {
throw new BadRequestError({ message: "Folder not found" });
}
if (name !== folder.name) {
// ensure that new folder name is unique
const folderToCheck = await folderDAL.findOne({
name,
envId: env.id,
parentId: parentFolder.id
});
if (folderToCheck) {
throw new BadRequestError({
message: "Folder with specified name already exists",
name: "Batch update folder"
});
}
}
const [doc] = await folderDAL.update(
{ envId: env.id, id: folder.id, parentId: parentFolder.id },
{ name },
tx
);
await folderVersionDAL.create(
{
name: doc.name,
envId: doc.envId,
version: doc.version,
folderId: doc.id
},
tx
);
if (!doc) {
throw new BadRequestError({ message: "Folder not found", name: "Batch update folder" });
}
return { oldFolder: folder, newFolder: doc };
})
)
);
await Promise.all(result.map(async (res) => snapshotService.performSnapshot(res.newFolder.parentId as string)));
return {
projectId: project.id,
newFolders: result.map((res) => res.newFolder),
oldFolders: result.map((res) => res.oldFolder)
};
};
const updateFolder = async ({
projectId,
actor,
@ -151,6 +259,21 @@ export const secretFolderServiceFactory = ({
.catch(() => folderDAL.findOne({ envId: env.id, name: id, parentId: parentFolder.id }));
if (!folder) throw new BadRequestError({ message: "Folder not found" });
if (name !== folder.name) {
// ensure that new folder name is unique
const folderToCheck = await folderDAL.findOne({
name,
envId: env.id,
parentId: parentFolder.id
});
if (folderToCheck) {
throw new BadRequestError({
message: "Folder with specified name already exists",
name: "Update folder"
});
}
}
const newFolder = await folderDAL.transaction(async (tx) => {
const [doc] = await folderDAL.update({ envId: env.id, id: folder.id, parentId: parentFolder.id }, { name }, tx);
@ -239,6 +362,7 @@ export const secretFolderServiceFactory = ({
return {
createFolder,
updateFolder,
updateManyFolders,
deleteFolder,
getFolders
};

@ -13,6 +13,16 @@ export type TUpdateFolderDTO = {
name: string;
} & TProjectPermission;
export type TUpdateManyFoldersDTO = {
projectSlug: string;
folders: {
environment: string;
path: string;
id: string;
name: string;
}[];
} & Omit<TProjectPermission, "projectId">;
export type TDeleteFolderDTO = {
environment: string;
path: string;

@ -243,6 +243,74 @@ export const secretDALFactory = (db: TDbClient) => {
}
};
const upsertSecretReferences = async (
data: {
secretId: string;
references: Array<{ environment: string; secretPath: string }>;
}[] = [],
tx?: Knex
) => {
try {
if (!data.length) return;
await (tx || db)(TableName.SecretReference)
.whereIn(
"secretId",
data.map(({ secretId }) => secretId)
)
.delete();
const newSecretReferences = data
.filter(({ references }) => references.length)
.flatMap(({ secretId, references }) =>
references.map(({ environment, secretPath }) => ({
secretPath,
secretId,
environment
}))
);
if (!newSecretReferences.length) return;
const secretReferences = await (tx || db)(TableName.SecretReference).insert(newSecretReferences);
return secretReferences;
} catch (error) {
throw new DatabaseError({ error, name: "UpsertSecretReference" });
}
};
const findReferencedSecretReferences = async (projectId: string, envSlug: string, secretPath: string, tx?: Knex) => {
try {
const docs = await (tx || db)(TableName.SecretReference)
.where({
secretPath,
environment: envSlug
})
.join(TableName.Secret, `${TableName.Secret}.id`, `${TableName.SecretReference}.secretId`)
.join(TableName.SecretFolder, `${TableName.Secret}.folderId`, `${TableName.SecretFolder}.id`)
.join(TableName.Environment, `${TableName.SecretFolder}.envId`, `${TableName.Environment}.id`)
.where("projectId", projectId)
.select(selectAllTableCols(TableName.SecretReference))
.select("folderId");
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindReferencedSecretReferences" });
}
};
// special query to backfill secret value
const findAllProjectSecretValues = async (projectId: string, tx?: Knex) => {
try {
const docs = await (tx || db)(TableName.Secret)
.join(TableName.SecretFolder, `${TableName.Secret}.folderId`, `${TableName.SecretFolder}.id`)
.join(TableName.Environment, `${TableName.SecretFolder}.envId`, `${TableName.Environment}.id`)
.where("projectId", projectId)
// not empty
.whereNotNull("secretValueCiphertext")
.select("secretValueTag", "secretValueCiphertext", "secretValueIV", `${TableName.Secret}.id` as "id");
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindAllProjectSecretValues" });
}
};
return {
...secretOrm,
update,
@ -252,6 +320,9 @@ export const secretDALFactory = (db: TDbClient) => {
getSecretTags,
findByFolderId,
findByFolderIds,
findByBlindIndexes
findByBlindIndexes,
upsertSecretReferences,
findReferencedSecretReferences,
findAllProjectSecretValues
};
};

@ -194,6 +194,7 @@ type TInterpolateSecretArg = {
folderDAL: Pick<TSecretFolderDALFactory, "findBySecretPath">;
};
const INTERPOLATION_SYNTAX_REG = /\${([^}]+)}/g;
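// matches ${...} references inside secret values; the capture group holds the dot-delimited path, e.g. "dev.folder.KEY"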
export const interpolateSecrets = ({ projectId, secretEncKey, secretDAL, folderDAL }: TInterpolateSecretArg) => {
const fetchSecretsCrossEnv = () => {
const fetchCache: Record<string, Record<string, string>> = {};
@ -235,7 +236,6 @@ export const interpolateSecrets = ({ projectId, secretEncKey, secretDAL, folderD
};
};
const INTERPOLATION_SYNTAX_REG = /\${([^}]+)}/g;
const recursivelyExpandSecret = async (
expandedSec: Record<string, string>,
interpolatedSec: Record<string, string>,
@ -353,7 +353,7 @@ export const interpolateSecrets = ({ projectId, secretEncKey, secretDAL, folderD
};
export const decryptSecretRaw = (
secret: TSecrets & { workspace: string; environment: string; secretPath?: string },
secret: TSecrets & { workspace: string; environment: string; secretPath: string },
key: string
) => {
const secretKey = decryptSymmetric128BitHexKeyUTF8({
@ -396,6 +396,37 @@ export const decryptSecretRaw = (
};
};
/**
* Grabs and processes nested secret references from a string
*
* This function looks for patterns that match the interpolation syntax in the input string.
* It filters out references that include nested paths, splits them into environment and
* secret path parts, and then returns an array of objects with the environment and the
* joined secret path.
*
* @param {string} maybeSecretReference - The string that has the potential secret references.
* @returns {Array<{ environment: string, secretPath: string }>} - An array of objects
* with the environment and joined secret path.
*
* @example
* const value = "Hello ${dev.someFolder.OtherFolder.SECRET_NAME} and ${prod.anotherFolder.SECRET_NAME}";
* const result = getAllNestedSecretReferences(value);
* // result will be:
* // [
* // { environment: 'dev', secretPath: '/someFolder/OtherFolder' },
* // { environment: 'prod', secretPath: '/anotherFolder' }
* // ]
*/
export const getAllNestedSecretReferences = (maybeSecretReference: string) => {
const references = Array.from(maybeSecretReference.matchAll(INTERPOLATION_SYNTAX_REG), (m) => m[1]);
return references
.filter((el) => el.includes("."))
.map((el) => {
const [environment, ...secretPathList] = el.split(".");
return { environment, secretPath: path.join("/", ...secretPathList.slice(0, -1)) };
});
};
/**
* Checks and handles secrets using a blind index method.
* The function generates mappings between secret names and their blind indexes, validates user IDs for personal secrets, and retrieves secrets from the database based on their blind indexes.
@ -467,7 +498,7 @@ export const fnSecretBulkInsert = async ({
tx
}: TFnSecretBulkInsert) => {
const newSecrets = await secretDAL.insertMany(
inputSecrets.map(({ tags, ...el }) => ({ ...el, folderId })),
inputSecrets.map(({ tags, references, ...el }) => ({ ...el, folderId })),
tx
);
const newSecretGroupByBlindIndex = groupBy(newSecrets, (item) => item.secretBlindIndex as string);
@ -478,13 +509,20 @@ export const fnSecretBulkInsert = async ({
}))
);
const secretVersions = await secretVersionDAL.insertMany(
inputSecrets.map(({ tags, ...el }) => ({
inputSecrets.map(({ tags, references, ...el }) => ({
...el,
folderId,
secretId: newSecretGroupByBlindIndex[el.secretBlindIndex as string][0].id
})),
tx
);
await secretDAL.upsertSecretReferences(
inputSecrets.map(({ references = [], secretBlindIndex }) => ({
secretId: newSecretGroupByBlindIndex[secretBlindIndex as string][0].id,
references
})),
tx
);
if (newSecretTags.length) {
const secTags = await secretTagDAL.saveTagsToSecret(newSecretTags, tx);
const secVersionsGroupBySecId = groupBy(secretVersions, (i) => i.secretId);
@ -509,7 +547,7 @@ export const fnSecretBulkUpdate = async ({
secretVersionTagDAL
}: TFnSecretBulkUpdate) => {
const newSecrets = await secretDAL.bulkUpdate(
inputSecrets.map(({ filter, data: { tags, ...data } }) => ({
inputSecrets.map(({ filter, data: { tags, references, ...data } }) => ({
filter: { ...filter, folderId },
data
})),
@ -522,6 +560,15 @@ export const fnSecretBulkUpdate = async ({
})),
tx
);
await secretDAL.upsertSecretReferences(
inputSecrets
.filter(({ data: { references } }) => Boolean(references))
.map(({ data: { references = [] } }, i) => ({
secretId: newSecrets[i].id,
references
})),
tx
);
const secsUpdatedTag = inputSecrets.flatMap(({ data: { tags } }, i) =>
tags !== undefined ? { tags, secretId: newSecrets[i].id } : []
);
@ -591,50 +638,39 @@ export const createManySecretsRawFnFactory = ({
folderId,
isNew: true,
blindIndexCfg,
userId,
secretDAL
});
const inputSecrets = secrets.map((secret) => {
const secretKeyEncrypted = encryptSymmetric128BitHexKeyUTF8(secret.secretName, botKey);
const secretValueEncrypted = encryptSymmetric128BitHexKeyUTF8(secret.secretValue || "", botKey);
const secretReferences = getAllNestedSecretReferences(secret.secretValue || "");
const secretCommentEncrypted = encryptSymmetric128BitHexKeyUTF8(secret.secretComment || "", botKey);
return {
type: secret.type,
userId: secret.type === SecretType.Personal ? userId : null,
secretName: secret.secretName,
secretKeyCiphertext: secretKeyEncrypted.ciphertext,
secretKeyIV: secretKeyEncrypted.iv,
secretKeyTag: secretKeyEncrypted.tag,
secretValueCiphertext: secretValueEncrypted.ciphertext,
secretValueIV: secretValueEncrypted.iv,
secretValueTag: secretValueEncrypted.tag,
secretCommentCiphertext: secretCommentEncrypted.ciphertext,
secretCommentIV: secretCommentEncrypted.iv,
secretCommentTag: secretCommentEncrypted.tag,
skipMultilineEncoding: secret.skipMultilineEncoding,
tags: secret.tags,
references: secretReferences
};
});
// get all tags
const tagIds = inputSecrets.flatMap(({ tags = [] }) => tags);
const tags = tagIds.length ? await secretTagDAL.findManyTagsById(projectId, tagIds) : [];
if (tags.length !== tagIds.length) throw new BadRequestError({ message: "Tag not found" });
const newSecrets = await secretDAL.transaction(async (tx) =>
fnSecretBulkInsert({
@ -703,56 +739,35 @@ export const updateManySecretsRawFnFactory = ({
userId
});
const inputSecrets = secrets.map((secret) => {
if (secret.newSecretName === "") {
throw new BadRequestError({ message: "New secret name cannot be empty" });
}
const secretKeyEncrypted = encryptSymmetric128BitHexKeyUTF8(secret.secretName, botKey);
const secretValueEncrypted = encryptSymmetric128BitHexKeyUTF8(secret.secretValue || "", botKey);
const secretReferences = getAllNestedSecretReferences(secret.secretValue || "");
const secretCommentEncrypted = encryptSymmetric128BitHexKeyUTF8(secret.secretComment || "", botKey);
return {
type: secret.type,
userId: secret.type === SecretType.Personal ? userId : null,
secretName: secret.secretName,
newSecretName: secret.newSecretName,
secretKeyCiphertext: secretKeyEncrypted.ciphertext,
secretKeyIV: secretKeyEncrypted.iv,
secretKeyTag: secretKeyEncrypted.tag,
secretValueCiphertext: secretValueEncrypted.ciphertext,
secretValueIV: secretValueEncrypted.iv,
secretValueTag: secretValueEncrypted.tag,
secretCommentCiphertext: secretCommentEncrypted.ciphertext,
secretCommentIV: secretCommentEncrypted.iv,
secretCommentTag: secretCommentEncrypted.tag,
skipMultilineEncoding: secret.skipMultilineEncoding,
tags: secret.tags,
references: secretReferences
};
});
const tagIds = inputSecrets.flatMap(({ tags = [] }) => tags);
const tags = tagIds.length ? await secretTagDAL.findManyTagsById(projectId, tagIds) : [];

@ -59,6 +59,7 @@ export type TGetSecrets = {
};
const MAX_SYNC_SECRET_DEPTH = 5;
const uniqueIntegrationKey = (environment: string, secretPath: string) => `integration-${environment}-${secretPath}`;
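// e.g. uniqueIntegrationKey("dev", "/api") => "integration-dev-/api"; used to de-duplicate sync jobs per environment/path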
export const secretQueueFactory = ({
queueService,
@ -102,28 +103,35 @@ export const secretQueueFactory = ({
folderDAL
});
const syncIntegrations = async (dto: TGetSecrets) => {
const syncIntegrations = async (dto: TGetSecrets & { deDupeQueue?: Record<string, boolean> }) => {
await queueService.queue(QueueName.IntegrationSync, QueueJobs.IntegrationSync, dto, {
attempts: 5,
attempts: 3,
delay: 1000,
backoff: {
type: "exponential",
delay: 3000
},
removeOnComplete: true,
removeOnFail: {
count: 5 // keep the most recent jobs
}
removeOnFail: true
});
};
const syncSecrets = async (dto: TGetSecrets & { depth?: number }) => {
const syncSecrets = async ({
deDupeQueue = {},
...dto
}: TGetSecrets & { depth?: number; deDupeQueue?: Record<string, boolean> }) => {
const deDuplicationKey = uniqueIntegrationKey(dto.environment, dto.secretPath);
if (deDupeQueue?.[deDuplicationKey]) {
return;
}
// eslint-disable-next-line
deDupeQueue[deDuplicationKey] = true;
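// mark this environment/path pair as queued so link- and reference-triggered re-syncs skip it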
logger.info(
`syncSecrets: syncing project secrets where [projectId=${dto.projectId}] [environment=${dto.environment}] [path=${dto.secretPath}]`
);
await queueService.queue(QueueName.SecretWebhook, QueueJobs.SecWebhook, dto, {
jobId: `secret-webhook-${dto.environment}-${dto.projectId}-${dto.secretPath}`,
removeOnFail: { count: 5 },
removeOnFail: true,
removeOnComplete: true,
delay: 1000,
attempts: 5,
@ -132,7 +140,7 @@ export const secretQueueFactory = ({
delay: 3000
}
});
await syncIntegrations(dto);
await syncIntegrations({ ...dto, deDupeQueue });
};
const removeSecretReminder = async (dto: TRemoveSecretReminderDTO) => {
@ -326,7 +334,7 @@ export const secretQueueFactory = ({
};
queueService.start(QueueName.IntegrationSync, async (job) => {
const { environment, projectId, secretPath, depth = 1 } = job.data;
const { environment, projectId, secretPath, depth = 1, deDupeQueue = {} } = job.data;
const folder = await folderDAL.findBySecretPath(projectId, environment, secretPath);
if (!folder) {
@ -349,21 +357,68 @@ export const secretQueueFactory = ({
const importedFolderIds = unique(imports, (i) => i.folderId).map(({ folderId }) => folderId);
const importedFolders = await folderDAL.findSecretPathByFolderIds(projectId, importedFolderIds);
const foldersGroupedById = groupBy(importedFolders, (i) => i.child || i.id);
logger.info(
`getIntegrationSecrets: Syncing secret due to link change [jobId=${job.id}] [projectId=${job.data.projectId}] [environment=${job.data.environment}] [secretPath=${job.data.secretPath}] [depth=${depth}]`
);
await Promise.all(
imports
.filter(({ folderId }) => Boolean(foldersGroupedById[folderId][0].path))
// filter out already synced ones
.filter(
({ folderId }) =>
!deDupeQueue[
uniqueIntegrationKey(
foldersGroupedById[folderId][0].environmentSlug,
foldersGroupedById[folderId][0].path
)
]
)
.map(({ folderId }) =>
syncSecrets({
depth: depth + 1,
projectId,
secretPath: foldersGroupedById[folderId][0].path,
environment: foldersGroupedById[folderId][0].environmentSlug,
deDupeQueue
})
)
);
}
const secretReferences = await secretDAL.findReferencedSecretReferences(
projectId,
folder.environment.slug,
secretPath
);
if (secretReferences.length) {
const referencedFolderIds = unique(secretReferences, (i) => i.folderId).map(({ folderId }) => folderId);
const referencedFolders = await folderDAL.findSecretPathByFolderIds(projectId, referencedFolderIds);
const referencedFoldersGroupedById = groupBy(referencedFolders, (i) => i.child || i.id);
logger.info(
`getIntegrationSecrets: Syncing secret due to reference change [jobId=${job.id}] [projectId=${job.data.projectId}] [environment=${job.data.environment}] [secretPath=${job.data.secretPath}] [depth=${depth}]`
);
await Promise.all(
secretReferences
.filter(({ folderId }) => Boolean(referencedFoldersGroupedById[folderId][0].path))
// filter out already synced ones
.filter(
({ folderId }) =>
!deDupeQueue[
uniqueIntegrationKey(
referencedFoldersGroupedById[folderId][0].environmentSlug,
referencedFoldersGroupedById[folderId][0].path
)
]
)
.map(({ folderId }) =>
syncSecrets({
depth: depth + 1,
projectId,
secretPath: referencedFoldersGroupedById[folderId][0].path,
environment: referencedFoldersGroupedById[folderId][0].environmentSlug,
deDupeQueue
})
)
);
}
} else {
@ -408,20 +463,37 @@ export const secretQueueFactory = ({
});
}
await syncIntegrationSecrets({
createManySecretsRawFn,
updateManySecretsRawFn,
integrationDAL,
integration,
integrationAuth,
secrets: Object.keys(suffixedSecrets).length !== 0 ? suffixedSecrets : secrets,
accessId: accessId as string,
accessToken,
appendices: {
prefix: metadata?.secretPrefix || "",
suffix: metadata?.secretSuffix || ""
}
});
try {
await syncIntegrationSecrets({
createManySecretsRawFn,
updateManySecretsRawFn,
integrationDAL,
integration,
integrationAuth,
secrets: Object.keys(suffixedSecrets).length !== 0 ? suffixedSecrets : secrets,
accessId: accessId as string,
accessToken,
appendices: {
prefix: metadata?.secretPrefix || "",
suffix: metadata?.secretSuffix || ""
}
});
await integrationDAL.updateById(integration.id, {
lastSyncJobId: job.id,
lastUsed: new Date(),
syncMessage: "",
isSynced: true
});
} catch (err: unknown) {
logger.info("Secret integration sync error:", err);
await integrationDAL.updateById(integration.id, {
lastSyncJobId: job.id,
lastUsed: new Date(),
syncMessage: (err as Error)?.message,
isSynced: false
});
}
}
logger.info("Secret integration sync ended: %s", job.id);

@ -2,12 +2,22 @@
/* eslint-disable no-await-in-loop */
import { ForbiddenError, subject } from "@casl/ability";
import { SecretEncryptionAlgo, SecretKeyEncoding, SecretsSchema, SecretType } from "@app/db/schemas";
import {
ProjectMembershipRole,
SecretEncryptionAlgo,
SecretKeyEncoding,
SecretsSchema,
SecretType
} from "@app/db/schemas";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { TSecretSnapshotServiceFactory } from "@app/ee/services/secret-snapshot/secret-snapshot-service";
import { getConfig } from "@app/lib/config/env";
import { buildSecretBlindIndexFromName, encryptSymmetric128BitHexKeyUTF8 } from "@app/lib/crypto";
import {
buildSecretBlindIndexFromName,
decryptSymmetric128BitHexKeyUTF8,
encryptSymmetric128BitHexKeyUTF8
} from "@app/lib/crypto";
import { BadRequestError } from "@app/lib/errors";
import { groupBy, pick } from "@app/lib/fn";
import { logger } from "@app/lib/logger";
@ -27,12 +37,14 @@ import {
fnSecretBlindIndexCheck,
fnSecretBulkInsert,
fnSecretBulkUpdate,
getAllNestedSecretReferences,
interpolateSecrets,
recursivelyGetSecretPaths
} from "./secret-fns";
import { TSecretQueueFactory } from "./secret-queue";
import {
TAttachSecretTagsDTO,
TBackFillSecretReferencesDTO,
TCreateBulkSecretDTO,
TCreateManySecretRawDTO,
TCreateSecretDTO,
@ -91,6 +103,22 @@ export const secretServiceFactory = ({
secretImportDAL,
secretVersionTagDAL
}: TSecretServiceFactoryDep) => {
const getSecretReference = async (projectId: string) => {
// if the bot key is missing, the project is still end-to-end encrypted
const botKey = await projectBotService.getBotKey(projectId).catch(() => null);
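// the returned helper decrypts the given ciphertext with the bot key and extracts nested references;
// it yields undefined when no bot key is available (values cannot be decrypted server-side)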
return (el: { ciphertext?: string; iv: string; tag: string }) =>
botKey
? getAllNestedSecretReferences(
decryptSymmetric128BitHexKeyUTF8({
ciphertext: el.ciphertext || "",
iv: el.iv,
tag: el.tag,
key: botKey
})
)
: undefined;
};
// utility function to get secret blind index data
const interalGenSecBlindIndexByName = async (projectId: string, secretName: string) => {
const appCfg = getConfig();
@ -225,6 +253,7 @@ export const secretServiceFactory = ({
if ((inputSecret.tags || []).length !== tags.length) throw new BadRequestError({ message: "Tag not found" });
const { secretName, type, ...el } = inputSecret;
const references = await getSecretReference(projectId);
const secret = await secretDAL.transaction((tx) =>
fnSecretBulkInsert({
folderId,
@ -237,7 +266,12 @@ export const secretServiceFactory = ({
userId: inputSecret.type === SecretType.Personal ? actorId : null,
algorithm: SecretEncryptionAlgo.AES_256_GCM,
keyEncoding: SecretKeyEncoding.UTF8,
tags: inputSecret.tags
tags: inputSecret.tags,
references: references({
ciphertext: inputSecret.secretValueCiphertext,
iv: inputSecret.secretValueIV,
tag: inputSecret.secretValueTag
})
}
],
secretDAL,
@ -251,7 +285,7 @@ export const secretServiceFactory = ({
await snapshotService.performSnapshot(folderId);
await secretQueueService.syncSecrets({ secretPath: path, projectId, environment });
// TODO(akhilmhdh-pg): licence check, posthog service and snapshot
return { ...secret[0], environment, workspace: projectId, tags };
return { ...secret[0], environment, workspace: projectId, tags, secretPath: path };
};
const updateSecret = async ({
@ -335,6 +369,7 @@ export const secretServiceFactory = ({
const { secretName, ...el } = inputSecret;
const references = await getSecretReference(projectId);
const updatedSecret = await secretDAL.transaction(async (tx) =>
fnSecretBulkUpdate({
folderId,
@ -360,7 +395,12 @@ export const secretServiceFactory = ({
"secretReminderRepeatDays",
"tags"
]),
secretBlindIndex: newSecretNameBlindIndex || keyName2BlindIndex[secretName]
secretBlindIndex: newSecretNameBlindIndex || keyName2BlindIndex[secretName],
references: references({
ciphertext: inputSecret.secretValueCiphertext,
iv: inputSecret.secretValueIV,
tag: inputSecret.secretValueTag
})
}
}
],
@ -375,7 +415,7 @@ export const secretServiceFactory = ({
await snapshotService.performSnapshot(folderId);
await secretQueueService.syncSecrets({ secretPath: path, projectId, environment });
// TODO(akhilmhdh-pg): licence check, posthog service and snapshot
return { ...updatedSecret[0], workspace: projectId, environment };
return { ...updatedSecret[0], workspace: projectId, environment, secretPath: path };
};
const deleteSecret = async ({
@ -444,7 +484,7 @@ export const secretServiceFactory = ({
await secretQueueService.syncSecrets({ secretPath: path, projectId, environment });
// TODO(akhilmhdh-pg): licence check, posthog service and snapshot
return { ...deletedSecret[0], _id: deletedSecret[0].id, workspace: projectId, environment };
return { ...deletedSecret[0], _id: deletedSecret[0].id, workspace: projectId, environment, secretPath: path };
};
const getSecrets = async ({
@ -641,7 +681,8 @@ export const secretServiceFactory = ({
return {
...importedSecrets[i].secrets[j],
workspace: projectId,
environment: importedSecrets[i].environment
environment: importedSecrets[i].environment,
secretPath: importedSecrets[i].secretPath
};
}
}
@ -649,7 +690,7 @@ export const secretServiceFactory = ({
}
if (!secret) throw new BadRequestError({ message: "Secret not found" });
return { ...secret, workspace: projectId, environment };
return { ...secret, workspace: projectId, environment, secretPath: path };
};
const createManySecret = async ({
@ -700,6 +741,7 @@ export const secretServiceFactory = ({
const tags = tagIds.length ? await secretTagDAL.findManyTagsById(projectId, tagIds) : [];
if (tags.length !== tagIds.length) throw new BadRequestError({ message: "Tag not found" });
const references = await getSecretReference(projectId);
const newSecrets = await secretDAL.transaction(async (tx) =>
fnSecretBulkInsert({
inputSecrets: inputSecrets.map(({ secretName, ...el }) => ({
@ -708,7 +750,12 @@ export const secretServiceFactory = ({
secretBlindIndex: keyName2BlindIndex[secretName],
type: SecretType.Shared,
algorithm: SecretEncryptionAlgo.AES_256_GCM,
keyEncoding: SecretKeyEncoding.UTF8
keyEncoding: SecretKeyEncoding.UTF8,
references: references({
ciphertext: el.secretValueCiphertext,
iv: el.secretValueIV,
tag: el.secretValueTag
})
})),
folderId,
secretDAL,
@ -783,6 +830,8 @@ export const secretServiceFactory = ({
const tagIds = inputSecrets.flatMap(({ tags = [] }) => tags);
const tags = tagIds.length ? await secretTagDAL.findManyTagsById(projectId, tagIds) : [];
if (tagIds.length !== tags.length) throw new BadRequestError({ message: "Tag not found" });
const references = await getSecretReference(projectId);
const secrets = await secretDAL.transaction(async (tx) =>
fnSecretBulkUpdate({
folderId,
@ -799,7 +848,15 @@ export const secretServiceFactory = ({
? newKeyName2BlindIndex[newSecretName]
: keyName2BlindIndex[secretName],
algorithm: SecretEncryptionAlgo.AES_256_GCM,
keyEncoding: SecretKeyEncoding.UTF8
keyEncoding: SecretKeyEncoding.UTF8,
references:
el.secretValueIV && el.secretValueTag
? references({
ciphertext: el.secretValueCiphertext,
iv: el.secretValueIV,
tag: el.secretValueTag
})
: undefined
}
})),
secretDAL,
@ -924,34 +981,40 @@ export const secretServiceFactory = ({
});
const batchSecretsExpand = async (
secretBatch: { secretKey: string; secretValue: string; secretComment?: string; secretPath: string }[]
) => {
// Group secrets by secretPath
const secretsByPath: Record<string, { secretKey: string; secretValue: string; secretComment?: string }[]> = {};
secretBatch.forEach((secret) => {
if (!secretsByPath[secret.secretPath]) {
secretsByPath[secret.secretPath] = [];
}
secretsByPath[secret.secretPath].push(secret);
});
// Expand secrets for each group
for (const secPath in secretsByPath) {
if (!Object.hasOwn(secretsByPath, secPath)) {
// eslint-disable-next-line no-continue
continue;
}
const secretRecord: Record<string, { value: string; comment?: string; skipMultilineEncoding?: boolean }> = {};
secretsByPath[secPath].forEach((decryptedSecret) => {
secretRecord[decryptedSecret.secretKey] = {
value: decryptedSecret.secretValue,
comment: decryptedSecret.secretComment
};
});
await expandSecrets(secretRecord);
secretsByPath[secPath].forEach((decryptedSecret) => {
// eslint-disable-next-line no-param-reassign
decryptedSecret.secretValue = secretRecord[decryptedSecret.secretKey].value;
});
}
};
// expand secrets
@ -972,7 +1035,8 @@ export const secretServiceFactory = ({
path,
actor,
environment,
projectId,
projectId: workspaceId,
projectSlug,
actorId,
actorOrgId,
actorAuthMethod,
@ -980,6 +1044,8 @@ export const secretServiceFactory = ({
includeImports,
version
}: TGetASecretRawDTO) => {
const projectId = workspaceId || (await projectDAL.findProjectBySlug(projectSlug as string, actorOrgId)).id;
const botKey = await projectBotService.getBotKey(projectId);
if (!botKey) throw new BadRequestError({ message: "Project bot not found", name: "bot_not_found_error" });
@ -996,6 +1062,7 @@ export const secretServiceFactory = ({
includeImports,
version
});
return decryptSecretRaw(secret, botKey);
};
@ -1168,7 +1235,9 @@ export const secretServiceFactory = ({
await snapshotService.performSnapshot(secrets[0].folderId);
await secretQueueService.syncSecrets({ secretPath, projectId, environment });
return secrets.map((secret) => decryptSecretRaw({ ...secret, workspace: projectId, environment }, botKey));
return secrets.map((secret) =>
decryptSecretRaw({ ...secret, workspace: projectId, environment, secretPath }, botKey)
);
};
const updateManySecretsRaw = async ({
@ -1220,7 +1289,9 @@ export const secretServiceFactory = ({
await snapshotService.performSnapshot(secrets[0].folderId);
await secretQueueService.syncSecrets({ secretPath, projectId, environment });
return secrets.map((secret) => decryptSecretRaw({ ...secret, workspace: projectId, environment }, botKey));
return secrets.map((secret) =>
decryptSecretRaw({ ...secret, workspace: projectId, environment, secretPath }, botKey)
);
};
const deleteManySecretsRaw = async ({
@ -1254,7 +1325,9 @@ export const secretServiceFactory = ({
await snapshotService.performSnapshot(secrets[0].folderId);
await secretQueueService.syncSecrets({ secretPath, projectId, environment });
return secrets.map((secret) => decryptSecretRaw({ ...secret, workspace: projectId, environment }, botKey));
return secrets.map((secret) =>
decryptSecretRaw({ ...secret, workspace: projectId, environment, secretPath }, botKey)
);
};
const getSecretVersions = async ({
@ -1485,6 +1558,52 @@ export const secretServiceFactory = ({
};
};
// this is a backfilling API for secret references
// it walks all secret values in the project, parses out their nested references,
// and populates the secret reference table used when syncing integrations
const backfillSecretReferences = async ({
projectId,
actor,
actorId,
actorOrgId,
actorAuthMethod
}: TBackFillSecretReferencesDTO) => {
const { hasRole } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
if (!hasRole(ProjectMembershipRole.Admin))
throw new BadRequestError({ message: "Only admins are allowed to take this action" });
const botKey = await projectBotService.getBotKey(projectId);
if (!botKey)
throw new BadRequestError({ message: "Please upgrade your project first", name: "bot_not_found_error" });
await secretDAL.transaction(async (tx) => {
const secrets = await secretDAL.findAllProjectSecretValues(projectId, tx);
await secretDAL.upsertSecretReferences(
secrets.map(({ id, secretValueCiphertext, secretValueIV, secretValueTag }) => ({
secretId: id,
references: getAllNestedSecretReferences(
decryptSymmetric128BitHexKeyUTF8({
ciphertext: secretValueCiphertext,
iv: secretValueIV,
tag: secretValueTag,
key: botKey
})
)
})),
tx
);
});
return { message: "Successfully backfilled secret references" };
};
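// Illustrative usage sketch (assumption: invoked from a route handler with an admin actor; identifiers are placeholders):
// await backfillSecretReferences({
//   projectId: "<project-id>",
//   actor,
//   actorId,
//   actorOrgId,
//   actorAuthMethod
// });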
return {
attachTags,
detachTags,
@ -1505,6 +1624,7 @@ export const secretServiceFactory = ({
updateManySecretsRaw,
deleteManySecretsRaw,
getSecretVersions,
backfillSecretReferences,
// external services function
fnSecretBulkDelete,
fnSecretBulkUpdate,

@ -152,7 +152,9 @@ export type TGetASecretRawDTO = {
type: "shared" | "personal";
includeImports?: boolean;
version?: number;
} & TProjectPermission;
projectSlug?: string;
projectId?: string;
} & Omit<TProjectPermission, "projectId">;
export type TCreateSecretRawDTO = TProjectPermission & {
secretPath: string;
@ -221,11 +223,13 @@ export type TGetSecretVersionsDTO = Omit<TProjectPermission, "projectId"> & {
secretId: string;
};
export type TSecretReference = { environment: string; secretPath: string };
export type TFnSecretBulkInsert = {
folderId: string;
tx?: Knex;
inputSecrets: Array<Omit<TSecretsInsert, "folderId"> & { tags?: string[] }>;
secretDAL: Pick<TSecretDALFactory, "insertMany">;
inputSecrets: Array<Omit<TSecretsInsert, "folderId"> & { tags?: string[]; references?: TSecretReference[] }>;
secretDAL: Pick<TSecretDALFactory, "insertMany" | "upsertSecretReferences">;
secretVersionDAL: Pick<TSecretVersionDALFactory, "insertMany">;
secretTagDAL: Pick<TSecretTagDALFactory, "saveTagsToSecret">;
secretVersionTagDAL: Pick<TSecretVersionTagDALFactory, "insertMany">;
@ -234,8 +238,11 @@ export type TFnSecretBulkInsert = {
export type TFnSecretBulkUpdate = {
folderId: string;
projectId: string;
inputSecrets: { filter: Partial<TSecrets>; data: TSecretsUpdate & { tags?: string[] } }[];
secretDAL: Pick<TSecretDALFactory, "bulkUpdate">;
inputSecrets: {
filter: Partial<TSecrets>;
data: TSecretsUpdate & { tags?: string[]; references?: TSecretReference[] };
}[];
secretDAL: Pick<TSecretDALFactory, "bulkUpdate" | "upsertSecretReferences">;
secretVersionDAL: Pick<TSecretVersionDALFactory, "insertMany">;
secretTagDAL: Pick<TSecretTagDALFactory, "saveTagsToSecret" | "deleteTagsManySecret">;
secretVersionTagDAL: Pick<TSecretVersionTagDALFactory, "insertMany">;
@ -292,6 +299,8 @@ export type TRemoveSecretReminderDTO = {
repeatDays: number;
};
export type TBackFillSecretReferencesDTO = TProjectPermission;
// ---
export type TCreateManySecretsRawFnFactory = {

@ -91,6 +91,8 @@ services:
- TELEMETRY_ENABLED=false
volumes:
- ./backend/src:/app/src
extra_hosts:
- "host.docker.internal:host-gateway"
frontend:
container_name: infisical-dev-frontend
@ -128,7 +130,7 @@ services:
ports:
- 1025:1025 # SMTP server
- 8025:8025 # Web UI
openldap: # note: more advanced configuration is available
image: osixia/openldap:1.5.0
restart: always

@ -0,0 +1,4 @@
---
title: "Create Identity Membership"
openapi: "POST /api/v2/workspace/{projectId}/identity-memberships/{identityId}"
---

@ -0,0 +1,4 @@
---
title: "Get Identity by ID"
openapi: "GET /api/v2/workspace/{projectId}/identity-memberships/{identityId}"
---

@ -0,0 +1,4 @@
---
title: "Get By Username"
openapi: "POST /api/v1/workspace/{workspaceId}/memberships/details"
---

@ -1,4 +1,4 @@
---
title: "Invite Member"
openapi: "POST /api/v2/workspace/{projectId}/memberships"
---

@ -0,0 +1,4 @@
---
title: "Revoke Access Token"
openapi: "POST /api/v1/auth/token/revoke"
---

@ -128,6 +128,12 @@ infisical export --template=<path to template>
</Accordion>
<Accordion title="--include-imports">
By default, imported secrets are included. You can exclude them by setting this option to `false`.
Default value: `true`
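For example, a minimal invocation that leaves out imported secrets might look like this:
```bash
# Exclude secrets that come from secret imports from the exported output
infisical export --include-imports=false
```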
</Accordion>
<Accordion title="--format">
Format of the output file. Accepted values: `dotenv`, `dotenv-export`, `csv`, `json` and `yaml`

@ -126,6 +126,12 @@ $ infisical run -- npm run dev
</Accordion>
<Accordion title="--include-imports">
By default, imported secrets are included. You can exclude them by setting this option to `false`.
Default value: `true`
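For example, imported secrets can be left out of the injected environment like so:
```bash
# Start the process without secrets that come from secret imports
infisical run --include-imports=false -- npm run dev
```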
</Accordion>
{" "}
<Accordion title="--env">

@ -13,6 +13,7 @@ If none of the available stores work for you, you can try using the `file` store
If you are still experiencing trouble, please seek support.
[Learn more about vault command](./commands/vault)
</Accordion>
<Accordion title="Can I fetch secrets with Infisical if I am offline?">

@ -4,59 +4,66 @@ sidebarTitle: "What is Infisical?"
description: "An Introduction to the Infisical secret management platform."
---
Infisical is an [open-source](https://github.com/infisical/infisical) secret management platform for developers.
It provides capabilities for storing, managing, and syncing application configuration and secrets like API keys, database
credentials, and certificates across infrastructure. In addition, Infisical prevents secret leaks to git and enables secure
sharing of secrets among engineers.
Start managing secrets securely with [Infisical Cloud](https://app.infisical.com) or learn how to [host Infisical](/self-hosting/overview) yourself.
<CardGroup cols={2}>
<Card
title="Infisical Cloud"
href="https://app.infisical.com/signup"
icon="cloud"
color="#000000"
>
Get started with Infisical Cloud in just a few minutes.
</Card>
<Card
href="/self-hosting/overview"
title="Self-hosting"
icon="server"
color="#000000"
>
Self-host Infisical on your own infrastructure.
</Card>
</CardGroup>
## Why Infisical?
Infisical helps developers achieve secure centralized secret management and provides all the tools to easily manage secrets in various environments and infrastructure components. In particular, here are some of the most common points that developers mention after adopting Infisical:
- Streamlined **local development** processes (switching .env files to [Infisical CLI](/cli/commands/run) and removing secrets from developer machines).
- **Best-in-class developer experience** with an easy-to-use [Web Dashboard](/documentation/platform/project).
- Simple secret management inside **[CI/CD pipelines](/integrations/cicd/githubactions)** and staging environments.
- Secure and compliant secret management practices in **[production environments](/sdks/overview)**.
- **Facilitated workflows** around [secret change management](/documentation/platform/pr-workflows), [access requests](/documentation/platform/access-controls/access-requests), [temporary access provisioning](/documentation/platform/access-controls/temporary-access), and more.
- **Improved security posture** thanks to [secret scanning](/cli/scanning-overview), [granular access control policies](/documentation/platform/access-controls/overview), [automated secret rotation](https://infisical.com/docs/documentation/platform/secret-rotation/overview), and [dynamic secrets](/documentation/platform/dynamic-secrets/overview) capabilities.
## How does Infisical work?
To make secret management effortless and secure, Infisical follows a certain structure for enabling secret management workflows as defined below.
**Identities** in Infisical are users or machines which have a certain set of roles and permissions assigned to them. Such identities are able to manage secrets in various **Clients** throughout the entire infrastructure. To do that, identities have to verify themselves through one of the available **Authentication Methods**.
As a result, the 3 main concepts that are important to understand are:
- **[Identities](/documentation/platform/identities/overview)**: users or machines with a set of permissions assigned to them.
- **[Clients](/integrations/platforms/kubernetes)**: Infisical-developed tools for managing secrets in various infrastructure components (e.g., [Kubernetes Operator](/integrations/platforms/kubernetes), [Infisical Agent](/integrations/platforms/infisical-agent), [CLI](/cli/usage), [SDKs](/sdks/overview), [API](/api-reference/overview/introduction), [Web Dashboard](/documentation/platform/organization)).
- **[Authentication Methods](/documentation/platform/identities/universal-auth)**: ways for Identities to authenticate inside different clients (e.g., SAML SSO for Web Dashboard, Universal Auth for Infisical Agent, AWS Auth, etc.).
## How to get started with Infisical?
Depending on your use case, it might be helpful to look into some of the resources and guides provided below.
<CardGroup cols={2}>
<Card href="../../cli/overview" title="Command Line Interface (CLI)" icon="square-terminal" color="#000000">
<Card
href="../../cli/overview"
title="Command Line Interface (CLI)"
icon="square-terminal"
color="#000000"
>
Inject secrets into any application process/environment.
</Card>
<Card
@ -67,7 +74,12 @@ Depending on your use case, it might be helpful to look into some of the resourc
>
Fetch secrets with any programming language on demand.
</Card>
<Card href="../../integrations/platforms/docker-intro" title="Docker" icon="docker" color="#000000">
<Card
href="../../integrations/platforms/docker-intro"
title="Docker"
icon="docker"
color="#000000"
>
Inject secrets into Docker containers.
</Card>
<Card

@ -0,0 +1,311 @@
---
title: AWS Auth
description: "Learn how to authenticate with Infisical for EC2 instances, Lambda functions, and other IAM principals."
---
**AWS Auth** is an AWS-native authentication method for IAM principals like EC2 instances or Lambda functions to access Infisical.
## Diagram
The following sequence diagram illustrates the AWS Auth workflow for authenticating AWS IAM principals with Infisical.
```mermaid
sequenceDiagram
participant Client as Client
participant Infis as Infisical
participant AWS as AWS STS
Note over Client,Client: Step 1: Sign GetCallerIdentityQuery
Note over Client,Infis: Step 2: Login Operation
Client->>Infis: Send signed query details /api/v1/auth/aws-auth/login
Note over Infis,AWS: Step 3: Query verification
Infis->>AWS: Forward signed GetCallerIdentity query
AWS-->>Infis: Return IAM user/role details
Note over Infis: Step 4: Identity Property Validation
Infis->>Client: Return short-lived access token
Note over Client,Infis: Step 5: Access Infisical API with Token
Client->>Infis: Make authenticated requests using the short-lived access token
```
## Concept
At a high-level, Infisical authenticates an IAM principal by verifying its identity and checking that it meets specific requirements (e.g. it is an allowed IAM principal ARN) at the `/api/v1/auth/aws-auth/login` endpoint. If successful,
then Infisical returns a short-lived access token that can be used to make authenticated requests to the Infisical API.
To be more specific:
1. The client IAM principal signs a `GetCallerIdentity` query using the [AWS Signature v4 algorithm](https://docs.aws.amazon.com/IAM/latest/UserGuide/create-signed-request.html); this is done using the credentials from the AWS environment where the IAM principal is running.
2. The client sends the signed query data to Infisical including the request method, request body, and request headers at the `/api/v1/auth/aws-auth/login` endpoint.
3. Infisical reconstructs the query and sends it to AWS STS API via the [sts:GetCallerIdentity](https://docs.aws.amazon.com/STS/latest/APIReference/API_GetCallerIdentity.html) method for verification and obtains the identity associated with the IAM principal.
4. Infisical checks the identity's properties against set criteria such as **Allowed Principal ARNs**.
5. If all is well, Infisical returns a short-lived access token that the IAM principal can use to make authenticated requests to the Infisical API.
<Note>
We recommend using one of Infisical's clients like SDKs or the Infisical Agent
to authenticate with Infisical using AWS Auth as they handle the
authentication process including the signed `GetCallerIdentity` query
construction for you.
Also, note that Infisical needs network-level access to send requests to the AWS STS API
as part of the AWS Auth workflow.
</Note>
## Guide
In the following steps, we explore how to create and use identities for your workloads and applications on AWS to
access the Infisical API using the AWS Auth authentication method.
<Steps>
<Step title="Creating an identity">
To create an identity, head to your Organization Settings > Access Control > Machine Identities and press **Create identity**.
![identities organization](/images/platform/identities/identities-org.png)
When creating an identity, you specify an organization level [role](/documentation/platform/role-based-access-controls) for it to assume; you can configure roles in Organization Settings > Access Control > Organization Roles.
![identities organization create](/images/platform/identities/identities-org-create.png)
Now input a few details for your new identity. Here's some guidance for each field:
- Name (required): A friendly name for the identity.
- Role (required): A role from the **Organization Roles** tab for the identity to assume. The organization role assigned will determine what organization level resources this identity can have access to.
Once you've created an identity, you'll be prompted to configure the authentication method for it. Here, select **AWS Auth**.
![identities create aws auth method](/images/platform/identities/identities-org-create-aws-auth-method.png)
Here's some more guidance on each field:
- Allowed Principal ARNs: A comma-separated list of trusted IAM principal ARNs that are allowed to authenticate with Infisical. The values should take one of three forms: `arn:aws:iam::123456789012:user/MyUserName`, `arn:aws:iam::123456789012:role/MyRoleName`, or `arn:aws:iam::123456789012:*`. Using a wildcard in this case allows any IAM principal in the account `123456789012` to authenticate with Infisical under the identity.
- Allowed Account IDs: A comma-separated list of trusted AWS account IDs that are allowed to authenticate with Infisical.
- STS Endpoint (default is `https://sts.amazonaws.com/`): The endpoint URL for the AWS STS API. This value should be adjusted based on the AWS region you are operating in (e.g. `https://sts.us-east-1.amazonaws.com/`); refer to the list of regional STS endpoints [here](https://docs.aws.amazon.com/general/latest/gr/sts.html).
- Access Token TTL (default is `2592000` equivalent to 30 days): The lifetime for an access token in seconds. This value will be referenced at renewal time.
- Access Token Max TTL (default is `2592000` equivalent to 30 days): The maximum lifetime for an access token in seconds. This value will be referenced at renewal time.
- Access Token Max Number of Uses (default is `0`): The maximum number of times that an access token can be used; a value of `0` implies an infinite number of uses.
- Access Token Trusted IPs: The IPs or CIDR ranges that access tokens can be used from. By default, each token is given the `0.0.0.0/0` entry, allowing usage from any network address.
</Step>
<Step title="Adding an identity to a project">
To enable the identity to access project-level resources such as secrets within a specific project, you should add it to that project.
To do this, head over to the project you want to add the identity to and go to Project Settings > Access Control > Machine Identities and press **Add identity**.
Next, select the identity you want to add to the project and the project level role you want to allow it to assume. The project role assigned will determine what project level resources this identity can have access to.
![identities project](/images/platform/identities/identities-project.png)
![identities project create](/images/platform/identities/identities-project-create.png)
</Step>
<Step title="Accessing the Infisical API with the identity">
To access the Infisical API as the identity, you need to construct a signed `GetCallerIdentity` query using the [AWS Signature v4 algorithm](https://docs.aws.amazon.com/IAM/latest/UserGuide/create-signed-request.html) and make a request to the `/api/v1/auth/aws-auth/login` endpoint containing the query data
in exchange for an access token.
We provide a few code examples below of how you can authenticate with Infisical from inside a Lambda function, EC2 instance, etc. and obtain an access token to access the [Infisical API](/api-reference/overview/introduction).
<AccordionGroup>
<Accordion
title="Sample code for inside a Lambda function"
>
The following query construction is an example of how you can authenticate with Infisical from inside a Lambda function.
The shown example uses Node.js but you can use other languages supported by AWS Lambda.
```javascript
import AWS from "aws-sdk";
import axios from "axios";
export const handler = async (event, context) => {
try {
const region = process.env.AWS_REGION;
AWS.config.update({ region });
const iamRequestURL = `https://sts.${region}.amazonaws.com/`;
const iamRequestBody = "Action=GetCallerIdentity&Version=2011-06-15";
const iamRequestHeaders = {
"Content-Type": "application/x-www-form-urlencoded; charset=utf-8",
Host: `sts.${region}.amazonaws.com`,
};
// Create the request
const request = new AWS.HttpRequest(iamRequestURL, region);
request.method = "POST";
request.headers = iamRequestHeaders;
request.headers["X-Amz-Date"] = AWS.util.date
.iso8601(new Date())
.replace(/[:-]|\.\d{3}/g, "");
request.body = iamRequestBody;
request.headers["Content-Length"] =
Buffer.byteLength(iamRequestBody).toString();
// Sign the request
const signer = new AWS.Signers.V4(request, "sts");
signer.addAuthorization(AWS.config.credentials, new Date());
const infisicalUrl = "https://app.infisical.com"; // or your self-hosted Infisical URL
const identityId = "<your-identity-id>";
const { data } = await axios.post(
`${infisicalUrl}/api/v1/auth/aws-auth/login`,
{
identityId,
iamHttpRequestMethod: "POST",
iamRequestUrl: Buffer.from(iamRequestURL).toString("base64"),
iamRequestBody: Buffer.from(iamRequestBody).toString("base64"),
iamRequestHeaders: Buffer.from(
JSON.stringify(iamRequestHeaders)
).toString("base64"),
}
);
console.log("result data: ", data); // access token here
} catch (err) {
console.error(err);
}
};
```
</Accordion>
<Accordion
title="Sample code for inside an EC2 instance"
>
The following query construction is an example of how you can authenticate with Infisical from inside an EC2 instance.
The shown example uses Node.js but you can use any other language you wish.
```javascript
import AWS from "aws-sdk";
import axios from "axios";
const main = async () => {
try {
// obtain region from EC2 instance metadata
const tokenResponse = await axios.put("http://169.254.169.254/latest/api/token", null, {
headers: {
"X-aws-ec2-metadata-token-ttl-seconds": "21600"
}
});
const url = "http://169.254.169.254/latest/dynamic/instance-identity/document";
const response = await axios.get(url, {
headers: {
"X-aws-ec2-metadata-token": tokenResponse.data
}
});
const region = response.data.region;
AWS.config.update({
region
});
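// Construct the GetCallerIdentity request body and headers for the regional STS endpoint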
const iamRequestURL = `https://sts.${region}.amazonaws.com/`;
const iamRequestBody = "Action=GetCallerIdentity&Version=2011-06-15";
const iamRequestHeaders = {
"Content-Type": "application/x-www-form-urlencoded; charset=utf-8",
Host: `sts.${region}.amazonaws.com`
};
const request = new AWS.HttpRequest(new AWS.Endpoint(iamRequestURL), AWS.config.region);
request.method = "POST";
request.headers = iamRequestHeaders;
request.headers["X-Amz-Date"] = AWS.util.date.iso8601(new Date()).replace(/[:-]|\.\d{3}/g, "");
request.body = iamRequestBody;
request.headers["Content-Length"] = Buffer.byteLength(iamRequestBody);
const signer = new AWS.Signers.V4(request, "sts");
signer.addAuthorization(AWS.config.credentials, new Date());
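// Exchange the signed query details for an Infisical access token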
const infisicalUrl = "https://app.infisical.com"; // or your self-hosted Infisical URL
const identityId = "<your-identity-id>";
const { data } = await axios.post(`${infisicalUrl}/api/v1/auth/aws-auth/login`, {
identityId,
iamHttpRequestMethod: "POST",
iamRequestUrl: Buffer.from(iamRequestURL).toString("base64"),
iamRequestBody: Buffer.from(iamRequestBody).toString("base64"),
iamRequestHeaders: Buffer.from(JSON.stringify(iamRequestHeaders)).toString("base64")
});
console.log("result data: ", data); // access token here
} catch (err) {
console.error(err);
}
}
main();
```
</Accordion>
<Accordion
title="Sample code for general query construction"
>
The following query construction provides a generic example of how you can construct a signed `GetCallerIdentity` query and obtain the required payload components.
The shown example uses Node.js but you can use any language you wish.
```javascript
const AWS = require("aws-sdk");
const region = "<your-aws-region>";
const infisicalUrl = "https://app.infisical.com"; // or your self-hosted Infisical URL
const iamRequestURL = `https://sts.${region}.amazonaws.com/`;
const iamRequestBody = "Action=GetCallerIdentity&Version=2011-06-15";
const iamRequestHeaders = {
"Content-Type": "application/x-www-form-urlencoded; charset=utf-8",
Host: `sts.${region}.amazonaws.com`
};
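// Build the GetCallerIdentity HTTP request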
const request = new AWS.HttpRequest(new AWS.Endpoint(iamRequestURL), region);
request.method = "POST";
request.headers = iamRequestHeaders;
request.headers["X-Amz-Date"] = AWS.util.date.iso8601(new Date()).replace(/[:-]|\.\d{3}/g, "");
request.body = iamRequestBody;
request.headers["Content-Length"] = Buffer.byteLength(iamRequestBody);
const signer = new AWS.Signers.V4(request, "sts");
signer.addAuthorization(AWS.config.credentials, new Date());
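// The payload fields for /api/v1/auth/aws-auth/login are iamHttpRequestMethod ("POST"),
// plus base64-encoded copies of iamRequestURL, iamRequestBody, and the signed request.headers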
```
#### Sample request
```bash Request
curl --location --request POST 'https://app.infisical.com/api/v1/auth/aws-auth/login' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'identityId=...' \
--data-urlencode 'iamHttpRequestMethod=...' \
--data-urlencode 'iamRequestBody=...' \
--data-urlencode 'iamRequestHeaders=...'
```
#### Sample response
```bash Response
{
"accessToken": "...",
"expiresIn": 7200,
"accessTokenMaxTTL": 43244
"tokenType": "Bearer"
}
```
Next, you can use the access token to make authenticated requests to the [Infisical API](/api-reference/overview/introduction).
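As a rough sketch, the token can be supplied as a Bearer token in the `Authorization` header; the endpoint and query parameters below are illustrative placeholders, so check the API reference for the exact routes available to your identity:
```bash
# Supply the access token from the login response as a Bearer token.
# The endpoint and query parameters are shown for illustration only.
export INFISICAL_TOKEN="<access-token-from-login>"

curl --request GET \
  --url "https://app.infisical.com/api/v3/secrets/raw?workspaceId=<project-id>&environment=dev" \
  --header "Authorization: Bearer $INFISICAL_TOKEN"
```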
</Accordion>
</AccordionGroup>
<Tip>
We recommend using one of Infisical's clients like SDKs or the Infisical Agent to authenticate with Infisical using AWS Auth as they handle the authentication process including the signed `GetCallerIdentity` query construction for you.
</Tip>
<Note>
Each identity access token has a time-to-live (TTL) which you can infer from the response of the login operation;
the default TTL is `7200` seconds which can be adjusted.
If an identity access token expires, it can no longer authenticate with the Infisical API. In this case,
a new access token should be obtained by performing another login operation.
</Note>
</Step>
</Steps>

Some files were not shown because too many files have changed in this diff Show More