Compare commits


142 Commits

Author SHA1 Message Date
23eac40740 Merge pull request #3081 from Infisical/secrets-overview-page-move-secrets
Feature: Secrets Overview Page Move Secrets
2025-02-05 08:54:06 -08:00
cf61390e52 improvements: address feedback 2025-02-04 20:14:47 -08:00
7adc103ed2 Merge pull request #3082 from Infisical/app-connections-and-secret-syncs-unique-constraint
Fix: Move App Connection and Secret Sync Unique Name Constraint to DB
2025-02-04 09:42:02 -08:00
5bdbf37171 improvement: add error codes enum for re-use 2025-02-04 08:37:06 -08:00
4f874734ab Update operator version 2025-02-04 10:10:59 -05:00
eb6fd8259b Merge pull request #3085 from Infisical/combine-helm-release
Combine image release with helm
2025-02-04 10:07:52 -05:00
1766a44dd0 Combine image release with helm
Combine image release with helm release so that one happens after the other. This will help reduce manual work.
2025-02-04 09:59:32 -05:00
624c9ef8da Merge pull request #3083 from akhilmhdh/fix/base64-decode-issue
Resolved base64 decode saving file as ASCII
2025-02-04 20:04:02 +05:30
dfd4b13574 fix: resolved base64 decode saving file as ASCII 2025-02-04 16:14:28 +05:30
22b57b7a74 chore: add migration file 2025-02-03 19:40:00 -08:00
1ba0b9c204 improvement: move unique name constraint to db for secret syncs and app connections 2025-02-03 19:36:37 -08:00
a903537441 fix: clear selection if modal is closed through cancel button and secrets have been moved 2025-02-03 18:44:52 -08:00
92c4d83714 improvement: make results look better 2025-02-03 18:29:38 -08:00
a6414104ad feature: secrets overview page move secrets 2025-02-03 18:18:00 -08:00
110d0e95b0 Merge pull request #3077 from carlosvargas9103/carlosvargas9103-fix-typo-readme
fixed typo in README.md
2025-02-03 13:26:32 -05:00
a8c0bbb7ca Merge pull request #3080 from Infisical/update-security-docs
Update Security Docs
2025-02-03 10:13:26 -08:00
6af8a4fab8 Update security docs 2025-02-03 10:07:57 -08:00
43ecd31b74 fixed typo in README.md 2025-02-03 16:18:17 +01:00
ccee0f5428 Merge pull request #3071 from Infisical/fix-oidc-doc-images
Fix: Remove Relative Paths for OIDC Overview Docs
2025-01-31 15:33:40 -08:00
14586c7cd0 fix: remove relative path for oidc docs 2025-01-31 15:30:38 -08:00
7090eea716 Merge pull request #3069 from Infisical/oidc-group-membership-mapping
Feature: OIDC Group Membership Mapping
2025-01-31 11:32:38 -08:00
01d3443139 improvement: update docker dev and makefile for keycloak dev 2025-01-31 11:14:49 -08:00
c4b23a8d4f improvement: improve grammar 2025-01-31 11:05:56 -08:00
90a2a11fff improvement: update tooltips 2025-01-31 11:04:20 -08:00
95d7c2082c improvements: address feedback 2025-01-31 11:01:54 -08:00
ab5eb4c696 Merge pull request #3070 from Infisical/misc/readded-operator-installation-flag
misc: readded operator installation flag for secret CRD
2025-01-31 16:53:57 +08:00
65aeb81934 Merge pull request #3011 from xinbenlv/patch-1
Fix grammar on overview.mdx
2025-01-31 14:22:03 +05:30
a406511405 Merge pull request #3048 from isaiahmartin847/refactor/copy-secret
Improve Visibility and Alignment of Tooltips and Copy Secret Key Icon
2025-01-31 14:20:02 +05:30
61da0db49e misc: readded operator installation flag for CRD 2025-01-31 16:03:42 +08:00
59666740ca chore: revert license and remove unused query key/doc reference 2025-01-30 10:35:23 -08:00
9cc7edc869 feature: oidc group membership mapping 2025-01-30 10:21:30 -08:00
e1b016f76d Merge pull request #3068 from nicogiard/patch-1
fix: wrong client variable in c# code example
2025-01-29 22:24:03 +01:00
1175b9b5af fix: wrong client variable
The InfisicalClient variable was wrong
2025-01-29 21:57:57 +01:00
09521144ec Merge pull request #3066 from akhilmhdh/fix/secret-list-plain
Resolved list secret plain to have key as well
2025-01-29 14:04:49 -05:00
8759944077 feat: resolved list secret plain to have key as well 2025-01-30 00:31:47 +05:30
aac3c355e9 Merge pull request #3061 from Infisical/secret-sync-ui-doc-improvements
improvements: Import Behavior Doc/UI Clarification and Minor Integration Layout Adjustments
2025-01-29 13:16:21 -05:00
2a28a462a5 Merge pull request #3053 from Infisical/daniel/k8s-insight
k8s: bug fixes and better prints
2025-01-29 23:16:46 +05:30
3328e0850f improvements: revise descriptions 2025-01-29 09:44:46 -08:00
216cae9b33 Merge pull request #3058 from Infisical/misc/improved-helper-text-for-gcp-sa-field
misc: improved helper text for GCP sa field
2025-01-29 09:54:20 -05:00
89d4d4bc92 Merge pull request #3064 from akhilmhdh/fix/secret-path-validation-permission
feat: added validation for secret path in permission
2025-01-29 18:46:38 +05:30
cffcb28bc9 feat: removed secret path check in glob 2025-01-29 17:50:02 +05:30
61388753cf feat: updated to support in error in ui 2025-01-29 17:32:13 +05:30
a6145120e6 feat: added validation for secret path in permission 2025-01-29 17:01:45 +05:30
dacffbef08 doc: documentation updates for gcp app connection 2025-01-29 18:12:17 +08:00
4db3e5d208 Merge remote-tracking branch 'origin/main' into misc/improved-helper-text-for-gcp-sa-field 2025-01-29 17:43:48 +08:00
2a84d61862 add guide for how to write a design doc 2025-01-28 23:31:12 -05:00
a5945204ad improvements: import behavior clarification and minor integration layout adjustments 2025-01-28 19:09:43 -08:00
e99eb47cf4 Merge pull request #3059 from Infisical/minor-doc-adjustments
Improvements: Integration Docs Nav Bar Reorder & Azure Integration Logo fix
2025-01-28 14:14:54 -08:00
cf107c0c0d improvements: change integration nav bar order and correct azure integrations image references 2025-01-28 12:51:24 -08:00
9fcb1c2161 misc: added emphasis on suffix 2025-01-29 04:38:16 +08:00
70515a1ca2 Merge pull request #3045 from Infisical/daniel/auditlogs-secret-path-query
feat(audit-logs): query by secret path
2025-01-28 21:17:42 +01:00
955cf9303a Merge pull request #3052 from Infisical/set-password-feature
Feature: Setup Password
2025-01-28 12:08:24 -08:00
a24ef46d7d requested changes 2025-01-28 20:44:45 +01:00
ee49f714b9 misc: added valid example to error thrown for sa mismatch 2025-01-29 03:41:24 +08:00
657aca516f Merge pull request #3049 from Infisical/daniel/vercel-custom-envs
feat(integrations/vercel): custom environments support
2025-01-28 20:38:40 +01:00
b5d60398d6 misc: improved helper text for GCP sa field 2025-01-29 03:10:37 +08:00
c3d515bb95 Merge pull request #3039 from Infisical/feat/gcp-secret-sync
feat: gcp app connections and secret sync
2025-01-29 02:23:22 +08:00
7f89a7c860 Merge remote-tracking branch 'origin/main' into feat/gcp-secret-sync 2025-01-29 01:57:54 +08:00
23cb05c16d misc: added support for copy suffix 2025-01-29 01:55:15 +08:00
d74b819f57 improvements: make logged in status disclaimer in email more prominent and only add email auth method if not already present 2025-01-28 09:53:40 -08:00
457056b600 misc: added handling for empty values 2025-01-29 01:41:59 +08:00
7dc9ea4f6a update notice 2025-01-28 11:48:21 -05:00
3b4b520d42 Merge pull request #3055 from Quintasan/patch-1
Update Docker .env examples to reflect `SMTP_FROM` changes
2025-01-28 11:29:07 -05:00
23f605bda7 misc: added credential hash 2025-01-28 22:37:27 +08:00
1c3c8dbdce Update Docker .env files to reflect SMTP_FROM split 2025-01-28 10:57:09 +00:00
317c95384e misc: added secondary text 2025-01-28 16:48:06 +08:00
7dd959e124 misc: readded file 2025-01-28 16:40:17 +08:00
2049e5668f misc: deleted file 2025-01-28 16:39:05 +08:00
0a3e99b334 misc: added import support and a few ui/ux updates 2025-01-28 16:36:56 +08:00
c4ad0aa163 Merge pull request #3054 from Infisical/infisicalk8s-ha
K8s HA reference docs
2025-01-28 02:56:22 -05:00
5bb0b7a508 K8s HA reference docs
A complete guide to k8s HA reference docs
2025-01-28 02:53:02 -05:00
96bcd42753 Merge pull request #3029 from akhilmhdh/feat/min-ttl
Resolved ttl and max ttl to be zero
2025-01-28 12:00:28 +05:30
2c75e23acf helm 2025-01-28 04:21:29 +01:00
907dd4880a fix(k8): reconcile on status update 2025-01-28 04:20:51 +01:00
6af7c5c371 improvements: remove removed property reference and remove excess padding/margin on secret sync pages 2025-01-27 19:12:05 -08:00
72468d5428 feature: setup password 2025-01-27 18:51:35 -08:00
939ee892e0 chore: cleanup 2025-01-28 01:02:18 +01:00
c7ec9ff816 Merge pull request #3050 from Infisical/daniel/k8-logs
feat(k8-operator): better error status
2025-01-27 23:53:23 +01:00
554e268f88 chore: update helm 2025-01-27 23:51:08 +01:00
a8a27c3045 feat(k8-operator): better error status 2025-01-27 23:48:20 +01:00
27af943ee1 Update integration-sync-secret.ts 2025-01-27 23:18:46 +01:00
9b772ad55a Update VercelConfigurePage.tsx 2025-01-27 23:11:57 +01:00
94a1fc2809 chore: cleanup 2025-01-27 23:11:14 +01:00
10c10642a1 feat(integrations/vercel): custom environments support 2025-01-27 23:08:47 +01:00
3e0f04273c feat: resolved merge conflict 2025-01-28 02:01:24 +05:30
91f2d0384e feat: updated router to validate max ttl and ttl 2025-01-28 01:57:15 +05:30
811dc8dd75 fix: changed accessTokenMaxTTL in expireAt to accessTokenTTL 2025-01-28 01:57:15 +05:30
4ee9375a8d fix: resolved min and max ttl to be zero 2025-01-28 01:57:15 +05:30
92f697e195 I removed the hover opacity on the 'copy secret name' icon so the icon is always visible instead of appearing only on hover. I believe this will make it more noticeable to users.
As a user myself, I didn't realize it was possible to copy a secret name until I accidentally hovered over it.
2025-01-27 12:26:22 -07:00
8062f0238b I added a wrapper div with a class of relative to make the icon and tooltip align vertically inline. 2025-01-27 12:25:38 -07:00
1181c684db Merge pull request #3036 from Infisical/identity-auth-ui-improvements
Improvement: Overhaul Identity Auth UI Section
2025-01-27 10:51:39 -05:00
dda436bcd9 Merge pull request #3046 from akhilmhdh/fix/breadcrumb-bug-github
fix: resolved github breadcrumb issue
2025-01-27 20:36:06 +05:30
89124b18d2 fix: resolved github breadcrumb issue 2025-01-27 20:29:06 +05:30
effd88c4bd misc: improved doc wording 2025-01-27 22:57:16 +08:00
27efc908e2 feat(audit-logs): query by secret path 2025-01-27 15:53:07 +01:00
8e4226038b doc: add api enablement to docs 2025-01-27 22:51:49 +08:00
27425a1a64 fix: addressed hover effect for secret path input 2025-01-27 22:03:46 +08:00
18cf3c89c1 misc: renamed enum 2025-01-27 21:47:27 +08:00
49e6d7a861 misc: finalized endpoint and doc 2025-01-27 21:33:48 +08:00
c4446389b0 doc: add docs for gcp secret manager secret sync 2025-01-27 20:47:47 +08:00
7c21dec54d doc: add docs for gcp app connection 2025-01-27 19:32:02 +08:00
2ea5710896 misc: addressed lint issues 2025-01-27 17:33:01 +08:00
f9ac7442df misc: added validation against confused deputy 2025-01-27 17:30:26 +08:00
a534a4975c chore: revert license 2025-01-24 20:50:54 -08:00
79a616dc1c improvements: address feedback 2025-01-24 20:21:21 -08:00
a93bfa69c9 Merge pull request #3042 from Infisical/daniel/fix-approvals-for-personal-secrets
fix: approvals triggering for personal secrets
2025-01-25 04:50:19 +01:00
598d14fc54 improvement: move edit/delete identity buttons to dropdown 2025-01-24 19:34:03 -08:00
08a0550cd7 fix: correct dependency array 2025-01-24 19:21:33 -08:00
d7503573b1 Merge pull request #3041 from Infisical/daniel/remove-caching-from-docs
docs: update node guide and remove cache references
2025-01-25 04:15:53 +01:00
b5a89edeed Update node.mdx 2025-01-25 03:59:06 +01:00
860eaae4c8 fix: approvals triggering for personal secrets 2025-01-25 03:44:43 +01:00
c7a4b6c4e9 docs: update node guide and remove cache references 2025-01-25 03:12:36 +01:00
c12c6dcc6e Merge pull request #2987 from Infisical/daniel/k8s-multi-managed-secrets
feat(k8-operator/infisicalsecret-crd): multiple secret references
2025-01-25 02:59:07 +01:00
99c9b644df improvements: address feedback 2025-01-24 12:55:56 -08:00
d0d5556bd0 feat: gcp integration sync and removal 2025-01-25 04:04:38 +08:00
753c28a2d3 feat: gcp secret sync management 2025-01-25 03:01:10 +08:00
8741414cfa Update routeTree.gen.ts 2025-01-24 18:28:48 +01:00
b8d29793ec fix: rename managedSecretReferneces to managedKubeSecretReferences 2025-01-24 18:26:56 +01:00
92013dbfbc fix: routes 2025-01-24 18:26:34 +01:00
c5319588fe chore: fix routes generation 2025-01-24 18:26:23 +01:00
9efb8eaf78 Update infisical-secret-crd.mdx 2025-01-24 18:24:26 +01:00
dfc973c7f7 chore(k8-operator): update helm 2025-01-24 18:24:26 +01:00
3013d1977c docs(k8-operator): updated infisicalsecret crd docs 2025-01-24 18:24:26 +01:00
f358e8942d feat(k8-operator): multiple managed secrets 2025-01-24 18:24:26 +01:00
58f51411c0 feat: gcp secret sync 2025-01-24 22:33:56 +08:00
c3970d1ea2 Merge pull request #3038 from isaiahmartin847/typo-fix/Role-based-Access-Controls
Fixed the typo in the Role-based Access Controls docs.
2025-01-24 01:30:34 -05:00
2dc00a638a fixed the typo in the /access-controls/role-based-access-controls page in the docs. 2025-01-23 23:15:40 -07:00
94aed485a5 chore: optimize imports 2025-01-23 12:22:40 -08:00
e382941424 improvement: overhaul identity auth ui section 2025-01-23 12:18:09 -08:00
bab9c1f454 Merge pull request #3024 from Infisical/team-city-integration-fix
Fix: UI Fix for Team City Integrations Create Page
2025-01-23 18:14:32 +01:00
2bd4770fb4 Merge pull request #3035 from akhilmhdh/fix/env-ui
feat: updated ui validation for env to 64 like api
2025-01-23 16:32:04 +05:30
31905fab6e feat: updated ui validation for env to 64 like api 2025-01-23 16:26:13 +05:30
784acf16d0 Merge pull request #3032 from Infisical/correct-app-connections-docs
Improvements: Minor Secret Sync improvements and Correct App Connections Env Vars and Move Sync/Connections to Groups in Docs
2025-01-23 03:29:33 -05:00
114b89c952 Merge pull request #3033 from Infisical/daniel/update-python-docs
docs(guides): updated python guide
2025-01-23 03:28:11 -05:00
81420198cb fix: display aws connection credentials error and sync status on details page 2025-01-22 21:00:01 -08:00
0ff18e277f docs: redact info in image 2025-01-22 20:02:03 -08:00
e093f70301 docs: add new aws connection images 2025-01-22 19:58:24 -08:00
8e2ff18f35 docs: improve aws connection docs 2025-01-22 19:58:06 -08:00
3fbfecf7a9 docs: correct aws env vars in aws connection self-hosted docs 2025-01-22 18:46:36 -08:00
9087def21c docs: correct github connection env vars and move connections and syncs to group 2025-01-22 18:40:24 -08:00
586dbd79b0 fix: fix team city integrations create page 2025-01-21 18:37:01 -08:00
645dfafba0 Fix grammar on overview.mdx 2025-01-20 09:02:18 -08:00
280 changed files with 9115 additions and 4746 deletions

View File

@@ -26,7 +26,8 @@ SITE_URL=http://localhost:8080
 # Mail/SMTP
 SMTP_HOST=
 SMTP_PORT=
-SMTP_NAME=
+SMTP_FROM_ADDRESS=
+SMTP_FROM_NAME=
 SMTP_USERNAME=
 SMTP_PASSWORD=
@@ -105,3 +106,6 @@ INF_APP_CONNECTION_GITHUB_APP_CLIENT_SECRET=
 INF_APP_CONNECTION_GITHUB_APP_PRIVATE_KEY=
 INF_APP_CONNECTION_GITHUB_APP_SLUG=
 INF_APP_CONNECTION_GITHUB_APP_ID=
+#gcp app
+INF_APP_CONNECTION_GCP_SERVICE_ACCOUNT_CREDENTIAL=

View File

@@ -1,4 +1,4 @@
-name: Release Helm Charts
+name: Release Infisical Core Helm chart
 on: [workflow_dispatch]
@@ -17,6 +17,6 @@ jobs:
 - name: Install Cloudsmith CLI
 run: pip install --upgrade cloudsmith-cli
 - name: Build and push helm package to Cloudsmith
-run: cd helm-charts && sh upload-to-cloudsmith.sh
+run: cd helm-charts && sh upload-infisical-core-helm-cloudsmith.sh
 env:
 CLOUDSMITH_API_KEY: ${{ secrets.CLOUDSMITH_API_KEY }}

View File

@@ -1,4 +1,4 @@
-name: Release Docker image for K8 operator
+name: Release image + Helm chart K8s Operator
 on:
 push:
 tags:
@@ -35,3 +35,18 @@ jobs:
 tags: |
 infisical/kubernetes-operator:latest
 infisical/kubernetes-operator:${{ steps.extract_version.outputs.version }}
+- name: Checkout
+uses: actions/checkout@v2
+- name: Install Helm
+uses: azure/setup-helm@v3
+with:
+version: v3.10.0
+- name: Install python
+uses: actions/setup-python@v4
+- name: Install Cloudsmith CLI
+run: pip install --upgrade cloudsmith-cli
+- name: Build and push helm package to Cloudsmith
+run: cd helm-charts && sh upload-k8s-operator-cloudsmith.sh
+env:
+CLOUDSMITH_API_KEY: ${{ secrets.CLOUDSMITH_API_KEY }}

View File

@@ -30,3 +30,6 @@ reviewable-api:
 npm run type:check
 reviewable: reviewable-ui reviewable-api
+up-dev-sso:
+docker compose -f docker-compose.dev.yml --profile sso up --build

View File

@@ -125,7 +125,7 @@ Install pre commit hook to scan each commit before you push to your repository
 infisical scan install --pre-commit-hook
 ```
-Lean about Infisical's code scanning feature [here](https://infisical.com/docs/cli/scanning-overview)
+Learn about Infisical's code scanning feature [here](https://infisical.com/docs/cli/scanning-overview)
 ## Open-source vs. paid

View File

@@ -0,0 +1,23 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
export async function up(knex: Knex): Promise<void> {
const hasManageGroupMembershipsCol = await knex.schema.hasColumn(TableName.OidcConfig, "manageGroupMemberships");
await knex.schema.alterTable(TableName.OidcConfig, (tb) => {
if (!hasManageGroupMembershipsCol) {
tb.boolean("manageGroupMemberships").notNullable().defaultTo(false);
}
});
}
export async function down(knex: Knex): Promise<void> {
const hasManageGroupMembershipsCol = await knex.schema.hasColumn(TableName.OidcConfig, "manageGroupMemberships");
await knex.schema.alterTable(TableName.OidcConfig, (t) => {
if (hasManageGroupMembershipsCol) {
t.dropColumn("manageGroupMemberships");
}
});
}

View File

@@ -0,0 +1,23 @@
import { Knex } from "knex";
import { TableName } from "@app/db/schemas";
export async function up(knex: Knex): Promise<void> {
await knex.schema.alterTable(TableName.AppConnection, (t) => {
t.unique(["orgId", "name"]);
});
await knex.schema.alterTable(TableName.SecretSync, (t) => {
t.unique(["projectId", "name"]);
});
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.alterTable(TableName.AppConnection, (t) => {
t.dropUnique(["orgId", "name"]);
});
await knex.schema.alterTable(TableName.SecretSync, (t) => {
t.dropUnique(["projectId", "name"]);
});
}

View File

@@ -27,7 +27,8 @@ export const OidcConfigsSchema = z.object({
 createdAt: z.date(),
 updatedAt: z.date(),
 orgId: z.string().uuid(),
-lastUsed: z.date().nullable().optional()
+lastUsed: z.date().nullable().optional(),
+manageGroupMemberships: z.boolean().default(false)
 });
 export type TOidcConfigs = z.infer<typeof OidcConfigsSchema>;

View File

@@ -153,7 +153,8 @@ export const registerOidcRouter = async (server: FastifyZodProvider) => {
 discoveryURL: true,
 isActive: true,
 orgId: true,
-allowedEmailDomains: true
+allowedEmailDomains: true,
+manageGroupMemberships: true
 }).extend({
 clientId: z.string(),
 clientSecret: z.string()
@@ -207,7 +208,8 @@ export const registerOidcRouter = async (server: FastifyZodProvider) => {
 userinfoEndpoint: z.string().trim(),
 clientId: z.string().trim(),
 clientSecret: z.string().trim(),
-isActive: z.boolean()
+isActive: z.boolean(),
+manageGroupMemberships: z.boolean().optional()
 })
 .partial()
 .merge(z.object({ orgSlug: z.string() })),
@@ -223,7 +225,8 @@ export const registerOidcRouter = async (server: FastifyZodProvider) => {
 userinfoEndpoint: true,
 orgId: true,
 allowedEmailDomains: true,
-isActive: true
+isActive: true,
+manageGroupMemberships: true
 })
 }
 },
@@ -272,7 +275,8 @@ export const registerOidcRouter = async (server: FastifyZodProvider) => {
 clientId: z.string().trim(),
 clientSecret: z.string().trim(),
 isActive: z.boolean(),
-orgSlug: z.string().trim()
+orgSlug: z.string().trim(),
+manageGroupMemberships: z.boolean().optional().default(false)
 })
 .superRefine((data, ctx) => {
 if (data.configurationType === OIDCConfigurationType.CUSTOM) {
@@ -334,7 +338,8 @@ export const registerOidcRouter = async (server: FastifyZodProvider) => {
 userinfoEndpoint: true,
 orgId: true,
 isActive: true,
-allowedEmailDomains: true
+allowedEmailDomains: true,
+manageGroupMemberships: true
 })
 }
 },
@@ -350,4 +355,25 @@ export const registerOidcRouter = async (server: FastifyZodProvider) => {
 return oidc;
 }
 });
+server.route({
+method: "GET",
+url: "/manage-group-memberships",
+schema: {
+querystring: z.object({
+orgId: z.string().trim().min(1, "Org ID is required")
+}),
+response: {
+200: z.object({
+isEnabled: z.boolean()
+})
+}
+},
+onRequest: verifyAuth([AuthMode.JWT]),
+handler: async (req) => {
+const isEnabled = await server.services.oidc.isOidcManageGroupMembershipsEnabled(req.query.orgId, req.permission);
+return { isEnabled };
+}
+});
 };

View File

@@ -39,11 +39,13 @@ export const auditLogDALFactory = (db: TDbClient) => {
 offset = 0,
 actorId,
 actorType,
+secretPath,
 eventType,
 eventMetadata
 }: Omit<TFindQuery, "actor" | "eventType"> & {
 actorId?: string;
 actorType?: ActorType;
+secretPath?: string;
 eventType?: EventType[];
 eventMetadata?: Record<string, string>;
 },
@@ -88,6 +90,10 @@ export const auditLogDALFactory = (db: TDbClient) => {
 });
 }
+if (projectId && secretPath) {
+void sqlQuery.whereRaw(`"eventMetadata" @> jsonb_build_object('secretPath', ?::text)`, [secretPath]);
+}
 // Filter by actor type
 if (actorType) {
 void sqlQuery.where("actor", actorType);

View File

@@ -46,10 +46,6 @@ export const auditLogServiceFactory = ({
 actorOrgId
 );
-/**
-* NOTE (dangtony98): Update this to organization-level audit log permission check once audit logs are moved
-* to the organization level ✅
-*/
 ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.AuditLogs);
 }
@@ -64,6 +60,7 @@ export const auditLogServiceFactory = ({
 actorId: filter.auditLogActorId,
 actorType: filter.actorType,
 eventMetadata: filter.eventMetadata,
+secretPath: filter.secretPath,
 ...(filter.projectId ? { projectId: filter.projectId } : { orgId: actorOrgId })
 });

View File

@@ -32,6 +32,7 @@ export type TListProjectAuditLogDTO = {
 projectId?: string;
 auditLogActorId?: string;
 actorType?: ActorType;
+secretPath?: string;
 eventMetadata?: Record<string, string>;
 };
 } & Omit<TProjectPermission, "projectId">;
@@ -248,7 +249,9 @@ export enum EventType {
 DELETE_SECRET_SYNC = "delete-secret-sync",
 SECRET_SYNC_SYNC_SECRETS = "secret-sync-sync-secrets",
 SECRET_SYNC_IMPORT_SECRETS = "secret-sync-import-secrets",
-SECRET_SYNC_REMOVE_SECRETS = "secret-sync-remove-secrets"
+SECRET_SYNC_REMOVE_SECRETS = "secret-sync-remove-secrets",
+OIDC_GROUP_MEMBERSHIP_MAPPING_ASSIGN_USER = "oidc-group-membership-mapping-assign-user",
+OIDC_GROUP_MEMBERSHIP_MAPPING_REMOVE_USER = "oidc-group-membership-mapping-remove-user"
 }
 interface UserActorMetadata {
@@ -2043,6 +2046,26 @@ interface SecretSyncRemoveSecretsEvent {
 };
 }
+interface OidcGroupMembershipMappingAssignUserEvent {
+type: EventType.OIDC_GROUP_MEMBERSHIP_MAPPING_ASSIGN_USER;
+metadata: {
+assignedToGroups: { id: string; name: string }[];
+userId: string;
+userEmail: string;
+userGroupsClaim: string[];
+};
+}
+interface OidcGroupMembershipMappingRemoveUserEvent {
+type: EventType.OIDC_GROUP_MEMBERSHIP_MAPPING_REMOVE_USER;
+metadata: {
+removedFromGroups: { id: string; name: string }[];
+userId: string;
+userEmail: string;
+userGroupsClaim: string[];
+};
+}
 export type Event =
 | GetSecretsEvent
 | GetSecretEvent
@@ -2231,4 +2254,6 @@ export type Event =
 | DeleteSecretSyncEvent
 | SecretSyncSyncSecretsEvent
 | SecretSyncImportSecretsEvent
-| SecretSyncRemoveSecretsEvent;
+| SecretSyncRemoveSecretsEvent
+| OidcGroupMembershipMappingAssignUserEvent
+| OidcGroupMembershipMappingRemoveUserEvent;

View File

@@ -2,6 +2,7 @@ import { ForbiddenError } from "@casl/ability";
 import slugify from "@sindresorhus/slugify";
 import { OrgMembershipRole, TOrgRoles } from "@app/db/schemas";
+import { TOidcConfigDALFactory } from "@app/ee/services/oidc/oidc-config-dal";
 import { isAtLeastAsPrivileged } from "@app/lib/casl";
 import { BadRequestError, ForbiddenRequestError, NotFoundError, UnauthorizedError } from "@app/lib/errors";
 import { alphaNumericNanoId } from "@app/lib/nanoid";
@@ -45,6 +46,7 @@ type TGroupServiceFactoryDep = {
 projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "delete" | "findLatestProjectKey" | "insertMany">;
 permissionService: Pick<TPermissionServiceFactory, "getOrgPermission" | "getOrgPermissionByRole">;
 licenseService: Pick<TLicenseServiceFactory, "getPlan">;
+oidcConfigDAL: Pick<TOidcConfigDALFactory, "findOne">;
 };
 export type TGroupServiceFactory = ReturnType<typeof groupServiceFactory>;
@@ -59,7 +61,8 @@ export const groupServiceFactory = ({
 projectBotDAL,
 projectKeyDAL,
 permissionService,
-licenseService
+licenseService,
+oidcConfigDAL
 }: TGroupServiceFactoryDep) => {
 const createGroup = async ({ name, slug, role, actor, actorId, actorAuthMethod, actorOrgId }: TCreateGroupDTO) => {
 if (!actorOrgId) throw new UnauthorizedError({ message: "No organization ID provided in request" });
@@ -311,6 +314,18 @@ export const groupServiceFactory = ({
 message: `Failed to find group with ID ${id}`
 });
+const oidcConfig = await oidcConfigDAL.findOne({
+orgId: group.orgId,
+isActive: true
+});
+if (oidcConfig?.manageGroupMemberships) {
+throw new BadRequestError({
+message:
+"Cannot add user to group: OIDC group membership mapping is enabled - user must be assigned to this group in your OIDC provider."
+});
+}
 const { permission: groupRolePermission } = await permissionService.getOrgPermissionByRole(group.role, actorOrgId);
 // check if user has broader or equal to privileges than group
@@ -366,6 +381,18 @@ export const groupServiceFactory = ({
 message: `Failed to find group with ID ${id}`
 });
+const oidcConfig = await oidcConfigDAL.findOne({
+orgId: group.orgId,
+isActive: true
+});
+if (oidcConfig?.manageGroupMemberships) {
+throw new BadRequestError({
+message:
+"Cannot remove user from group: OIDC group membership mapping is enabled - user must be removed from this group in your OIDC provider."
+});
+}
 const { permission: groupRolePermission } = await permissionService.getOrgPermissionByRole(group.role, actorOrgId);
 // check if user has broader or equal to privileges than group

View File

@@ -5,6 +5,11 @@ import { Issuer, Issuer as OpenIdIssuer, Strategy as OpenIdStrategy, TokenSet }
 import { OrgMembershipStatus, SecretKeyEncoding, TableName, TUsers } from "@app/db/schemas";
 import { TOidcConfigsUpdate } from "@app/db/schemas/oidc-configs";
+import { TAuditLogServiceFactory } from "@app/ee/services/audit-log/audit-log-service";
+import { EventType } from "@app/ee/services/audit-log/audit-log-types";
+import { TGroupDALFactory } from "@app/ee/services/group/group-dal";
+import { addUsersToGroupByUserIds, removeUsersFromGroupByUserIds } from "@app/ee/services/group/group-fns";
+import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
 import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
 import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
 import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
@@ -18,13 +23,18 @@ import {
 infisicalSymmetricEncypt
 } from "@app/lib/crypto/encryption";
 import { BadRequestError, ForbiddenRequestError, NotFoundError, OidcAuthError } from "@app/lib/errors";
-import { AuthMethod, AuthTokenType } from "@app/services/auth/auth-type";
+import { OrgServiceActor } from "@app/lib/types";
+import { ActorType, AuthMethod, AuthTokenType } from "@app/services/auth/auth-type";
 import { TAuthTokenServiceFactory } from "@app/services/auth-token/auth-token-service";
 import { TokenType } from "@app/services/auth-token/auth-token-types";
+import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
 import { TOrgBotDALFactory } from "@app/services/org/org-bot-dal";
 import { TOrgDALFactory } from "@app/services/org/org-dal";
 import { getDefaultOrgMembershipRole } from "@app/services/org/org-role-fns";
 import { TOrgMembershipDALFactory } from "@app/services/org-membership/org-membership-dal";
+import { TProjectDALFactory } from "@app/services/project/project-dal";
+import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
+import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
 import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
 import { getServerCfg } from "@app/services/super-admin/super-admin-service";
 import { LoginMethod } from "@app/services/super-admin/super-admin-types";
@@ -45,7 +55,14 @@ import {
 type TOidcConfigServiceFactoryDep = {
 userDAL: Pick<
 TUserDALFactory,
-"create" | "findOne" | "transaction" | "updateById" | "findById" | "findUserEncKeyByUserId"
+| "create"
+| "findOne"
+| "updateById"
+| "findById"
+| "findUserEncKeyByUserId"
+| "findUserEncKeyByUserIdsBatch"
+| "find"
+| "transaction"
 >;
 userAliasDAL: Pick<TUserAliasDALFactory, "create" | "findOne">;
 orgDAL: Pick<
@@ -57,8 +74,23 @@ type TOidcConfigServiceFactoryDep = {
 licenseService: Pick<TLicenseServiceFactory, "getPlan" | "updateSubscriptionOrgMemberCount">;
 tokenService: Pick<TAuthTokenServiceFactory, "createTokenForUser">;
 smtpService: Pick<TSmtpService, "sendMail" | "verify">;
-permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
+permissionService: Pick<TPermissionServiceFactory, "getOrgPermission" | "getUserOrgPermission">;
 oidcConfigDAL: Pick<TOidcConfigDALFactory, "findOne" | "update" | "create">;
+groupDAL: Pick<TGroupDALFactory, "findByOrgId">;
+userGroupMembershipDAL: Pick<
+TUserGroupMembershipDALFactory,
+| "find"
+| "transaction"
+| "insertMany"
+| "findGroupMembershipsByUserIdInOrg"
+| "delete"
+| "filterProjectsByUserMembership"
+>;
+groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
+projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany" | "delete">;
+projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
+projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
+auditLogService: Pick<TAuditLogServiceFactory, "createAuditLog">;
 };
 export type TOidcConfigServiceFactory = ReturnType<typeof oidcConfigServiceFactory>;
@@ -73,7 +105,14 @@ export const oidcConfigServiceFactory = ({
 tokenService,
 orgBotDAL,
 smtpService,
-oidcConfigDAL
+oidcConfigDAL,
+userGroupMembershipDAL,
+groupDAL,
+groupProjectDAL,
+projectKeyDAL,
+projectDAL,
+projectBotDAL,
+auditLogService
 }: TOidcConfigServiceFactoryDep) => {
 const getOidc = async (dto: TGetOidcCfgDTO) => {
 const org = await orgDAL.findOne({ slug: dto.orgSlug });
@@ -156,11 +195,21 @@ export const oidcConfigServiceFactory = ({
 isActive: oidcCfg.isActive,
 allowedEmailDomains: oidcCfg.allowedEmailDomains,
 clientId,
-clientSecret
+clientSecret,
+manageGroupMemberships: oidcCfg.manageGroupMemberships
 };
 };
-const oidcLogin = async ({ externalId, email, firstName, lastName, orgId, callbackPort }: TOidcLoginDTO) => {
+const oidcLogin = async ({
+externalId,
+email,
+firstName,
+lastName,
+orgId,
+callbackPort,
+groups = [],
+manageGroupMemberships
+}: TOidcLoginDTO) => {
 const serverCfg = await getServerCfg();
 if (serverCfg.enabledLoginMethods && !serverCfg.enabledLoginMethods.includes(LoginMethod.OIDC)) {
@@ -315,6 +364,83 @@ export const oidcConfigServiceFactory = ({
 });
 }
+if (manageGroupMemberships) {
+const userGroups = await userGroupMembershipDAL.findGroupMembershipsByUserIdInOrg(user.id, orgId);
+const orgGroups = await groupDAL.findByOrgId(orgId);
+const userGroupsNames = userGroups.map((membership) => membership.groupName);
+const missingGroupsMemberships = groups.filter((groupName) => !userGroupsNames.includes(groupName));
+const groupsToAddUserTo = orgGroups.filter((group) => missingGroupsMemberships.includes(group.name));
+for await (const group of groupsToAddUserTo) {
+await addUsersToGroupByUserIds({
+userIds: [user.id],
+group,
+userDAL,
+userGroupMembershipDAL,
+orgDAL,
+groupProjectDAL,
+projectKeyDAL,
+projectDAL,
+projectBotDAL
+});
+}
+if (groupsToAddUserTo.length) {
+await auditLogService.createAuditLog({
+actor: {
+type: ActorType.PLATFORM,
+metadata: {}
+},
+orgId,
+event: {
+type: EventType.OIDC_GROUP_MEMBERSHIP_MAPPING_ASSIGN_USER,
+metadata: {
+userId: user.id,
+userEmail: user.email ?? user.username,
+assignedToGroups: groupsToAddUserTo.map(({ id, name }) => ({ id, name })),
+userGroupsClaim: groups
+}
+}
+});
+}
+const membershipsToRemove = userGroups
+.filter((membership) => !groups.includes(membership.groupName))
+.map((membership) => membership.groupId);
+const groupsToRemoveUserFrom = orgGroups.filter((group) => membershipsToRemove.includes(group.id));
+for await (const group of groupsToRemoveUserFrom) {
+await removeUsersFromGroupByUserIds({
+userIds: [user.id],
+group,
+userDAL,
+userGroupMembershipDAL,
+groupProjectDAL,
+projectKeyDAL
+});
+}
+if (groupsToRemoveUserFrom.length) {
+await auditLogService.createAuditLog({
+actor: {
+type: ActorType.PLATFORM,
+metadata: {}
+},
+orgId,
+event: {
+type: EventType.OIDC_GROUP_MEMBERSHIP_MAPPING_REMOVE_USER,
+metadata: {
+userId: user.id,
+userEmail: user.email ?? user.username,
+removedFromGroups: groupsToRemoveUserFrom.map(({ id, name }) => ({ id, name })),
+userGroupsClaim: groups
+}
+}
+});
+}
+}
 await licenseService.updateSubscriptionOrgMemberCount(organization.id);
 const userEnc = await userDAL.findUserEncKeyByUserId(user.id);
@@ -385,7 +511,8 @@ export const oidcConfigServiceFactory = ({
 tokenEndpoint,
 userinfoEndpoint,
 clientId,
-clientSecret
+clientSecret,
+manageGroupMemberships
 }: TUpdateOidcCfgDTO) => {
 const org = await orgDAL.findOne({
 slug: orgSlug
@@ -448,7 +575,8 @@ export const oidcConfigServiceFactory = ({
 userinfoEndpoint,
 jwksUri,
 isActive,
-lastUsed: null
+lastUsed: null,
+manageGroupMemberships
 };
 if (clientId !== undefined) {
@@ -491,7 +619,8 @@ export const oidcConfigServiceFactory = ({
 tokenEndpoint,
 userinfoEndpoint,
 clientId,
-clientSecret
+clientSecret,
+manageGroupMemberships
 }: TCreateOidcCfgDTO) => {
 const org = await orgDAL.findOne({
 slug: orgSlug
@@ -589,7 +718,8 @@ export const oidcConfigServiceFactory = ({
 clientIdTag,
 encryptedClientSecret,
 clientSecretIV,
-clientSecretTag
+clientSecretTag,
+manageGroupMemberships
 });
 return oidcCfg;
@@ -683,7 +813,9 @@ export const oidcConfigServiceFactory = ({
 firstName: claims.given_name ?? "",
 lastName: claims.family_name ?? "",
 orgId: org.id,
-callbackPort
+groups: claims.groups as string[] | undefined,
+callbackPort,
+manageGroupMemberships: oidcCfg.manageGroupMemberships
 })
 .then(({ isUserCompleted, providerAuthToken }) => {
 cb(null, { isUserCompleted, providerAuthToken });
@@ -697,5 +829,16 @@ export const oidcConfigServiceFactory = ({
 return strategy;
 };
-return { oidcLogin, getOrgAuthStrategy, getOidc, updateOidcCfg, createOidcCfg };
+const isOidcManageGroupMembershipsEnabled = async (orgId: string, actor: OrgServiceActor) => {
+await permissionService.getUserOrgPermission(actor.id, orgId, actor.authMethod, actor.orgId);
+const oidcConfig = await oidcConfigDAL.findOne({
+orgId,
+isActive: true
+});
+return Boolean(oidcConfig?.manageGroupMemberships);
+};
+return { oidcLogin, getOrgAuthStrategy, getOidc, updateOidcCfg, createOidcCfg, isOidcManageGroupMembershipsEnabled };
 };

View File

@@ -12,6 +12,8 @@ export type TOidcLoginDTO = {
 lastName?: string;
 orgId: string;
 callbackPort?: string;
+groups?: string[];
+manageGroupMemberships?: boolean | null;
 };
 export type TGetOidcCfgDTO =
@@ -37,6 +39,7 @@ export type TCreateOidcCfgDTO = {
 clientSecret: string;
 isActive: boolean;
 orgSlug: string;
+manageGroupMemberships: boolean;
 } & TGenericPermission;
 export type TUpdateOidcCfgDTO = Partial<{
@@ -52,5 +55,6 @@ export type TUpdateOidcCfgDTO = Partial<{
 clientSecret: string;
 isActive: boolean;
 orgSlug: string;
+manageGroupMemberships: boolean;
 }> &
 TGenericPermission;

View File

@@ -163,6 +163,27 @@ export type ProjectPermissionSet =
 | [ProjectPermissionActions.Create, ProjectPermissionSub.SecretRollback]
 | [ProjectPermissionActions.Edit, ProjectPermissionSub.Kms];
+const SECRET_PATH_MISSING_SLASH_ERR_MSG = "Invalid Secret Path; it must start with a '/'";
+const SECRET_PATH_PERMISSION_OPERATOR_SCHEMA = z.union([
+z.string().refine((val) => val.startsWith("/"), SECRET_PATH_MISSING_SLASH_ERR_MSG),
+z
+.object({
+[PermissionConditionOperators.$EQ]: PermissionConditionSchema[PermissionConditionOperators.$EQ].refine(
+(val) => val.startsWith("/"),
+SECRET_PATH_MISSING_SLASH_ERR_MSG
+),
+[PermissionConditionOperators.$NEQ]: PermissionConditionSchema[PermissionConditionOperators.$NEQ].refine(
+(val) => val.startsWith("/"),
+SECRET_PATH_MISSING_SLASH_ERR_MSG
+),
+[PermissionConditionOperators.$IN]: PermissionConditionSchema[PermissionConditionOperators.$IN].refine(
+(val) => val.every((el) => el.startsWith("/")),
+SECRET_PATH_MISSING_SLASH_ERR_MSG
+),
+[PermissionConditionOperators.$GLOB]: PermissionConditionSchema[PermissionConditionOperators.$GLOB]
+})
+.partial()
+]);
 // akhilmhdh: don't modify this for v2
 // if you want to update create a new schema
 const SecretConditionV1Schema = z
@@ -177,17 +198,7 @@ const SecretConditionV1Schema = z
 })
 .partial()
 ]),
-secretPath: z.union([
-z.string(),
-z
-.object({
-[PermissionConditionOperators.$EQ]: PermissionConditionSchema[PermissionConditionOperators.$EQ],
-[PermissionConditionOperators.$NEQ]: PermissionConditionSchema[PermissionConditionOperators.$NEQ],
-[PermissionConditionOperators.$IN]: PermissionConditionSchema[PermissionConditionOperators.$IN],
-[PermissionConditionOperators.$GLOB]: PermissionConditionSchema[PermissionConditionOperators.$GLOB]
-})
-.partial()
-])
+secretPath: SECRET_PATH_PERMISSION_OPERATOR_SCHEMA
 })
 .partial();
@@ -204,17 +215,7 @@ const SecretConditionV2Schema = z
 })
 .partial()
 ]),
-secretPath: z.union([
-z.string(),
-z
-.object({
-[PermissionConditionOperators.$EQ]: PermissionConditionSchema[PermissionConditionOperators.$EQ],
-[PermissionConditionOperators.$NEQ]: PermissionConditionSchema[PermissionConditionOperators.$NEQ],
-[PermissionConditionOperators.$IN]: PermissionConditionSchema[PermissionConditionOperators.$IN],
-[PermissionConditionOperators.$GLOB]: PermissionConditionSchema[PermissionConditionOperators.$GLOB]
-})
-.partial()
-]),
+secretPath: SECRET_PATH_PERMISSION_OPERATOR_SCHEMA,
 secretName: z.union([
 z.string(),
 z

View File

@@ -828,6 +828,8 @@ export const AUDIT_LOGS = {
 projectId:
 "Optionally filter logs by project ID. If not provided, logs from the entire organization will be returned.",
 eventType: "The type of the event to export.",
+secretPath:
+"The path of the secret to query audit logs for. Note that the projectId parameter must also be provided.",
 userAgentType: "Choose which consuming application to export audit logs for.",
 eventMetadata:
 "Filter by event metadata key-value pairs. Formatted as `key1=value1,key2=value2`, with comma-separation.",

View File

@@ -201,6 +201,9 @@ const envSchema = z
 INF_APP_CONNECTION_GITHUB_APP_SLUG: zpStr(z.string().optional()),
 INF_APP_CONNECTION_GITHUB_APP_ID: zpStr(z.string().optional()),
+// gcp app
+INF_APP_CONNECTION_GCP_SERVICE_ACCOUNT_CREDENTIAL: zpStr(z.string().optional()),
 /* CORS ----------------------------------------------------------------------------- */
 CORS_ALLOWED_ORIGINS: zpStr(

View File

@@ -116,7 +116,7 @@ export const decryptAsymmetric = ({ ciphertext, nonce, publicKey, privateKey }:
 export const generateSymmetricKey = (size = 32) => crypto.randomBytes(size).toString("base64");
-export const generateHash = (value: string) => crypto.createHash("sha256").update(value).digest("hex");
+export const generateHash = (value: string | Buffer) => crypto.createHash("sha256").update(value).digest("hex");
 export const generateAsymmetricKeyPair = () => {
 const pair = nacl.box.keyPair();

View File

@@ -0,0 +1,4 @@
export enum DatabaseErrorCode {
ForeignKeyViolation = "23503",
UniqueViolation = "23505"
}

View File

@@ -0,0 +1 @@
export * from "./database";

View File

@@ -467,7 +467,8 @@ export const registerRoutes = async (
 projectBotDAL,
 projectKeyDAL,
 permissionService,
-licenseService
+licenseService,
+oidcConfigDAL
 });
 const groupProjectService = groupProjectServiceFactory({
 groupDAL,
@@ -1337,7 +1338,14 @@ export const registerRoutes = async (
 smtpService,
 orgBotDAL,
 permissionService,
-oidcConfigDAL
+oidcConfigDAL,
+projectBotDAL,
+projectKeyDAL,
+projectDAL,
+userGroupMembershipDAL,
+groupProjectDAL,
+groupDAL,
+auditLogService
 });
 const userEngagementService = userEngagementServiceFactory({

View File

@@ -4,18 +4,21 @@ import { EventType } from "@app/ee/services/audit-log/audit-log-types";
 import { readLimit } from "@app/server/config/rateLimiter";
 import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
 import { AwsConnectionListItemSchema, SanitizedAwsConnectionSchema } from "@app/services/app-connection/aws";
+import { GcpConnectionListItemSchema, SanitizedGcpConnectionSchema } from "@app/services/app-connection/gcp";
 import { GitHubConnectionListItemSchema, SanitizedGitHubConnectionSchema } from "@app/services/app-connection/github";
 import { AuthMode } from "@app/services/auth/auth-type";
 // can't use discriminated due to multiple schemas for certain apps
 const SanitizedAppConnectionSchema = z.union([
 ...SanitizedAwsConnectionSchema.options,
-...SanitizedGitHubConnectionSchema.options
+...SanitizedGitHubConnectionSchema.options,
+...SanitizedGcpConnectionSchema.options
 ]);
 const AppConnectionOptionsSchema = z.discriminatedUnion("app", [
 AwsConnectionListItemSchema,
-GitHubConnectionListItemSchema
+GitHubConnectionListItemSchema,
+GcpConnectionListItemSchema
 ]);
 export const registerAppConnectionRouter = async (server: FastifyZodProvider) => {

View File

@@ -0,0 +1,48 @@
import z from "zod";
import { readLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
CreateGcpConnectionSchema,
SanitizedGcpConnectionSchema,
UpdateGcpConnectionSchema
} from "@app/services/app-connection/gcp";
import { AuthMode } from "@app/services/auth/auth-type";
import { registerAppConnectionEndpoints } from "./app-connection-endpoints";
export const registerGcpConnectionRouter = async (server: FastifyZodProvider) => {
registerAppConnectionEndpoints({
app: AppConnection.GCP,
server,
sanitizedResponseSchema: SanitizedGcpConnectionSchema,
createSchema: CreateGcpConnectionSchema,
updateSchema: UpdateGcpConnectionSchema
});
// The below endpoints are not exposed and for Infisical App use
server.route({
method: "GET",
url: `/:connectionId/secret-manager-projects`,
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
connectionId: z.string().uuid()
}),
response: {
200: z.object({ id: z.string(), name: z.string() }).array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const { connectionId } = req.params;
const projects = await server.services.appConnection.gcp.listSecretManagerProjects(connectionId, req.permission);
return projects;
}
});
};

View File

@@ -1,6 +1,7 @@
 import { AppConnection } from "@app/services/app-connection/app-connection-enums";
 import { registerAwsConnectionRouter } from "./aws-connection-router";
+import { registerGcpConnectionRouter } from "./gcp-connection-router";
 import { registerGitHubConnectionRouter } from "./github-connection-router";
 export * from "./app-connection-router";
@@ -8,5 +9,6 @@ export * from "./app-connection-router";
 export const APP_CONNECTION_REGISTER_ROUTER_MAP: Record<AppConnection, (server: FastifyZodProvider) => Promise<void>> =
 {
 [AppConnection.AWS]: registerAwsConnectionRouter,
-[AppConnection.GitHub]: registerGitHubConnectionRouter
+[AppConnection.GitHub]: registerGitHubConnectionRouter,
+[AppConnection.GCP]: registerGcpConnectionRouter
 };

View File

@@ -79,7 +79,8 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
 params: z.object({
 identityId: z.string().trim().describe(AWS_AUTH.ATTACH.identityId)
 }),
-body: z.object({
+body: z
+.object({
 stsEndpoint: z
 .string()
 .trim()
@@ -99,24 +100,23 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
 accessTokenTTL: z
 .number()
 .int()
-.min(1)
+.min(0)
 .max(315360000)
-.refine((value) => value !== 0, {
-message: "accessTokenTTL must have a non zero number"
-})
 .default(2592000)
 .describe(AWS_AUTH.ATTACH.accessTokenTTL),
 accessTokenMaxTTL: z
 .number()
 .int()
-.min(1)
 .max(315360000)
-.refine((value) => value !== 0, {
-message: "accessTokenMaxTTL must have a non zero number"
-})
 .default(2592000)
 .describe(AWS_AUTH.ATTACH.accessTokenMaxTTL),
 accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(AWS_AUTH.ATTACH.accessTokenNumUsesLimit)
-}),
+})
+.refine(
+(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
+"Access Token TTL cannot be greater than Access Token Max TTL."
+),
 response: {
 200: z.object({
 identityAwsAuth: IdentityAwsAuthsSchema
@@ -172,7 +172,8 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
 params: z.object({
 identityId: z.string().describe(AWS_AUTH.UPDATE.identityId)
 }),
-body: z.object({
+body: z
+.object({
 stsEndpoint: z.string().trim().min(1).optional().describe(AWS_AUTH.UPDATE.stsEndpoint),
 allowedPrincipalArns: validatePrincipalArns.describe(AWS_AUTH.UPDATE.allowedPrincipalArns),
 allowedAccountIds: validateAccountIds.describe(AWS_AUTH.UPDATE.allowedAccountIds),
@@ -190,12 +191,14 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
 .number()
 .int()
 .max(315360000)
-.refine((value) => value !== 0, {
-message: "accessTokenMaxTTL must have a non zero number"
-})
+.min(0)
 .optional()
 .describe(AWS_AUTH.UPDATE.accessTokenMaxTTL)
-}),
+})
+.refine(
+(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
+"Access Token TTL cannot be greater than Access Token Max TTL."
+),
 response: {
 200: z.object({
 identityAwsAuth: IdentityAwsAuthsSchema

View File

@@ -76,7 +76,8 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
 params: z.object({
 identityId: z.string().trim().describe(AZURE_AUTH.LOGIN.identityId)
 }),
-body: z.object({
+body: z
+.object({
 tenantId: z.string().trim().describe(AZURE_AUTH.ATTACH.tenantId),
 resource: z.string().trim().describe(AZURE_AUTH.ATTACH.resource),
 allowedServicePrincipalIds: validateAzureAuthField.describe(AZURE_AUTH.ATTACH.allowedServicePrincipalIds),
@@ -91,24 +92,28 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
 accessTokenTTL: z
 .number()
 .int()
-.min(1)
+.min(0)
 .max(315360000)
-.refine((value) => value !== 0, {
-message: "accessTokenTTL must have a non zero number"
-})
 .default(2592000)
 .describe(AZURE_AUTH.ATTACH.accessTokenTTL),
 accessTokenMaxTTL: z
 .number()
 .int()
+.min(0)
 .max(315360000)
-.refine((value) => value !== 0, {
-message: "accessTokenMaxTTL must have a non zero number"
-})
 .default(2592000)
 .describe(AZURE_AUTH.ATTACH.accessTokenMaxTTL),
-accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(AZURE_AUTH.ATTACH.accessTokenNumUsesLimit)
-}),
+accessTokenNumUsesLimit: z
+.number()
+.int()
+.min(0)
+.default(0)
+.describe(AZURE_AUTH.ATTACH.accessTokenNumUsesLimit)
+})
+.refine(
+(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
+"Access Token TTL cannot be greater than Access Token Max TTL."
+),
 response: {
 200: z.object({
 identityAzureAuth: IdentityAzureAuthsSchema
@@ -163,7 +168,8 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
 params: z.object({
 identityId: z.string().trim().describe(AZURE_AUTH.UPDATE.identityId)
 }),
-body: z.object({
+body: z
+.object({
 tenantId: z.string().trim().optional().describe(AZURE_AUTH.UPDATE.tenantId),
 resource: z.string().trim().optional().describe(AZURE_AUTH.UPDATE.resource),
 allowedServicePrincipalIds: validateAzureAuthField
@@ -178,17 +184,24 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
 .optional()
 .describe(AZURE_AUTH.UPDATE.accessTokenTrustedIps),
 accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(AZURE_AUTH.UPDATE.accessTokenTTL),
-accessTokenNumUsesLimit: z.number().int().min(0).optional().describe(AZURE_AUTH.UPDATE.accessTokenNumUsesLimit),
+accessTokenNumUsesLimit: z
+.number()
+.int()
+.min(0)
+.optional()
+.describe(AZURE_AUTH.UPDATE.accessTokenNumUsesLimit),
 accessTokenMaxTTL: z
 .number()
 .int()
 .max(315360000)
-.refine((value) => value !== 0, {
-message: "accessTokenMaxTTL must have a non zero number"
-})
+.min(0)
 .optional()
 .describe(AZURE_AUTH.UPDATE.accessTokenMaxTTL)
-}),
+})
+.refine(
+(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
+"Access Token TTL cannot be greater than Access Token Max TTL."
+),
 response: {
 200: z.object({
 identityAzureAuth: IdentityAzureAuthsSchema


@ -74,7 +74,8 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
params: z.object({ params: z.object({
identityId: z.string().trim().describe(GCP_AUTH.ATTACH.identityId) identityId: z.string().trim().describe(GCP_AUTH.ATTACH.identityId)
}), }),
body: z.object({ body: z
.object({
type: z.enum(["iam", "gce"]), type: z.enum(["iam", "gce"]),
allowedServiceAccounts: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedServiceAccounts), allowedServiceAccounts: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedServiceAccounts),
allowedProjects: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedProjects), allowedProjects: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedProjects),
@ -90,24 +91,23 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
accessTokenTTL: z accessTokenTTL: z
.number() .number()
.int() .int()
.min(1) .min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(GCP_AUTH.ATTACH.accessTokenTTL), .describe(GCP_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(GCP_AUTH.ATTACH.accessTokenMaxTTL), .describe(GCP_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(GCP_AUTH.ATTACH.accessTokenNumUsesLimit) accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(GCP_AUTH.ATTACH.accessTokenNumUsesLimit)
}), })
.refine(
(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityGcpAuth: IdentityGcpAuthsSchema identityGcpAuth: IdentityGcpAuthsSchema
@ -164,7 +164,8 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
params: z.object({ params: z.object({
identityId: z.string().trim().describe(GCP_AUTH.UPDATE.identityId) identityId: z.string().trim().describe(GCP_AUTH.UPDATE.identityId)
}), }),
body: z.object({ body: z
.object({
type: z.enum(["iam", "gce"]).optional(), type: z.enum(["iam", "gce"]).optional(),
allowedServiceAccounts: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedServiceAccounts), allowedServiceAccounts: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedServiceAccounts),
allowedProjects: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedProjects), allowedProjects: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedProjects),
@ -182,13 +183,15 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional() .optional()
.describe(GCP_AUTH.UPDATE.accessTokenMaxTTL) .describe(GCP_AUTH.UPDATE.accessTokenMaxTTL)
}), })
.refine(
(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityGcpAuth: IdentityGcpAuthsSchema identityGcpAuth: IdentityGcpAuthsSchema


@ -34,23 +34,12 @@ const CreateBaseSchema = z.object({
.min(1) .min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]) .default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(JWT_AUTH.ATTACH.accessTokenTrustedIps), .describe(JWT_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z accessTokenTTL: z.number().int().min(0).max(315360000).default(2592000).describe(JWT_AUTH.ATTACH.accessTokenTTL),
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000)
.describe(JWT_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(JWT_AUTH.ATTACH.accessTokenMaxTTL), .describe(JWT_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(JWT_AUTH.ATTACH.accessTokenNumUsesLimit) accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(JWT_AUTH.ATTACH.accessTokenNumUsesLimit)
@ -70,23 +59,12 @@ const UpdateBaseSchema = z
.min(1) .min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]) .default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(JWT_AUTH.UPDATE.accessTokenTrustedIps), .describe(JWT_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z accessTokenTTL: z.number().int().min(0).max(315360000).default(2592000).describe(JWT_AUTH.UPDATE.accessTokenTTL),
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000)
.describe(JWT_AUTH.UPDATE.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(JWT_AUTH.UPDATE.accessTokenMaxTTL), .describe(JWT_AUTH.UPDATE.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(JWT_AUTH.UPDATE.accessTokenNumUsesLimit) accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(JWT_AUTH.UPDATE.accessTokenNumUsesLimit)


@ -87,7 +87,8 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
params: z.object({ params: z.object({
identityId: z.string().trim().describe(KUBERNETES_AUTH.ATTACH.identityId) identityId: z.string().trim().describe(KUBERNETES_AUTH.ATTACH.identityId)
}), }),
body: z.object({ body: z
.object({
kubernetesHost: z.string().trim().min(1).describe(KUBERNETES_AUTH.ATTACH.kubernetesHost), kubernetesHost: z.string().trim().min(1).describe(KUBERNETES_AUTH.ATTACH.kubernetesHost),
caCert: z.string().trim().default("").describe(KUBERNETES_AUTH.ATTACH.caCert), caCert: z.string().trim().default("").describe(KUBERNETES_AUTH.ATTACH.caCert),
tokenReviewerJwt: z.string().trim().min(1).describe(KUBERNETES_AUTH.ATTACH.tokenReviewerJwt), tokenReviewerJwt: z.string().trim().min(1).describe(KUBERNETES_AUTH.ATTACH.tokenReviewerJwt),
@ -105,20 +106,15 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
accessTokenTTL: z accessTokenTTL: z
.number() .number()
.int() .int()
.min(1) .min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(KUBERNETES_AUTH.ATTACH.accessTokenTTL), .describe(KUBERNETES_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(KUBERNETES_AUTH.ATTACH.accessTokenMaxTTL), .describe(KUBERNETES_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z accessTokenNumUsesLimit: z
@ -127,7 +123,11 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
.min(0) .min(0)
.default(0) .default(0)
.describe(KUBERNETES_AUTH.ATTACH.accessTokenNumUsesLimit) .describe(KUBERNETES_AUTH.ATTACH.accessTokenNumUsesLimit)
}), })
.refine(
(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityKubernetesAuth: IdentityKubernetesAuthResponseSchema identityKubernetesAuth: IdentityKubernetesAuthResponseSchema
@ -183,7 +183,8 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
params: z.object({ params: z.object({
identityId: z.string().describe(KUBERNETES_AUTH.UPDATE.identityId) identityId: z.string().describe(KUBERNETES_AUTH.UPDATE.identityId)
}), }),
body: z.object({ body: z
.object({
kubernetesHost: z.string().trim().min(1).optional().describe(KUBERNETES_AUTH.UPDATE.kubernetesHost), kubernetesHost: z.string().trim().min(1).optional().describe(KUBERNETES_AUTH.UPDATE.kubernetesHost),
caCert: z.string().trim().optional().describe(KUBERNETES_AUTH.UPDATE.caCert), caCert: z.string().trim().optional().describe(KUBERNETES_AUTH.UPDATE.caCert),
tokenReviewerJwt: z.string().trim().min(1).optional().describe(KUBERNETES_AUTH.UPDATE.tokenReviewerJwt), tokenReviewerJwt: z.string().trim().min(1).optional().describe(KUBERNETES_AUTH.UPDATE.tokenReviewerJwt),
@ -214,13 +215,15 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional() .optional()
.describe(KUBERNETES_AUTH.UPDATE.accessTokenMaxTTL) .describe(KUBERNETES_AUTH.UPDATE.accessTokenMaxTTL)
}), })
.refine(
(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityKubernetesAuth: IdentityKubernetesAuthResponseSchema identityKubernetesAuth: IdentityKubernetesAuthResponseSchema


@ -87,7 +87,8 @@ export const registerIdentityOidcAuthRouter = async (server: FastifyZodProvider)
params: z.object({ params: z.object({
identityId: z.string().trim().describe(OIDC_AUTH.ATTACH.identityId) identityId: z.string().trim().describe(OIDC_AUTH.ATTACH.identityId)
}), }),
body: z.object({ body: z
.object({
oidcDiscoveryUrl: z.string().url().min(1).describe(OIDC_AUTH.ATTACH.oidcDiscoveryUrl), oidcDiscoveryUrl: z.string().url().min(1).describe(OIDC_AUTH.ATTACH.oidcDiscoveryUrl),
caCert: z.string().trim().default("").describe(OIDC_AUTH.ATTACH.caCert), caCert: z.string().trim().default("").describe(OIDC_AUTH.ATTACH.caCert),
boundIssuer: z.string().min(1).describe(OIDC_AUTH.ATTACH.boundIssuer), boundIssuer: z.string().min(1).describe(OIDC_AUTH.ATTACH.boundIssuer),
@ -105,24 +106,23 @@ export const registerIdentityOidcAuthRouter = async (server: FastifyZodProvider)
accessTokenTTL: z accessTokenTTL: z
.number() .number()
.int() .int()
.min(1) .min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(OIDC_AUTH.ATTACH.accessTokenTTL), .describe(OIDC_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(OIDC_AUTH.ATTACH.accessTokenMaxTTL), .describe(OIDC_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(OIDC_AUTH.ATTACH.accessTokenNumUsesLimit) accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(OIDC_AUTH.ATTACH.accessTokenNumUsesLimit)
}), })
.refine(
(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityOidcAuth: IdentityOidcAuthResponseSchema identityOidcAuth: IdentityOidcAuthResponseSchema
@ -202,26 +202,24 @@ export const registerIdentityOidcAuthRouter = async (server: FastifyZodProvider)
accessTokenTTL: z accessTokenTTL: z
.number() .number()
.int() .int()
.min(1) .min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(OIDC_AUTH.UPDATE.accessTokenTTL), .describe(OIDC_AUTH.UPDATE.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(OIDC_AUTH.UPDATE.accessTokenMaxTTL), .describe(OIDC_AUTH.UPDATE.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(OIDC_AUTH.UPDATE.accessTokenNumUsesLimit) accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(OIDC_AUTH.UPDATE.accessTokenNumUsesLimit)
}) })
.partial(), .partial()
.refine(
(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityOidcAuth: IdentityOidcAuthResponseSchema identityOidcAuth: IdentityOidcAuthResponseSchema


@ -26,7 +26,8 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
params: z.object({ params: z.object({
identityId: z.string().trim().describe(TOKEN_AUTH.ATTACH.identityId) identityId: z.string().trim().describe(TOKEN_AUTH.ATTACH.identityId)
}), }),
body: z.object({ body: z
.object({
accessTokenTrustedIps: z accessTokenTrustedIps: z
.object({ .object({
ipAddress: z.string().trim() ipAddress: z.string().trim()
@ -38,24 +39,28 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
accessTokenTTL: z accessTokenTTL: z
.number() .number()
.int() .int()
.min(1) .min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(TOKEN_AUTH.ATTACH.accessTokenTTL), .describe(TOKEN_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(TOKEN_AUTH.ATTACH.accessTokenMaxTTL), .describe(TOKEN_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(TOKEN_AUTH.ATTACH.accessTokenNumUsesLimit) accessTokenNumUsesLimit: z
}), .number()
.int()
.min(0)
.default(0)
.describe(TOKEN_AUTH.ATTACH.accessTokenNumUsesLimit)
})
.refine(
(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityTokenAuth: IdentityTokenAuthsSchema identityTokenAuth: IdentityTokenAuthsSchema
@ -110,7 +115,8 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
params: z.object({ params: z.object({
identityId: z.string().trim().describe(TOKEN_AUTH.UPDATE.identityId) identityId: z.string().trim().describe(TOKEN_AUTH.UPDATE.identityId)
}), }),
body: z.object({ body: z
.object({
accessTokenTrustedIps: z accessTokenTrustedIps: z
.object({ .object({
ipAddress: z.string().trim() ipAddress: z.string().trim()
@ -120,17 +126,24 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
.optional() .optional()
.describe(TOKEN_AUTH.UPDATE.accessTokenTrustedIps), .describe(TOKEN_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(TOKEN_AUTH.UPDATE.accessTokenTTL), accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(TOKEN_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z.number().int().min(0).optional().describe(TOKEN_AUTH.UPDATE.accessTokenNumUsesLimit), accessTokenNumUsesLimit: z
.number()
.int()
.min(0)
.optional()
.describe(TOKEN_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional() .optional()
.describe(TOKEN_AUTH.UPDATE.accessTokenMaxTTL) .describe(TOKEN_AUTH.UPDATE.accessTokenMaxTTL)
}), })
.refine(
(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityTokenAuth: IdentityTokenAuthsSchema identityTokenAuth: IdentityTokenAuthsSchema


@ -86,7 +86,8 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
params: z.object({ params: z.object({
identityId: z.string().trim().describe(UNIVERSAL_AUTH.ATTACH.identityId) identityId: z.string().trim().describe(UNIVERSAL_AUTH.ATTACH.identityId)
}), }),
body: z.object({ body: z
.object({
clientSecretTrustedIps: z clientSecretTrustedIps: z
.object({ .object({
ipAddress: z.string().trim() ipAddress: z.string().trim()
@ -106,20 +107,15 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
accessTokenTTL: z accessTokenTTL: z
.number() .number()
.int() .int()
.min(1) .min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(UNIVERSAL_AUTH.ATTACH.accessTokenTTL), // 30 days .describe(UNIVERSAL_AUTH.ATTACH.accessTokenTTL), // 30 days
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000) .default(2592000)
.describe(UNIVERSAL_AUTH.ATTACH.accessTokenMaxTTL), // 30 days .describe(UNIVERSAL_AUTH.ATTACH.accessTokenMaxTTL), // 30 days
accessTokenNumUsesLimit: z accessTokenNumUsesLimit: z
@ -128,7 +124,11 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
.min(0) .min(0)
.default(0) .default(0)
.describe(UNIVERSAL_AUTH.ATTACH.accessTokenNumUsesLimit) .describe(UNIVERSAL_AUTH.ATTACH.accessTokenNumUsesLimit)
}), })
.refine(
(val) => val.accessTokenTTL <= val.accessTokenMaxTTL,
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityUniversalAuth: IdentityUniversalAuthsSchema identityUniversalAuth: IdentityUniversalAuthsSchema
@ -181,7 +181,8 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
params: z.object({ params: z.object({
identityId: z.string().describe(UNIVERSAL_AUTH.UPDATE.identityId) identityId: z.string().describe(UNIVERSAL_AUTH.UPDATE.identityId)
}), }),
body: z.object({ body: z
.object({
clientSecretTrustedIps: z clientSecretTrustedIps: z
.object({ .object({
ipAddress: z.string().trim() ipAddress: z.string().trim()
@ -214,13 +215,15 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
accessTokenMaxTTL: z accessTokenMaxTTL: z
.number() .number()
.int() .int()
.min(0)
.max(315360000) .max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional() .optional()
.describe(UNIVERSAL_AUTH.UPDATE.accessTokenMaxTTL) .describe(UNIVERSAL_AUTH.UPDATE.accessTokenMaxTTL)
}), })
.refine(
(val) => (val.accessTokenMaxTTL && val.accessTokenTTL ? val.accessTokenTTL <= val.accessTokenMaxTTL : true),
"Access Token TTL cannot be greater than Access Token Max TTL."
),
response: { response: {
200: z.object({ 200: z.object({
identityUniversalAuth: IdentityUniversalAuthsSchema identityUniversalAuth: IdentityUniversalAuthsSchema


@ -1151,6 +1151,50 @@ export const registerIntegrationAuthRouter = async (server: FastifyZodProvider)
} }
}); });
server.route({
method: "GET",
url: "/:integrationAuthId/vercel/custom-environments",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
querystring: z.object({
teamId: z.string().trim()
}),
params: z.object({
integrationAuthId: z.string().trim()
}),
response: {
200: z.object({
environments: z
.object({
appId: z.string(),
customEnvironments: z
.object({
id: z.string(),
slug: z.string()
})
.array()
})
.array()
})
}
},
handler: async (req) => {
const environments = await server.services.integrationAuth.getVercelCustomEnvironments({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.integrationAuthId,
teamId: req.query.teamId
});
return { environments };
}
});
server.route({ server.route({
method: "GET", method: "GET",
url: "/:integrationAuthId/octopus-deploy/spaces", url: "/:integrationAuthId/octopus-deploy/spaces",

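Illustrative sketch (not part of the diff): a client call against the new Vercel custom-environments route added above. Only the path segment, the teamId query parameter, and the response shape are taken from the hunk; the `/api/v1/integration-auth` prefix and bearer-token auth are assumptions.

type TVercelCustomEnvironments = {
  environments: {
    appId: string;
    customEnvironments: { id: string; slug: string }[];
  }[];
};

// Hypothetical client helper built on the route definition above.
async function listVercelCustomEnvironments(
  baseUrl: string,
  accessToken: string,
  integrationAuthId: string,
  teamId: string
): Promise<TVercelCustomEnvironments> {
  const url = `${baseUrl}/api/v1/integration-auth/${integrationAuthId}/vercel/custom-environments?teamId=${encodeURIComponent(teamId)}`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
  if (!res.ok) throw new Error(`Failed to list Vercel custom environments: ${res.status}`);
  return (await res.json()) as TVercelCustomEnvironments;
}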

@ -11,7 +11,7 @@ import {
} from "@app/db/schemas"; } from "@app/db/schemas";
import { EventType, UserAgentType } from "@app/ee/services/audit-log/audit-log-types"; import { EventType, UserAgentType } from "@app/ee/services/audit-log/audit-log-types";
import { AUDIT_LOGS, ORGANIZATIONS } from "@app/lib/api-docs"; import { AUDIT_LOGS, ORGANIZATIONS } from "@app/lib/api-docs";
import { getLastMidnightDateISO } from "@app/lib/fn"; import { getLastMidnightDateISO, removeTrailingSlash } from "@app/lib/fn";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter"; import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { slugSchema } from "@app/server/lib/schemas"; import { slugSchema } from "@app/server/lib/schemas";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth"; import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
@ -113,6 +113,12 @@ export const registerOrgRouter = async (server: FastifyZodProvider) => {
querystring: z.object({ querystring: z.object({
projectId: z.string().optional().describe(AUDIT_LOGS.EXPORT.projectId), projectId: z.string().optional().describe(AUDIT_LOGS.EXPORT.projectId),
actorType: z.nativeEnum(ActorType).optional(), actorType: z.nativeEnum(ActorType).optional(),
secretPath: z
.string()
.optional()
.transform((val) => (!val ? val : removeTrailingSlash(val)))
.describe(AUDIT_LOGS.EXPORT.secretPath),
// eventType is split with , for multiple values, we need to transform it to array // eventType is split with , for multiple values, we need to transform it to array
eventType: z eventType: z
.string() .string()

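Illustrative sketch (not part of the diff): what the transform on the new `secretPath` audit-log filter does, assuming `removeTrailingSlash` from `@app/lib/fn` strips a trailing `/` as its name suggests.

import { z } from "zod";
import { removeTrailingSlash } from "@app/lib/fn";

// Optional filter: left untouched when absent, normalized when present so
// "/app/backend/" and "/app/backend" refer to the same secret path.
const secretPathFilter = z
  .string()
  .optional()
  .transform((val) => (!val ? val : removeTrailingSlash(val)));

secretPathFilter.parse(undefined); // -> undefined
secretPathFilter.parse("/app/backend/"); // -> "/app/backend" (assumed behavior of removeTrailingSlash)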

@ -203,7 +203,8 @@ export const registerPasswordRouter = async (server: FastifyZodProvider) => {
encryptedPrivateKeyIV: z.string().trim(), encryptedPrivateKeyIV: z.string().trim(),
encryptedPrivateKeyTag: z.string().trim(), encryptedPrivateKeyTag: z.string().trim(),
salt: z.string().trim(), salt: z.string().trim(),
verifier: z.string().trim() verifier: z.string().trim(),
password: z.string().trim()
}), }),
response: { response: {
200: z.object({ 200: z.object({
@ -218,7 +219,69 @@ export const registerPasswordRouter = async (server: FastifyZodProvider) => {
userId: token.userId userId: token.userId
}); });
return { message: "Successfully updated backup private key" }; return { message: "Successfully reset password" };
}
});
server.route({
method: "POST",
url: "/email/password-setup",
config: {
rateLimit: authRateLimit
},
schema: {
response: {
200: z.object({
message: z.string()
})
}
},
handler: async (req) => {
await server.services.password.sendPasswordSetupEmail(req.permission);
return {
message: "A password setup link has been sent"
};
}
});
server.route({
method: "POST",
url: "/password-setup",
config: {
rateLimit: authRateLimit
},
schema: {
body: z.object({
protectedKey: z.string().trim(),
protectedKeyIV: z.string().trim(),
protectedKeyTag: z.string().trim(),
encryptedPrivateKey: z.string().trim(),
encryptedPrivateKeyIV: z.string().trim(),
encryptedPrivateKeyTag: z.string().trim(),
salt: z.string().trim(),
verifier: z.string().trim(),
password: z.string().trim(),
token: z.string().trim()
}),
response: {
200: z.object({
message: z.string()
})
}
},
handler: async (req, res) => {
await server.services.password.setupPassword(req.body, req.permission);
const appCfg = getConfig();
void res.cookie("jid", "", {
httpOnly: true,
path: "/",
sameSite: "strict",
secure: appCfg.HTTPS_ENABLED
});
return { message: "Successfully setup password" };
} }
}); });
}; };

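Illustrative sketch (not part of the diff): the two routes added above form a request/confirm flow, where `POST /email/password-setup` emails the user a `TOKEN_EMAIL_PASSWORD_SETUP` code and `POST /password-setup` consumes it together with the SRP/encryption fields. The `/api/v1/password` prefix and the client-side key derivation helper below are assumptions, not shown in this diff.

// Stand-in for whatever client-side crypto produces the fields required by the
// body schema above; this helper is hypothetical and not part of the diff.
declare function deriveSrpAndKeyMaterial(password: string): Promise<{
  protectedKey: string;
  protectedKeyIV: string;
  protectedKeyTag: string;
  encryptedPrivateKey: string;
  encryptedPrivateKeyIV: string;
  encryptedPrivateKeyTag: string;
  salt: string;
  verifier: string;
}>;

async function setupPassword(baseUrl: string, jwt: string, newPassword: string, emailedToken: string) {
  const material = await deriveSrpAndKeyMaterial(newPassword);

  const res = await fetch(`${baseUrl}/api/v1/password/password-setup`, {
    method: "POST",
    headers: { Authorization: `Bearer ${jwt}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      ...material,
      password: newPassword,
      token: emailedToken
    })
  });
  if (!res.ok) throw new Error(`Password setup failed: ${res.status}`);
  return (await res.json()) as { message: string };
}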

@ -0,0 +1,13 @@
import { CreateGcpSyncSchema, GcpSyncSchema, UpdateGcpSyncSchema } from "@app/services/secret-sync/gcp";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { registerSyncSecretsEndpoints } from "./secret-sync-endpoints";
export const registerGcpSyncRouter = async (server: FastifyZodProvider) =>
registerSyncSecretsEndpoints({
destination: SecretSync.GCPSecretManager,
server,
responseSchema: GcpSyncSchema,
createSchema: CreateGcpSyncSchema,
updateSchema: UpdateGcpSyncSchema
});


@ -1,11 +1,13 @@
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums"; import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { registerAwsParameterStoreSyncRouter } from "./aws-parameter-store-sync-router"; import { registerAwsParameterStoreSyncRouter } from "./aws-parameter-store-sync-router";
import { registerGcpSyncRouter } from "./gcp-sync-router";
import { registerGitHubSyncRouter } from "./github-sync-router"; import { registerGitHubSyncRouter } from "./github-sync-router";
export * from "./secret-sync-router"; export * from "./secret-sync-router";
export const SECRET_SYNC_REGISTER_ROUTER_MAP: Record<SecretSync, (server: FastifyZodProvider) => Promise<void>> = { export const SECRET_SYNC_REGISTER_ROUTER_MAP: Record<SecretSync, (server: FastifyZodProvider) => Promise<void>> = {
[SecretSync.AWSParameterStore]: registerAwsParameterStoreSyncRouter, [SecretSync.AWSParameterStore]: registerAwsParameterStoreSyncRouter,
[SecretSync.GitHub]: registerGitHubSyncRouter [SecretSync.GitHub]: registerGitHubSyncRouter,
[SecretSync.GCPSecretManager]: registerGcpSyncRouter
}; };


@ -9,13 +9,19 @@ import {
AwsParameterStoreSyncListItemSchema, AwsParameterStoreSyncListItemSchema,
AwsParameterStoreSyncSchema AwsParameterStoreSyncSchema
} from "@app/services/secret-sync/aws-parameter-store"; } from "@app/services/secret-sync/aws-parameter-store";
import { GcpSyncListItemSchema, GcpSyncSchema } from "@app/services/secret-sync/gcp";
import { GitHubSyncListItemSchema, GitHubSyncSchema } from "@app/services/secret-sync/github"; import { GitHubSyncListItemSchema, GitHubSyncSchema } from "@app/services/secret-sync/github";
const SecretSyncSchema = z.discriminatedUnion("destination", [AwsParameterStoreSyncSchema, GitHubSyncSchema]); const SecretSyncSchema = z.discriminatedUnion("destination", [
AwsParameterStoreSyncSchema,
GitHubSyncSchema,
GcpSyncSchema
]);
const SecretSyncOptionsSchema = z.discriminatedUnion("destination", [ const SecretSyncOptionsSchema = z.discriminatedUnion("destination", [
AwsParameterStoreSyncListItemSchema, AwsParameterStoreSyncListItemSchema,
GitHubSyncListItemSchema GitHubSyncListItemSchema,
GcpSyncListItemSchema
]); ]);
export const registerSecretSyncRouter = async (server: FastifyZodProvider) => { export const registerSecretSyncRouter = async (server: FastifyZodProvider) => {


@ -1,6 +1,7 @@
export enum AppConnection { export enum AppConnection {
GitHub = "github", GitHub = "github",
AWS = "aws" AWS = "aws",
GCP = "gcp"
} }
export enum AWSRegion { export enum AWSRegion {


@ -1,4 +1,5 @@
import { TAppConnections } from "@app/db/schemas/app-connections"; import { TAppConnections } from "@app/db/schemas/app-connections";
import { generateHash } from "@app/lib/crypto/encryption";
import { AppConnection } from "@app/services/app-connection/app-connection-enums"; import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { TAppConnectionServiceFactoryDep } from "@app/services/app-connection/app-connection-service"; import { TAppConnectionServiceFactoryDep } from "@app/services/app-connection/app-connection-service";
import { TAppConnection, TAppConnectionConfig } from "@app/services/app-connection/app-connection-types"; import { TAppConnection, TAppConnectionConfig } from "@app/services/app-connection/app-connection-types";
@ -7,6 +8,11 @@ import {
getAwsAppConnectionListItem, getAwsAppConnectionListItem,
validateAwsConnectionCredentials validateAwsConnectionCredentials
} from "@app/services/app-connection/aws"; } from "@app/services/app-connection/aws";
import {
GcpConnectionMethod,
getGcpAppConnectionListItem,
validateGcpConnectionCredentials
} from "@app/services/app-connection/gcp";
import { import {
getGitHubConnectionListItem, getGitHubConnectionListItem,
GitHubConnectionMethod, GitHubConnectionMethod,
@ -15,7 +21,9 @@ import {
import { KmsDataKey } from "@app/services/kms/kms-types"; import { KmsDataKey } from "@app/services/kms/kms-types";
export const listAppConnectionOptions = () => { export const listAppConnectionOptions = () => {
return [getAwsAppConnectionListItem(), getGitHubConnectionListItem()].sort((a, b) => a.name.localeCompare(b.name)); return [getAwsAppConnectionListItem(), getGitHubConnectionListItem(), getGcpAppConnectionListItem()].sort((a, b) =>
a.name.localeCompare(b.name)
);
}; };
export const encryptAppConnectionCredentials = async ({ export const encryptAppConnectionCredentials = async ({
@ -69,6 +77,8 @@ export const validateAppConnectionCredentials = async (
return validateAwsConnectionCredentials(appConnection); return validateAwsConnectionCredentials(appConnection);
case AppConnection.GitHub: case AppConnection.GitHub:
return validateGitHubConnectionCredentials(appConnection); return validateGitHubConnectionCredentials(appConnection);
case AppConnection.GCP:
return validateGcpConnectionCredentials(appConnection);
default: default:
// eslint-disable-next-line @typescript-eslint/restrict-template-expressions // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
throw new Error(`Unhandled App Connection ${app}`); throw new Error(`Unhandled App Connection ${app}`);
@ -85,6 +95,8 @@ export const getAppConnectionMethodName = (method: TAppConnection["method"]) =>
return "Access Key"; return "Access Key";
case AwsConnectionMethod.AssumeRole: case AwsConnectionMethod.AssumeRole:
return "Assume Role"; return "Assume Role";
case GcpConnectionMethod.ServiceAccountImpersonation:
return "Service Account Impersonation";
default: default:
// eslint-disable-next-line @typescript-eslint/restrict-template-expressions // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
throw new Error(`Unhandled App Connection Method: ${method}`); throw new Error(`Unhandled App Connection Method: ${method}`);
@ -101,6 +113,7 @@ export const decryptAppConnection = async (
encryptedCredentials: appConnection.encryptedCredentials, encryptedCredentials: appConnection.encryptedCredentials,
orgId: appConnection.orgId, orgId: appConnection.orgId,
kmsService kmsService
}) }),
credentialsHash: generateHash(appConnection.encryptedCredentials)
} as TAppConnection; } as TAppConnection;
}; };

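Illustrative sketch (not part of the diff): `decryptAppConnection` above now also returns `credentialsHash`, a hash of the encrypted credentials, which a later hunk exposes as an optional string on `BaseAppConnectionSchema`. One plausible use, offered only as an assumption, is detecting credential rotation without ever handling the plaintext.

// Hypothetical consumer: compare the hash from a previous fetch with the latest
// one to notice that an App Connection's credentials were rotated.
type TConnectionWithHash = { id: string; credentialsHash?: string };

function credentialsChanged(previous: TConnectionWithHash, current: TConnectionWithHash): boolean {
  return Boolean(
    previous.credentialsHash && current.credentialsHash && previous.credentialsHash !== current.credentialsHash
  );
}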

@ -2,5 +2,6 @@ import { AppConnection } from "./app-connection-enums";
export const APP_CONNECTION_NAME_MAP: Record<AppConnection, string> = { export const APP_CONNECTION_NAME_MAP: Record<AppConnection, string> = {
[AppConnection.AWS]: "AWS", [AppConnection.AWS]: "AWS",
[AppConnection.GitHub]: "GitHub" [AppConnection.GitHub]: "GitHub",
[AppConnection.GCP]: "GCP"
}; };


@ -10,6 +10,8 @@ export const BaseAppConnectionSchema = AppConnectionsSchema.omit({
encryptedCredentials: true, encryptedCredentials: true,
app: true, app: true,
method: true method: true
}).extend({
credentialsHash: z.string().optional()
}); });
export const GenericCreateAppConnectionFieldsSchema = (app: AppConnection) => export const GenericCreateAppConnectionFieldsSchema = (app: AppConnection) =>


@ -2,6 +2,8 @@ import { ForbiddenError, subject } from "@casl/ability";
import { OrgPermissionAppConnectionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission"; import { OrgPermissionAppConnectionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service"; import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { generateHash } from "@app/lib/crypto/encryption";
import { DatabaseErrorCode } from "@app/lib/error-codes";
import { BadRequestError, DatabaseError, NotFoundError } from "@app/lib/errors"; import { BadRequestError, DatabaseError, NotFoundError } from "@app/lib/errors";
import { DiscriminativePick, OrgServiceActor } from "@app/lib/types"; import { DiscriminativePick, OrgServiceActor } from "@app/lib/types";
import { AppConnection } from "@app/services/app-connection/app-connection-enums"; import { AppConnection } from "@app/services/app-connection/app-connection-enums";
@ -26,6 +28,8 @@ import { githubConnectionService } from "@app/services/app-connection/github/git
import { TKmsServiceFactory } from "@app/services/kms/kms-service"; import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TAppConnectionDALFactory } from "./app-connection-dal"; import { TAppConnectionDALFactory } from "./app-connection-dal";
import { ValidateGcpConnectionCredentialsSchema } from "./gcp";
import { gcpConnectionService } from "./gcp/gcp-connection-service";
export type TAppConnectionServiceFactoryDep = { export type TAppConnectionServiceFactoryDep = {
appConnectionDAL: TAppConnectionDALFactory; appConnectionDAL: TAppConnectionDALFactory;
@ -37,7 +41,8 @@ export type TAppConnectionServiceFactory = ReturnType<typeof appConnectionServic
const VALIDATE_APP_CONNECTION_CREDENTIALS_MAP: Record<AppConnection, TValidateAppConnectionCredentials> = { const VALIDATE_APP_CONNECTION_CREDENTIALS_MAP: Record<AppConnection, TValidateAppConnectionCredentials> = {
[AppConnection.AWS]: ValidateAwsConnectionCredentialsSchema, [AppConnection.AWS]: ValidateAwsConnectionCredentialsSchema,
[AppConnection.GitHub]: ValidateGitHubConnectionCredentialsSchema [AppConnection.GitHub]: ValidateGitHubConnectionCredentialsSchema,
[AppConnection.GCP]: ValidateGcpConnectionCredentialsSchema
}; };
export const appConnectionServiceFactory = ({ export const appConnectionServiceFactory = ({
@ -140,22 +145,6 @@ export const appConnectionServiceFactory = ({
OrgPermissionSubjects.AppConnections OrgPermissionSubjects.AppConnections
); );
const appConnection = await appConnectionDAL.transaction(async (tx) => {
const isConflictingName = Boolean(
await appConnectionDAL.findOne(
{
name: params.name,
orgId: actor.orgId
},
tx
)
);
if (isConflictingName)
throw new BadRequestError({
message: `An App Connection with the name "${params.name}" already exists`
});
const validatedCredentials = await validateAppConnectionCredentials({ const validatedCredentials = await validateAppConnectionCredentials({
app, app,
credentials, credentials,
@ -169,24 +158,27 @@ export const appConnectionServiceFactory = ({
kmsService kmsService
}); });
const connection = await appConnectionDAL.create( try {
{ const connection = await appConnectionDAL.create({
orgId: actor.orgId, orgId: actor.orgId,
encryptedCredentials, encryptedCredentials,
method, method,
app, app,
...params ...params
}, });
tx
);
return { return {
...connection, ...connection,
credentialsHash: generateHash(connection.encryptedCredentials),
credentials: validatedCredentials credentials: validatedCredentials
}; } as TAppConnection;
}); } catch (err) {
if (err instanceof DatabaseError && (err.error as { code: string })?.code === DatabaseErrorCode.UniqueViolation) {
throw new BadRequestError({ message: `An App Connection with the name "${params.name}" already exists` });
}
return appConnection as TAppConnection; throw err;
}
}; };
const updateAppConnection = async ( const updateAppConnection = async (
@ -210,24 +202,6 @@ export const appConnectionServiceFactory = ({
OrgPermissionSubjects.AppConnections OrgPermissionSubjects.AppConnections
); );
const updatedAppConnection = await appConnectionDAL.transaction(async (tx) => {
if (params.name && appConnection.name !== params.name) {
const isConflictingName = Boolean(
await appConnectionDAL.findOne(
{
name: params.name,
orgId: appConnection.orgId
},
tx
)
);
if (isConflictingName)
throw new BadRequestError({
message: `An App Connection with the name "${params.name}" already exists`
});
}
let encryptedCredentials: undefined | Buffer; let encryptedCredentials: undefined | Buffer;
if (credentials) { if (credentials) {
@ -262,20 +236,21 @@ export const appConnectionServiceFactory = ({
}); });
} }
const updatedConnection = await appConnectionDAL.updateById( try {
connectionId, const updatedConnection = await appConnectionDAL.updateById(connectionId, {
{
orgId: actor.orgId, orgId: actor.orgId,
encryptedCredentials, encryptedCredentials,
...params ...params
},
tx
);
return updatedConnection;
}); });
return decryptAppConnection(updatedAppConnection, kmsService); return await decryptAppConnection(updatedConnection, kmsService);
} catch (err) {
if (err instanceof DatabaseError && (err.error as { code: string })?.code === DatabaseErrorCode.UniqueViolation) {
throw new BadRequestError({ message: `An App Connection with the name "${params.name}" already exists` });
}
throw err;
}
}; };
const deleteAppConnection = async (app: AppConnection, connectionId: string, actor: OrgServiceActor) => { const deleteAppConnection = async (app: AppConnection, connectionId: string, actor: OrgServiceActor) => {
@ -306,7 +281,10 @@ export const appConnectionServiceFactory = ({
return await decryptAppConnection(deletedAppConnection, kmsService); return await decryptAppConnection(deletedAppConnection, kmsService);
} catch (err) { } catch (err) {
if (err instanceof DatabaseError && (err.error as { code: string })?.code === "23503") { if (
err instanceof DatabaseError &&
(err.error as { code: string })?.code === DatabaseErrorCode.ForeignKeyViolation
) {
throw new BadRequestError({ throw new BadRequestError({
message: message:
"Cannot delete App Connection with existing connections. Remove all existing connections and try again." "Cannot delete App Connection with existing connections. Remove all existing connections and try again."
@ -382,6 +360,7 @@ export const appConnectionServiceFactory = ({
deleteAppConnection, deleteAppConnection,
connectAppConnectionById, connectAppConnectionById,
listAvailableAppConnectionsForUser, listAvailableAppConnectionsForUser,
github: githubConnectionService(connectAppConnectionById) github: githubConnectionService(connectAppConnectionById),
gcp: gcpConnectionService(connectAppConnectionById)
}; };
}; };

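Illustrative sketch (not part of the diff): the service changes above drop the explicit name-conflict lookup inside a transaction and instead rely on the database unique constraint, translating the driver error into a friendly message. The enum values below are assumed to mirror the standard Postgres SQLSTATE codes; "23503" for foreign-key violations is grounded in the delete handler above, "23505" for unique violations is an assumption.

enum DatabaseErrorCode {
  UniqueViolation = "23505",
  ForeignKeyViolation = "23503"
}

class BadRequestError extends Error {}

// Condensed version of the try/catch-and-translate pattern used in createAppConnection
// and updateAppConnection above; `err.error.code` mirrors how DatabaseError wraps the
// underlying driver error in those hunks.
async function createWithFriendlyConflictError<T>(name: string, insert: () => Promise<T>): Promise<T> {
  try {
    return await insert();
  } catch (err) {
    const code = (err as { error?: { code?: string } })?.error?.code;
    if (code === DatabaseErrorCode.UniqueViolation) {
      throw new BadRequestError(`An App Connection with the name "${name}" already exists`);
    }
    throw err;
  }
}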

@ -11,9 +11,11 @@ import {
TValidateGitHubConnectionCredentials TValidateGitHubConnectionCredentials
} from "@app/services/app-connection/github"; } from "@app/services/app-connection/github";
export type TAppConnection = { id: string } & (TAwsConnection | TGitHubConnection); import { TGcpConnection, TGcpConnectionConfig, TGcpConnectionInput, TValidateGcpConnectionCredentials } from "./gcp";
export type TAppConnectionInput = { id: string } & (TAwsConnectionInput | TGitHubConnectionInput); export type TAppConnection = { id: string } & (TAwsConnection | TGitHubConnection | TGcpConnection);
export type TAppConnectionInput = { id: string } & (TAwsConnectionInput | TGitHubConnectionInput | TGcpConnectionInput);
export type TCreateAppConnectionDTO = Pick< export type TCreateAppConnectionDTO = Pick<
TAppConnectionInput, TAppConnectionInput,
@ -24,8 +26,9 @@ export type TUpdateAppConnectionDTO = Partial<Omit<TCreateAppConnectionDTO, "met
connectionId: string; connectionId: string;
}; };
export type TAppConnectionConfig = TAwsConnectionConfig | TGitHubConnectionConfig; export type TAppConnectionConfig = TAwsConnectionConfig | TGitHubConnectionConfig | TGcpConnectionConfig;
export type TValidateAppConnectionCredentials = export type TValidateAppConnectionCredentials =
| TValidateAwsConnectionCredentials | TValidateAwsConnectionCredentials
| TValidateGitHubConnectionCredentials; | TValidateGitHubConnectionCredentials
| TValidateGcpConnectionCredentials;


@ -81,11 +81,14 @@ export const getAwsConnectionConfig = async (appConnection: TAwsConnectionConfig
}; };
export const validateAwsConnectionCredentials = async (appConnection: TAwsConnectionConfig) => { export const validateAwsConnectionCredentials = async (appConnection: TAwsConnectionConfig) => {
const awsConfig = await getAwsConnectionConfig(appConnection); let resp: AWS.STS.GetCallerIdentityResponse & {
const sts = new AWS.STS(awsConfig); $response: AWS.Response<AWS.STS.GetCallerIdentityResponse, AWS.AWSError>;
let resp: Awaited<ReturnType<ReturnType<typeof sts.getCallerIdentity>["promise"]>>; };
try { try {
const awsConfig = await getAwsConnectionConfig(appConnection);
const sts = new AWS.STS(awsConfig);
resp = await sts.getCallerIdentity().promise(); resp = await sts.getCallerIdentity().promise();
} catch (e: unknown) { } catch (e: unknown) {
throw new BadRequestError({ throw new BadRequestError({
@ -93,7 +96,7 @@ export const validateAwsConnectionCredentials = async (appConnection: TAwsConnec
}); });
} }
if (resp.$response.httpResponse.statusCode !== 200) if (resp?.$response.httpResponse.statusCode !== 200)
throw new InternalServerError({ throw new InternalServerError({
message: `Unable to validate credentials: ${ message: `Unable to validate credentials: ${
resp.$response.error?.message ?? resp.$response.error?.message ??


@ -0,0 +1,3 @@
export enum GcpConnectionMethod {
ServiceAccountImpersonation = "service-account-impersonation"
}


@ -0,0 +1,164 @@
import { gaxios, Impersonated, JWT } from "google-auth-library";
import { GetAccessTokenResponse } from "google-auth-library/build/src/auth/oauth2client";
import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { BadRequestError, InternalServerError } from "@app/lib/errors";
import { IntegrationUrls } from "@app/services/integration-auth/integration-list";
import { AppConnection } from "../app-connection-enums";
import { getAppConnectionMethodName } from "../app-connection-fns";
import { GcpConnectionMethod } from "./gcp-connection-enums";
import {
GCPApp,
GCPGetProjectsRes,
GCPGetServiceRes,
TGcpConnection,
TGcpConnectionConfig
} from "./gcp-connection-types";
export const getGcpAppConnectionListItem = () => {
return {
name: "GCP" as const,
app: AppConnection.GCP as const,
methods: Object.values(GcpConnectionMethod) as [GcpConnectionMethod.ServiceAccountImpersonation]
};
};
export const getGcpConnectionAuthToken = async (appConnection: TGcpConnectionConfig) => {
const appCfg = getConfig();
if (!appCfg.INF_APP_CONNECTION_GCP_SERVICE_ACCOUNT_CREDENTIAL) {
throw new InternalServerError({
message: `Environment variables have not been configured for GCP ${getAppConnectionMethodName(
GcpConnectionMethod.ServiceAccountImpersonation
)}`
});
}
const credJson = JSON.parse(appCfg.INF_APP_CONNECTION_GCP_SERVICE_ACCOUNT_CREDENTIAL) as {
client_email: string;
private_key: string;
};
const sourceClient = new JWT({
email: credJson.client_email,
key: credJson.private_key,
scopes: ["https://www.googleapis.com/auth/cloud-platform"]
});
const impersonatedCredentials = new Impersonated({
sourceClient,
targetPrincipal: appConnection.credentials.serviceAccountEmail,
lifetime: 3600,
delegates: [],
targetScopes: ["https://www.googleapis.com/auth/cloud-platform"]
});
let tokenResponse: GetAccessTokenResponse | undefined;
try {
tokenResponse = await impersonatedCredentials.getAccessToken();
} catch (error) {
let message = "Unable to validate connection";
if (error instanceof gaxios.GaxiosError) {
message = error.message;
}
throw new BadRequestError({
message
});
}
if (!tokenResponse || !tokenResponse.token) {
throw new BadRequestError({
message: `Unable to validate connection`
});
}
return tokenResponse.token;
};
export const getGcpSecretManagerProjects = async (appConnection: TGcpConnection) => {
const accessToken = await getGcpConnectionAuthToken(appConnection);
let gcpApps: GCPApp[] = [];
const pageSize = 100;
let pageToken: string | undefined;
let hasMorePages = true;
const projects: {
name: string;
id: string;
}[] = [];
while (hasMorePages) {
const params = new URLSearchParams({
pageSize: String(pageSize),
...(pageToken ? { pageToken } : {})
});
// eslint-disable-next-line no-await-in-loop
const { data } = await request.get<GCPGetProjectsRes>(`${IntegrationUrls.GCP_API_URL}/v1/projects`, {
params,
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
});
gcpApps = gcpApps.concat(data.projects);
if (!data.nextPageToken) {
hasMorePages = false;
}
pageToken = data.nextPageToken;
}
// eslint-disable-next-line
for await (const gcpApp of gcpApps) {
try {
const res = (
await request.get<GCPGetServiceRes>(
`${IntegrationUrls.GCP_SERVICE_USAGE_URL}/v1/projects/${gcpApp.projectId}/services/${IntegrationUrls.GCP_SECRET_MANAGER_SERVICE_NAME}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
)
).data;
if (res.state === "ENABLED") {
projects.push({
name: gcpApp.name,
id: gcpApp.projectId
});
}
} catch {
// eslint-disable-next-line
continue;
}
}
return projects;
};
export const validateGcpConnectionCredentials = async (appConnection: TGcpConnectionConfig) => {
// Check if provided service account email suffix matches organization ID.
// We do this to mitigate confused deputy attacks in multi-tenant instances
if (appConnection.credentials.serviceAccountEmail) {
const expectedAccountIdSuffix = appConnection.orgId.split("-").slice(0, 2).join("-");
const serviceAccountId = appConnection.credentials.serviceAccountEmail.split("@")[0];
if (!serviceAccountId.endsWith(expectedAccountIdSuffix)) {
throw new BadRequestError({
message: `GCP service account ID must have a suffix of "${expectedAccountIdSuffix}" e.g. service-account-${expectedAccountIdSuffix}@my-project.iam.gserviceaccount.com"`
});
}
}
await getGcpConnectionAuthToken(appConnection);
return appConnection.credentials;
};

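Illustrative sketch (not part of the diff): two operational details in the file above are easy to miss. First, the instance operator must provide a source credential in `INF_APP_CONNECTION_GCP_SERVICE_ACCOUNT_CREDENTIAL` as JSON with `client_email` and `private_key`. Second, `validateGcpConnectionCredentials` mitigates confused-deputy attacks by requiring the target service account ID to end with the first two segments of the organization ID. All concrete values below are placeholders.

// Placeholder showing the JSON shape the code parses from the env var.
const exampleSourceCredential = JSON.stringify({
  client_email: "source-sa@my-instance-project.iam.gserviceaccount.com",
  private_key: "-----BEGIN PRIVATE KEY-----\nplaceholder\n-----END PRIVATE KEY-----\n"
});

// Worked example of the suffix rule: for a placeholder orgId
// "9df9f3a3-b4ac-4b86-8c10-1d1a7e8f2a4b", the expected suffix is "9df9f3a3-b4ac",
// so the target service account must look like
// "service-account-9df9f3a3-b4ac@customer-project.iam.gserviceaccount.com".
const orgId = "9df9f3a3-b4ac-4b86-8c10-1d1a7e8f2a4b";
const expectedAccountIdSuffix = orgId.split("-").slice(0, 2).join("-"); // "9df9f3a3-b4ac"
const serviceAccountEmail = "service-account-9df9f3a3-b4ac@customer-project.iam.gserviceaccount.com";
const serviceAccountId = serviceAccountEmail.split("@")[0];
const passesSuffixCheck = serviceAccountId.endsWith(expectedAccountIdSuffix); // true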

@ -0,0 +1,65 @@
import z from "zod";
import { AppConnections } from "@app/lib/api-docs";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
BaseAppConnectionSchema,
GenericCreateAppConnectionFieldsSchema,
GenericUpdateAppConnectionFieldsSchema
} from "@app/services/app-connection/app-connection-schemas";
import { GcpConnectionMethod } from "./gcp-connection-enums";
export const GcpConnectionServiceAccountImpersonationCredentialsSchema = z.object({
serviceAccountEmail: z.string().email().trim().min(1, "Service account email required")
});
const BaseGcpConnectionSchema = BaseAppConnectionSchema.extend({ app: z.literal(AppConnection.GCP) });
export const GcpConnectionSchema = z.intersection(
BaseGcpConnectionSchema,
z.discriminatedUnion("method", [
z.object({
method: z.literal(GcpConnectionMethod.ServiceAccountImpersonation),
credentials: GcpConnectionServiceAccountImpersonationCredentialsSchema
})
])
);
export const SanitizedGcpConnectionSchema = z.discriminatedUnion("method", [
BaseGcpConnectionSchema.extend({
method: z.literal(GcpConnectionMethod.ServiceAccountImpersonation),
credentials: GcpConnectionServiceAccountImpersonationCredentialsSchema.pick({})
})
]);
export const ValidateGcpConnectionCredentialsSchema = z.discriminatedUnion("method", [
z.object({
method: z
.literal(GcpConnectionMethod.ServiceAccountImpersonation)
.describe(AppConnections?.CREATE(AppConnection.GCP).method),
credentials: GcpConnectionServiceAccountImpersonationCredentialsSchema.describe(
AppConnections.CREATE(AppConnection.GCP).credentials
)
})
]);
export const CreateGcpConnectionSchema = ValidateGcpConnectionCredentialsSchema.and(
GenericCreateAppConnectionFieldsSchema(AppConnection.GCP)
);
export const UpdateGcpConnectionSchema = z
.object({
credentials: GcpConnectionServiceAccountImpersonationCredentialsSchema.optional().describe(
AppConnections.UPDATE(AppConnection.GCP).credentials
)
})
.and(GenericUpdateAppConnectionFieldsSchema(AppConnection.GCP));
export const GcpConnectionListItemSchema = z.object({
name: z.literal("GCP"),
app: z.literal(AppConnection.GCP),
// the below is preferable but currently breaks with our zod to json schema parser
// methods: z.tuple([z.literal(GitHubConnectionMethod.App), z.literal(GitHubConnectionMethod.OAuth)]),
methods: z.nativeEnum(GcpConnectionMethod).array()
});

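Illustrative sketch (not part of the diff): a body that should satisfy `CreateGcpConnectionSchema` as defined above. The `method` and `credentials` fields come from the schemas in the hunk; the `name` field is assumed to be contributed by `GenericCreateAppConnectionFieldsSchema`, which is not shown here, and the email is a placeholder.

const createGcpConnectionBody = {
  name: "gcp-secret-manager-connection", // assumed generic create field
  method: "service-account-impersonation", // GcpConnectionMethod.ServiceAccountImpersonation
  credentials: {
    serviceAccountEmail: "service-account-9df9f3a3-b4ac@customer-project.iam.gserviceaccount.com"
  }
};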

@ -0,0 +1,29 @@
import { OrgServiceActor } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import { getGcpSecretManagerProjects } from "./gcp-connection-fns";
import { TGcpConnection } from "./gcp-connection-types";
type TGetAppConnectionFunc = (
app: AppConnection,
connectionId: string,
actor: OrgServiceActor
) => Promise<TGcpConnection>;
export const gcpConnectionService = (getAppConnection: TGetAppConnectionFunc) => {
const listSecretManagerProjects = async (connectionId: string, actor: OrgServiceActor) => {
const appConnection = await getAppConnection(AppConnection.GCP, connectionId, actor);
try {
const projects = await getGcpSecretManagerProjects(appConnection);
return projects;
} catch (error) {
return [];
}
};
return {
listSecretManagerProjects
};
};


@ -0,0 +1,45 @@
import z from "zod";
import { DiscriminativePick } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import {
CreateGcpConnectionSchema,
GcpConnectionSchema,
ValidateGcpConnectionCredentialsSchema
} from "./gcp-connection-schemas";
export type TGcpConnection = z.infer<typeof GcpConnectionSchema>;
export type TGcpConnectionInput = z.infer<typeof CreateGcpConnectionSchema> & {
app: AppConnection.GCP;
};
export type TValidateGcpConnectionCredentials = typeof ValidateGcpConnectionCredentialsSchema;
export type TGcpConnectionConfig = DiscriminativePick<TGcpConnectionInput, "method" | "app" | "credentials"> & {
orgId: string;
};
export type GCPApp = {
projectNumber: string;
projectId: string;
lifecycleState: "ACTIVE" | "LIFECYCLE_STATE_UNSPECIFIED" | "DELETE_REQUESTED" | "DELETE_IN_PROGRESS";
name: string;
createTime: string;
parent: {
type: "organization" | "folder" | "project";
id: string;
};
};
export type GCPGetProjectsRes = {
projects: GCPApp[];
nextPageToken?: string;
};
export type GCPGetServiceRes = {
name: string;
parent: string;
state: "ENABLED" | "DISABLED" | "STATE_UNSPECIFIED";
};


@ -0,0 +1,4 @@
export * from "./gcp-connection-enums";
export * from "./gcp-connection-fns";
export * from "./gcp-connection-schemas";
export * from "./gcp-connection-types";


@ -57,6 +57,12 @@ export const getTokenConfig = (tokenType: TokenType) => {
const expiresAt = new Date(new Date().getTime() + 86400000); const expiresAt = new Date(new Date().getTime() + 86400000);
return { token, expiresAt }; return { token, expiresAt };
} }
case TokenType.TOKEN_EMAIL_PASSWORD_SETUP: {
// generate random hex
const token = crypto.randomBytes(16).toString("hex");
const expiresAt = new Date(new Date().getTime() + 86400000);
return { token, expiresAt };
}
case TokenType.TOKEN_USER_UNLOCK: { case TokenType.TOKEN_USER_UNLOCK: {
const token = crypto.randomBytes(16).toString("hex"); const token = crypto.randomBytes(16).toString("hex");
const expiresAt = new Date(new Date().getTime() + 259200000); const expiresAt = new Date(new Date().getTime() + 259200000);


@ -6,6 +6,7 @@ export enum TokenType {
TOKEN_EMAIL_MFA = "emailMfa", TOKEN_EMAIL_MFA = "emailMfa",
TOKEN_EMAIL_ORG_INVITATION = "organizationInvitation", TOKEN_EMAIL_ORG_INVITATION = "organizationInvitation",
TOKEN_EMAIL_PASSWORD_RESET = "passwordReset", TOKEN_EMAIL_PASSWORD_RESET = "passwordReset",
TOKEN_EMAIL_PASSWORD_SETUP = "passwordSetup",
TOKEN_USER_UNLOCK = "userUnlock" TOKEN_USER_UNLOCK = "userUnlock"
} }


@ -4,6 +4,8 @@ import jwt from "jsonwebtoken";
import { SecretEncryptionAlgo, SecretKeyEncoding } from "@app/db/schemas"; import { SecretEncryptionAlgo, SecretKeyEncoding } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env"; import { getConfig } from "@app/lib/config/env";
import { generateSrpServerKey, srpCheckClientProof } from "@app/lib/crypto"; import { generateSrpServerKey, srpCheckClientProof } from "@app/lib/crypto";
import { BadRequestError } from "@app/lib/errors";
import { OrgServiceActor } from "@app/lib/types";
import { TAuthTokenServiceFactory } from "../auth-token/auth-token-service"; import { TAuthTokenServiceFactory } from "../auth-token/auth-token-service";
import { TokenType } from "../auth-token/auth-token-types"; import { TokenType } from "../auth-token/auth-token-types";
@ -11,8 +13,13 @@ import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { TTotpConfigDALFactory } from "../totp/totp-config-dal";
import { TUserDALFactory } from "../user/user-dal";
import { TAuthDALFactory } from "./auth-dal";
- import { TChangePasswordDTO, TCreateBackupPrivateKeyDTO, TResetPasswordViaBackupKeyDTO } from "./auth-password-type";
- import { AuthTokenType } from "./auth-type";
+ import {
+   TChangePasswordDTO,
+   TCreateBackupPrivateKeyDTO,
+   TResetPasswordViaBackupKeyDTO,
+   TSetupPasswordViaBackupKeyDTO
+ } from "./auth-password-type";
+ import { ActorType, AuthMethod, AuthTokenType } from "./auth-type";

type TAuthPasswordServiceFactoryDep = {
  authDAL: TAuthDALFactory;
@ -169,8 +176,13 @@ export const authPaswordServiceFactory = ({
    verifier,
    encryptedPrivateKeyIV,
    encryptedPrivateKeyTag,
-   userId
+   userId,
+   password
  }: TResetPasswordViaBackupKeyDTO) => {
+   const cfg = getConfig();
+   const hashedPassword = await bcrypt.hash(password, cfg.BCRYPT_SALT_ROUND);
    await userDAL.updateUserEncryptionByUserId(userId, {
      encryptionVersion: 2,
      protectedKey,
@ -180,7 +192,8 @@ export const authPaswordServiceFactory = ({
      iv: encryptedPrivateKeyIV,
      tag: encryptedPrivateKeyTag,
      salt,
-     verifier
+     verifier,
+     hashedPassword
    });

    await userDAL.updateById(userId, {
@ -267,6 +280,108 @@ export const authPaswordServiceFactory = ({
    return backupKey;
  };
const sendPasswordSetupEmail = async (actor: OrgServiceActor) => {
if (actor.type !== ActorType.USER)
throw new BadRequestError({ message: `Actor of type ${actor.type} cannot set password` });
const user = await userDAL.findById(actor.id);
if (!user) throw new BadRequestError({ message: `Could not find user with ID ${actor.id}` });
if (!user.isAccepted || !user.authMethods)
throw new BadRequestError({ message: `You must complete signup to set a password` });
const cfg = getConfig();
const token = await tokenService.createTokenForUser({
type: TokenType.TOKEN_EMAIL_PASSWORD_SETUP,
userId: user.id
});
const email = user.email ?? user.username;
await smtpService.sendMail({
template: SmtpTemplates.SetupPassword,
recipients: [email],
subjectLine: "Infisical Password Setup",
substitutions: {
email,
token,
callback_url: cfg.SITE_URL ? `${cfg.SITE_URL}/password-setup` : ""
}
});
};
const setupPassword = async (
{
encryptedPrivateKey,
protectedKeyTag,
protectedKey,
protectedKeyIV,
salt,
verifier,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
password,
token
}: TSetupPasswordViaBackupKeyDTO,
actor: OrgServiceActor
) => {
try {
await tokenService.validateTokenForUser({
type: TokenType.TOKEN_EMAIL_PASSWORD_SETUP,
userId: actor.id,
code: token
});
} catch (e) {
throw new BadRequestError({ message: "Expired or invalid token. Please try again." });
}
await userDAL.transaction(async (tx) => {
const user = await userDAL.findById(actor.id, tx);
if (!user) throw new BadRequestError({ message: `Could not find user with ID ${actor.id}` });
if (!user.isAccepted || !user.authMethods)
throw new BadRequestError({ message: `You must complete signup to set a password` });
if (!user.authMethods.includes(AuthMethod.EMAIL)) {
await userDAL.updateById(
actor.id,
{
authMethods: [...user.authMethods, AuthMethod.EMAIL]
},
tx
);
}
const cfg = getConfig();
const hashedPassword = await bcrypt.hash(password, cfg.BCRYPT_SALT_ROUND);
await userDAL.updateUserEncryptionByUserId(
actor.id,
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier,
hashedPassword,
serverPrivateKey: null,
clientPublicKey: null
},
tx
);
});
await tokenService.revokeAllMySessions(actor.id);
};
  return {
    generateServerPubKey,
    changePassword,
@ -274,6 +389,8 @@ export const authPaswordServiceFactory = ({
    sendPasswordResetEmail,
    verifyPasswordResetEmail,
    createBackupPrivateKey,
-   getBackupPrivateKeyOfUser
+   getBackupPrivateKeyOfUser,
+   sendPasswordSetupEmail,
+   setupPassword
  };
};

View File

@ -23,6 +23,20 @@ export type TResetPasswordViaBackupKeyDTO = {
  encryptedPrivateKeyTag: string;
  salt: string;
  verifier: string;
+ password: string;
};
export type TSetupPasswordViaBackupKeyDTO = {
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
encryptedPrivateKey: string;
encryptedPrivateKeyIV: string;
encryptedPrivateKeyTag: string;
salt: string;
verifier: string;
password: string;
token: string;
};

export type TCreateBackupPrivateKeyDTO = {

View File

@ -126,11 +126,11 @@ export const identityAwsAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );
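This same change is repeated for the Azure, GCP, JWT, Kubernetes, OIDC, token, and universal auth services below: instead of always passing an options object whose expiresIn may be undefined, the whole third argument becomes conditional, since jsonwebtoken validates an expiresIn key that is present even when its value is undefined. A minimal sketch of the pattern, with a hypothetical ttlSeconds parameter standing in for the identity token TTL:

import jwt from "jsonwebtoken";

const signIdentityToken = (payload: object, secret: string, ttlSeconds: number) =>
  jwt.sign(
    payload,
    secret,
    // for non-expiring tokens, omit the options argument entirely rather than passing { expiresIn: undefined }
    ttlSeconds === 0 ? undefined : { expiresIn: ttlSeconds }
  );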

View File

@ -99,11 +99,11 @@ export const identityAzureAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -138,11 +138,11 @@ export const identityGcpAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -212,11 +212,11 @@ export const identityJwtAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -229,11 +229,11 @@ export const identityKubernetesAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -194,11 +194,11 @@ export const identityOidcAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -328,11 +328,11 @@ export const identityTokenAuthServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -129,11 +129,11 @@ export const identityUaServiceFactory = ({
      authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
    } as TIdentityAccessTokenJwtPayload,
    appCfg.AUTH_SECRET,
-   {
-     expiresIn:
-       Number(identityAccessToken.accessTokenMaxTTL) === 0
-         ? undefined
-         : Number(identityAccessToken.accessTokenMaxTTL)
-   }
+   // akhilmhdh: for non-expiry tokens you should not even set the value, including undefined. Even for undefined jsonwebtoken throws error
+   Number(identityAccessToken.accessTokenTTL) === 0
+     ? undefined
+     : {
+         expiresIn: Number(identityAccessToken.accessTokenTTL)
+       }
  );

View File

@ -132,16 +132,26 @@ const getAppsHeroku = async ({ accessToken }: { accessToken: string }) => {
/**
 * Return list of names of apps for Vercel integration
+ * This is re-used for getting custom environments for Vercel
 */
- const getAppsVercel = async ({ accessToken, teamId }: { teamId?: string | null; accessToken: string }) => {
-   const apps: Array<{ name: string; appId: string }> = [];
+ export const getAppsVercel = async ({ accessToken, teamId }: { teamId?: string | null; accessToken: string }) => {
+   const apps: Array<{ name: string; appId: string; customEnvironments: Array<{ slug: string; id: string }> }> = [];

  const limit = "20";
  let hasMorePages = true;
  let next: number | null = null;

  interface Response {
-   projects: { name: string; id: string }[];
+   projects: {
+     name: string;
+     id: string;
+     customEnvironments?: {
+       id: string;
+       type: string;
+       description: string;
+       slug: string;
+     }[];
+   }[];
    pagination: {
      count: number;
      next: number | null;
@ -173,7 +183,12 @@ const getAppsVercel = async ({ accessToken, teamId }: { teamId?: string | null;
  data.projects.forEach((a) => {
    apps.push({
      name: a.name,
-     appId: a.id
+     appId: a.id,
+     customEnvironments:
+       a.customEnvironments?.map((env) => ({
+         slug: env.slug,
+         id: env.id
+       })) ?? []
    });
  });

View File

@ -25,11 +25,12 @@ import { TIntegrationDALFactory } from "../integration/integration-dal";
import { TKmsServiceFactory } from "../kms/kms-service";
import { KmsDataKey } from "../kms/kms-types";
import { TProjectBotServiceFactory } from "../project-bot/project-bot-service";
- import { getApps } from "./integration-app-list";
+ import { getApps, getAppsVercel } from "./integration-app-list";
import { TCircleCIContext } from "./integration-app-types";
import { TIntegrationAuthDALFactory } from "./integration-auth-dal";
import { IntegrationAuthMetadataSchema, TIntegrationAuthMetadata } from "./integration-auth-schema";
import {
+ GetVercelCustomEnvironmentsDTO,
  OctopusDeployScope,
  TBitbucketEnvironment,
  TBitbucketWorkspace,
@ -1825,6 +1826,41 @@ export const integrationAuthServiceFactory = ({
    return integrationAuthDAL.create(newIntegrationAuth);
  };
const getVercelCustomEnvironments = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
teamId,
id
}: GetVercelCustomEnvironmentsDTO) => {
const integrationAuth = await integrationAuthDAL.findById(id);
if (!integrationAuth) throw new NotFoundError({ message: `Integration auth with ID '${id}' not found` });
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: integrationAuth.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SecretManager
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
const { botKey, shouldUseSecretV2Bridge } = await projectBotService.getBotKey(integrationAuth.projectId);
const { accessToken } = await getIntegrationAccessToken(integrationAuth, shouldUseSecretV2Bridge, botKey);
const vercelApps = await getAppsVercel({
accessToken,
teamId
});
return vercelApps.map((app) => ({
customEnvironments: app.customEnvironments,
appId: app.appId
}));
};
  const getOctopusDeploySpaces = async ({
    actorId,
    actor,
@ -1944,6 +1980,7 @@ export const integrationAuthServiceFactory = ({
    getIntegrationAccessToken,
    duplicateIntegrationAuth,
    getOctopusDeploySpaces,
-   getOctopusDeployScopeValues
+   getOctopusDeployScopeValues,
+   getVercelCustomEnvironments
  };
};

View File

@ -284,3 +284,8 @@ export type TOctopusDeployVariableSet = {
    Self: string;
  };
};
export type GetVercelCustomEnvironmentsDTO = {
teamId: string;
id: string;
} & Omit<TProjectPermission, "projectId">;

View File

@ -1450,9 +1450,13 @@ const syncSecretsVercel = async ({
  secrets: Record<string, { value: string; comment?: string } | null>;
  accessToken: string;
}) => {
+ const isCustomEnvironment = !["development", "preview", "production"].includes(
+   integration.targetEnvironment as string
+ );
  interface VercelSecret {
    id?: string;
    type: string;
+   customEnvironmentIds?: string[];
    key: string;
    value: string;
    target: string[];
@ -1486,6 +1490,16 @@ const syncSecretsVercel = async ({
      }
    )
  ).data.envs.filter((secret) => {
+   if (isCustomEnvironment) {
+     if (!secret.customEnvironmentIds?.includes(integration.targetEnvironment as string)) {
+       // case: secret does not have the same custom environment
+       return false;
+     }
+     // no need to check for preview environment, as custom environments are not available in preview
+     return true;
+   }
    if (!secret.target.includes(integration.targetEnvironment as string)) {
      // case: secret does not have the same target environment
      return false;
@ -1583,7 +1597,13 @@ const syncSecretsVercel = async ({
      key,
      value: infisicalSecrets[key]?.value,
      type: "encrypted",
-     target: [integration.targetEnvironment as string],
+     ...(isCustomEnvironment
+       ? {
+           customEnvironmentIds: [integration.targetEnvironment as string]
+         }
+       : {
+           target: [integration.targetEnvironment as string]
+         }),
      ...(integration.path
        ? {
            gitBranch: integration.path
@ -1607,9 +1627,19 @@ const syncSecretsVercel = async ({
      key,
      value: infisicalSecrets[key]?.value,
      type: res[key].type,
+     ...(!isCustomEnvironment
+       ? {
          target: res[key].target.includes(integration.targetEnvironment as string)
            ? [...res[key].target]
-           : [...res[key].target, integration.targetEnvironment as string],
+           : [...res[key].target, integration.targetEnvironment as string]
+         }
+       : {
+           customEnvironmentIds: res[key].customEnvironmentIds?.includes(integration.targetEnvironment as string)
+             ? [...(res[key].customEnvironmentIds || [])]
+             : [...(res[key]?.customEnvironmentIds || []), integration.targetEnvironment as string]
+         }),
      ...(integration.path
        ? {
            gitBranch: integration.path

View File

@ -0,0 +1,10 @@
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { TSecretSyncListItem } from "@app/services/secret-sync/secret-sync-types";
export const GCP_SYNC_LIST_OPTION: TSecretSyncListItem = {
name: "GCP Secret Manager",
destination: SecretSync.GCPSecretManager,
connection: AppConnection.GCP,
canImportSecrets: true
};

View File

@ -0,0 +1,3 @@
export enum GcpSyncScope {
Global = "global"
}

View File

@ -0,0 +1,218 @@
import { AxiosError } from "axios";
import { request } from "@app/lib/config/request";
import { logger } from "@app/lib/logger";
import { getGcpConnectionAuthToken } from "@app/services/app-connection/gcp";
import { IntegrationUrls } from "@app/services/integration-auth/integration-list";
import { SecretSyncError } from "../secret-sync-errors";
import { TSecretMap } from "../secret-sync-types";
import {
GCPLatestSecretVersionAccess,
GCPSecret,
GCPSMListSecretsRes,
TGcpSyncWithCredentials
} from "./gcp-sync-types";
const getGcpSecrets = async (accessToken: string, secretSync: TGcpSyncWithCredentials) => {
const { destinationConfig } = secretSync;
let gcpSecrets: GCPSecret[] = [];
const pageSize = 100;
let pageToken: string | undefined;
let hasMorePages = true;
while (hasMorePages) {
const params = new URLSearchParams({
pageSize: String(pageSize),
...(pageToken ? { pageToken } : {})
});
// eslint-disable-next-line no-await-in-loop
const { data: secretsRes } = await request.get<GCPSMListSecretsRes>(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${secretSync.destinationConfig.projectId}/secrets`,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
if (secretsRes.secrets) {
gcpSecrets = gcpSecrets.concat(secretsRes.secrets);
}
if (!secretsRes.nextPageToken) {
hasMorePages = false;
}
pageToken = secretsRes.nextPageToken;
}
const res: { [key: string]: string } = {};
for await (const gcpSecret of gcpSecrets) {
const arr = gcpSecret.name.split("/");
const key = arr[arr.length - 1];
try {
const { data: secretLatest } = await request.get<GCPLatestSecretVersionAccess>(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${destinationConfig.projectId}/secrets/${key}/versions/latest:access`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
res[key] = Buffer.from(secretLatest.payload.data, "base64").toString("utf-8");
} catch (error) {
// when a secret in GCP has no versions, we treat it as if it's a blank value
if (error instanceof AxiosError && error.response?.status === 404) {
res[key] = "";
} else {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
}
return res;
};
export const GcpSyncFns = {
syncSecrets: async (secretSync: TGcpSyncWithCredentials, secretMap: TSecretMap) => {
const { destinationConfig, connection } = secretSync;
const accessToken = await getGcpConnectionAuthToken(connection);
const gcpSecrets = await getGcpSecrets(accessToken, secretSync);
for await (const key of Object.keys(secretMap)) {
try {
// we do not process secrets with no value because GCP secret manager does not allow it
if (!secretMap[key].value) {
// eslint-disable-next-line no-continue
continue;
}
if (!(key in gcpSecrets)) {
// case: create secret
await request.post(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${destinationConfig.projectId}/secrets`,
{
replication: {
automatic: {}
}
},
{
params: {
secretId: key
},
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
await request.post(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${destinationConfig.projectId}/secrets/${key}:addVersion`,
{
payload: {
data: Buffer.from(secretMap[key].value).toString("base64")
}
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
}
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
for await (const key of Object.keys(gcpSecrets)) {
try {
if (!(key in secretMap) || !secretMap[key].value) {
// case: delete secret
await request.delete(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${destinationConfig.projectId}/secrets/${key}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
} else if (secretMap[key].value !== gcpSecrets[key]) {
if (!secretMap[key].value) {
logger.warn(
`syncSecretsGcpsecretManager: update secret value in gcp where [key=${key}] and [projectId=${destinationConfig.projectId}]`
);
}
await request.post(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${destinationConfig.projectId}/secrets/${key}:addVersion`,
{
payload: {
data: Buffer.from(secretMap[key].value).toString("base64")
}
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
}
} catch (error) {
throw new SecretSyncError({
error,
secretKey: key
});
}
}
},
getSecrets: async (secretSync: TGcpSyncWithCredentials): Promise<TSecretMap> => {
const { connection } = secretSync;
const accessToken = await getGcpConnectionAuthToken(connection);
const gcpSecrets = await getGcpSecrets(accessToken, secretSync);
return Object.fromEntries(Object.entries(gcpSecrets).map(([key, value]) => [key, { value: value ?? "" }]));
},
removeSecrets: async (secretSync: TGcpSyncWithCredentials, secretMap: TSecretMap) => {
const { destinationConfig, connection } = secretSync;
const accessToken = await getGcpConnectionAuthToken(connection);
const gcpSecrets = await getGcpSecrets(accessToken, secretSync);
for await (const [key] of Object.entries(gcpSecrets)) {
if (key in secretMap) {
await request.delete(
`${IntegrationUrls.GCP_SECRET_MANAGER_URL}/v1/projects/${destinationConfig.projectId}/secrets/${key}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
}
}
}
};

View File

@ -0,0 +1,45 @@
import z from "zod";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
BaseSecretSyncSchema,
GenericCreateSecretSyncFieldsSchema,
GenericUpdateSecretSyncFieldsSchema
} from "@app/services/secret-sync/secret-sync-schemas";
import { TSyncOptionsConfig } from "@app/services/secret-sync/secret-sync-types";
import { SecretSync } from "../secret-sync-enums";
import { GcpSyncScope } from "./gcp-sync-enums";
const GcpSyncOptionsConfig: TSyncOptionsConfig = { canImportSecrets: true };
const GcpSyncDestinationConfigSchema = z.object({
scope: z.literal(GcpSyncScope.Global),
projectId: z.string().min(1, "Project ID is required")
});
export const GcpSyncSchema = BaseSecretSyncSchema(SecretSync.GCPSecretManager, GcpSyncOptionsConfig).extend({
destination: z.literal(SecretSync.GCPSecretManager),
destinationConfig: GcpSyncDestinationConfigSchema
});
export const CreateGcpSyncSchema = GenericCreateSecretSyncFieldsSchema(
SecretSync.GCPSecretManager,
GcpSyncOptionsConfig
).extend({
destinationConfig: GcpSyncDestinationConfigSchema
});
export const UpdateGcpSyncSchema = GenericUpdateSecretSyncFieldsSchema(
SecretSync.GCPSecretManager,
GcpSyncOptionsConfig
).extend({
destinationConfig: GcpSyncDestinationConfigSchema.optional()
});
export const GcpSyncListItemSchema = z.object({
name: z.literal("GCP Secret Manager"),
connection: z.literal(AppConnection.GCP),
destination: z.literal(SecretSync.GCPSecretManager),
canImportSecrets: z.literal(true)
});

View File

@ -0,0 +1,33 @@
import z from "zod";
import { TGcpConnection } from "@app/services/app-connection/gcp";
import { CreateGcpSyncSchema, GcpSyncListItemSchema, GcpSyncSchema } from "./gcp-sync-schemas";
export type TGcpSyncListItem = z.infer<typeof GcpSyncListItemSchema>;
export type TGcpSync = z.infer<typeof GcpSyncSchema>;
export type TGcpSyncInput = z.infer<typeof CreateGcpSyncSchema>;
export type TGcpSyncWithCredentials = TGcpSync & {
connection: TGcpConnection;
};
export type GCPSecret = {
name: string;
createTime: string;
};
export type GCPSMListSecretsRes = {
secrets?: GCPSecret[];
totalSize?: number;
nextPageToken?: string;
};
export type GCPLatestSecretVersionAccess = {
name: string;
payload: {
data: string;
};
};

View File

@ -0,0 +1,4 @@
export * from "./gcp-sync-constants";
export * from "./gcp-sync-enums";
export * from "./gcp-sync-schemas";
export * from "./gcp-sync-types";

View File

@ -123,7 +123,6 @@ export const secretSyncDALFactory = (
  };

  const create = async (data: Parameters<(typeof secretSyncOrm)["create"]>[0]) => {
-   try {
      const secretSync = (await secretSyncOrm.transaction(async (tx) => {
        const sync = await secretSyncOrm.create(data, tx);
@ -139,13 +138,9 @@ export const secretSyncDALFactory = (
        ? await folderDAL.findSecretPathByFolderIds(secretSync.projectId, [secretSync.folderId])
        : [];

      return expandSecretSync(secretSync, folderWithPath);
-   } catch (error) {
-     throw new DatabaseError({ error, name: "Create - Secret Sync" });
-   }
  };

  const updateById = async (syncId: string, data: Parameters<(typeof secretSyncOrm)["updateById"]>[1]) => {
-   try {
      const secretSync = (await secretSyncOrm.transaction(async (tx) => {
        const sync = await secretSyncOrm.updateById(syncId, data, tx);
@ -161,9 +156,6 @@ export const secretSyncDALFactory = (
        ? await folderDAL.findSecretPathByFolderIds(secretSync.projectId, [secretSync.folderId])
        : [];

      return expandSecretSync(secretSync, folderWithPath);
-   } catch (error) {
-     throw new DatabaseError({ error, name: "Update by ID - Secret Sync" });
-   }
  };

  const findOne = async (filter: Parameters<(typeof secretSyncOrm)["findOne"]>[0], tx?: Knex) => {

View File

@ -1,6 +1,7 @@
export enum SecretSync {
  AWSParameterStore = "aws-parameter-store",
- GitHub = "github"
+ GitHub = "github",
+ GCPSecretManager = "gcp-secret-manager"
}

export enum SecretSyncInitialSyncBehavior {

View File

@ -13,9 +13,13 @@ import {
  TSecretSyncWithCredentials
} from "@app/services/secret-sync/secret-sync-types";
+ import { GCP_SYNC_LIST_OPTION } from "./gcp";
+ import { GcpSyncFns } from "./gcp/gcp-sync-fns";

const SECRET_SYNC_LIST_OPTIONS: Record<SecretSync, TSecretSyncListItem> = {
  [SecretSync.AWSParameterStore]: AWS_PARAMETER_STORE_SYNC_LIST_OPTION,
- [SecretSync.GitHub]: GITHUB_SYNC_LIST_OPTION
+ [SecretSync.GitHub]: GITHUB_SYNC_LIST_OPTION,
+ [SecretSync.GCPSecretManager]: GCP_SYNC_LIST_OPTION
};

export const listSecretSyncOptions = () => {
@ -71,6 +75,8 @@ export const SecretSyncFns = {
      return AwsParameterStoreSyncFns.syncSecrets(secretSync, secretMap);
    case SecretSync.GitHub:
      return GithubSyncFns.syncSecrets(secretSync, secretMap);
+   case SecretSync.GCPSecretManager:
+     return GcpSyncFns.syncSecrets(secretSync, secretMap);
    default:
      throw new Error(
        `Unhandled sync destination for sync secrets fns: ${(secretSync as TSecretSyncWithCredentials).destination}`
@ -86,6 +92,9 @@ export const SecretSyncFns = {
    case SecretSync.GitHub:
      secretMap = await GithubSyncFns.getSecrets(secretSync);
      break;
+   case SecretSync.GCPSecretManager:
+     secretMap = await GcpSyncFns.getSecrets(secretSync);
+     break;
    default:
      throw new Error(
        `Unhandled sync destination for get secrets fns: ${(secretSync as TSecretSyncWithCredentials).destination}`
@ -103,6 +112,8 @@ export const SecretSyncFns = {
      return AwsParameterStoreSyncFns.removeSecrets(secretSync, secretMap);
    case SecretSync.GitHub:
      return GithubSyncFns.removeSecrets(secretSync, secretMap);
+   case SecretSync.GCPSecretManager:
+     return GcpSyncFns.removeSecrets(secretSync, secretMap);
    default:
      throw new Error(
        `Unhandled sync destination for remove secrets fns: ${(secretSync as TSecretSyncWithCredentials).destination}`
@ -115,7 +126,7 @@ export const parseSyncErrorMessage = (err: unknown): string => {
  if (err instanceof SecretSyncError) {
    return JSON.stringify({
      secretKey: err.secretKey,
-     error: err.message ?? parseSyncErrorMessage(err.error)
+     error: err.message || parseSyncErrorMessage(err.error)
    });
  }
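For context on the change from ?? to || above: nullish coalescing only falls back when the left-hand side is null or undefined, while logical OR also falls back for an empty string, so a SecretSyncError constructed with an empty message now defers to the parsed inner error. A minimal sketch of the difference:

const fallback = "parsed inner error";
console.log("" ?? fallback); // "" (an empty message would be kept as-is)
console.log("" || fallback); // "parsed inner error" (an empty message falls through)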

View File

@ -3,10 +3,12 @@ import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
export const SECRET_SYNC_NAME_MAP: Record<SecretSync, string> = {
  [SecretSync.AWSParameterStore]: "AWS Parameter Store",
- [SecretSync.GitHub]: "GitHub"
+ [SecretSync.GitHub]: "GitHub",
+ [SecretSync.GCPSecretManager]: "GCP Secret Manager"
};

export const SECRET_SYNC_CONNECTION_MAP: Record<SecretSync, AppConnection> = {
  [SecretSync.AWSParameterStore]: AppConnection.AWS,
- [SecretSync.GitHub]: AppConnection.GitHub
+ [SecretSync.GitHub]: AppConnection.GitHub,
+ [SecretSync.GCPSecretManager]: AppConnection.GCP
};

View File

@ -8,7 +8,8 @@ import {
  ProjectPermissionSub
} from "@app/ee/services/permission/project-permission";
import { KeyStorePrefixes, TKeyStoreFactory } from "@app/keystore/keystore";
+ import { DatabaseErrorCode } from "@app/lib/error-codes";
- import { BadRequestError, NotFoundError } from "@app/lib/errors";
+ import { BadRequestError, DatabaseError, NotFoundError } from "@app/lib/errors";
import { OrgServiceActor } from "@app/lib/types";
import { TAppConnectionServiceFactory } from "@app/services/app-connection/app-connection-service";
import { TProjectBotServiceFactory } from "@app/services/project-bot/project-bot-service";
@ -197,37 +198,26 @@ export const secretSyncServiceFactory = ({
    // validates permission to connect and app is valid for sync destination
    await appConnectionService.connectAppConnectionById(destinationApp, params.connectionId, actor);

-   const secretSync = await secretSyncDAL.transaction(async (tx) => {
-     const isConflictingName = Boolean(
-       (
-         await secretSyncDAL.find(
-           {
-             name: params.name,
-             projectId
-           },
-           tx
-         )
-       ).length
-     );
-     if (isConflictingName)
-       throw new BadRequestError({
-         message: `A Secret Sync with the name "${params.name}" already exists for the project with ID "${folder.projectId}"`
-       });
-     const sync = await secretSyncDAL.create({
+   try {
+     const secretSync = await secretSyncDAL.create({
        folderId: folder.id,
        ...params,
        ...(params.isAutoSyncEnabled && { syncStatus: SecretSyncStatus.Pending }),
        projectId
      });
-     return sync;
-   });

      if (secretSync.isAutoSyncEnabled) await secretSyncQueue.queueSecretSyncSyncSecretsById({ syncId: secretSync.id });

      return secretSync as TSecretSync;
+   } catch (err) {
+     if (err instanceof DatabaseError && (err.error as { code: string })?.code === DatabaseErrorCode.UniqueViolation) {
+       throw new BadRequestError({
+         message: `A Secret Sync with the name "${params.name}" already exists for the project with ID "${folder.projectId}"`
+       });
+     }
+     throw err;
+   }
  };

  const updateSecretSync = async (
@ -260,7 +250,6 @@ export const secretSyncServiceFactory = ({
      message: `Secret sync with ID "${secretSync.id}" is not configured for ${SECRET_SYNC_NAME_MAP[destination]}`
    });

-   const updatedSecretSync = await secretSyncDAL.transaction(async (tx) => {
    let { folderId } = secretSync;

    if (params.connectionId) {
@ -298,40 +287,28 @@ export const secretSyncServiceFactory = ({
      folderId = newFolder.id;
    }

-   if (params.name && secretSync.name !== params.name) {
-     const isConflictingName = Boolean(
-       (
-         await secretSyncDAL.find(
-           {
-             name: params.name,
-             projectId: secretSync.projectId
-           },
-           tx
-         )
-       ).length
-     );
-     if (isConflictingName)
-       throw new BadRequestError({
-         message: `A Secret Sync with the name "${params.name}" already exists for project with ID "${secretSync.projectId}"`
-       });
-   }

    const isAutoSyncEnabled = params.isAutoSyncEnabled ?? secretSync.isAutoSyncEnabled;

-   const updatedSync = await secretSyncDAL.updateById(syncId, {
+   try {
+     const updatedSecretSync = await secretSyncDAL.updateById(syncId, {
        ...params,
        ...(isAutoSyncEnabled && folderId && { syncStatus: SecretSyncStatus.Pending }),
        folderId
      });
-     return updatedSync;
-   });

      if (updatedSecretSync.isAutoSyncEnabled)
        await secretSyncQueue.queueSecretSyncSyncSecretsById({ syncId: secretSync.id });

      return updatedSecretSync as TSecretSync;
+   } catch (err) {
+     if (err instanceof DatabaseError && (err.error as { code: string })?.code === DatabaseErrorCode.UniqueViolation) {
+       throw new BadRequestError({
+         message: `A Secret Sync with the name "${params.name}" already exists for the project with ID "${secretSync.projectId}"`
+       });
+     }
+     throw err;
+   }
  };

  const deleteSecretSync = async (

View File

@ -17,14 +17,18 @@ import {
  TAwsParameterStoreSyncListItem,
  TAwsParameterStoreSyncWithCredentials
} from "./aws-parameter-store";
+ import { TGcpSync, TGcpSyncInput, TGcpSyncListItem, TGcpSyncWithCredentials } from "./gcp";

- export type TSecretSync = TAwsParameterStoreSync | TGitHubSync;
+ export type TSecretSync = TAwsParameterStoreSync | TGitHubSync | TGcpSync;

- export type TSecretSyncWithCredentials = TAwsParameterStoreSyncWithCredentials | TGitHubSyncWithCredentials;
+ export type TSecretSyncWithCredentials =
+   | TAwsParameterStoreSyncWithCredentials
+   | TGitHubSyncWithCredentials
+   | TGcpSyncWithCredentials;

- export type TSecretSyncInput = TAwsParameterStoreSyncInput | TGitHubSyncInput;
+ export type TSecretSyncInput = TAwsParameterStoreSyncInput | TGitHubSyncInput | TGcpSyncInput;

- export type TSecretSyncListItem = TAwsParameterStoreSyncListItem | TGitHubSyncListItem;
+ export type TSecretSyncListItem = TAwsParameterStoreSyncListItem | TGitHubSyncListItem | TGcpSyncListItem;

export type TSyncOptionsConfig = {
  canImportSecrets: boolean;

View File

@ -30,6 +30,7 @@ export enum SmtpTemplates {
  NewDeviceJoin = "newDevice.handlebars",
  OrgInvite = "organizationInvitation.handlebars",
  ResetPassword = "passwordReset.handlebars",
+ SetupPassword = "passwordSetup.handlebars",
  SecretLeakIncident = "secretLeakIncident.handlebars",
  WorkspaceInvite = "workspaceInvitation.handlebars",
  ScimUserProvisioned = "scimUserProvisioned.handlebars",

View File

@ -0,0 +1,17 @@
<html>
<head>
<meta charset="utf-8" />
<meta http-equiv="x-ua-compatible" content="ie=edge" />
<title>Password Setup</title>
</head>
<body>
<h2>Setup your password</h2>
<p>Someone requested to set up a password for your account.</p>
<p><strong>Make sure you are already logged in to Infisical in the current browser before clicking the link below.</strong></p>
<a href="{{callback_url}}?token={{token}}&to={{email}}">Setup password</a>
<p>If you didn't initiate this request, please contact
{{#if isCloud}}us immediately at team@infisical.com.{{else}}your administrator immediately.{{/if}}</p>
{{emailFooter}}
</body>
</html>

View File

@ -110,7 +110,7 @@ var secretsCmd = &cobra.Command{
    if plainOutput {
      for _, secret := range secrets {
-       fmt.Println(secret.Value)
+       fmt.Println(fmt.Sprintf("%s=%s", secret.Key, secret.Value))
      }
    } else {
      visualize.PrintAllSecretDetails(secrets)

View File

@ -0,0 +1,67 @@
---
title: "How to write a design document"
sidebarTitle: "Writing Design Docs"
description: "Learn how to write a design document at Infisical"
---
## **Why write a design document?**
Writing a design document helps you efficiently solve broad, complex engineering problems at Infisical. While planning is important, we are a startup, so speed and urgency should be top of mind. Keep the process lightweight and time-boxed so that we get the most out of it.
**Writing a design doc will help you:**
- **Understand the problem space:** Deeply understand the problem you're solving to make sure it is well scoped.
- **Stay on the right path:** Without proper planning, you risk cycling between partial implementation and replanning, encountering roadblocks that force you back to square one. A solid plan minimizes wasted engineering hours.
- **Collaborate:** Bring relevant engineers into the discussion to develop well-thought-out solutions and catch potential issues you might have overlooked.
- **Implement faster:** A well-thought-out plan helps you catch roadblocks early and ship quickly because you know exactly what needs to be implemented.
**When to write a design document:**
- **Write a design doc**: If the feature is not well defined, is high-security, or will take more than **1 full engineering week** to build.
- **Skip the design doc**: For small, straightforward features that can be built quickly with informal discussions.
If you are unsure whether to create a design doc, chat with @maidul.
## **What to Include in your Design Document**
Every feature/problem is unique, but your design docs should generally include the following sections. If you need to include additional sections, feel free to do so.
1. **Title**
- A descriptive title.
- Name of document owner and name of reviewer(s).
2. **Overview**
- A high-level summary of the problem and proposed solution. Keep it brief (max 3 paragraphs).
3. **Context**
- Explain the problem's background, why it's important to solve now, and any constraints (e.g., technical, sales, or timeline-related). What do we get out of solving this problem (e.g., needed to close a deal, scale, performance)?
4. **Solution**
- Provide a big-picture explanation of the solution, followed by detailed technical architecture.
- Use diagrams/charts where needed.
- Write clearly so that another engineer could implement the solution in your absence.
5. **Milestones**
- Break the project into phases with clear start and end date estimates. Use a table or bullet points.
6. **FAQ**
- Common questions or concerns someone might have while reading your document that can be quickly addressed.
## **How to Write a Design Doc**
- **Keep it Simple**: Use clear, simple language. Opt for short sentences, bullet points, and concrete examples over fluff.
- **Use Visuals**: Add diagrams and charts to convey your ideas clearly.
- **Make it Self-Explanatory**: Ensure that anyone reading the document can understand and implement the plan without needing additional context.
Before sharing your design doc with others, review it as if you were a teammate seeing it for the first time. Anticipate questions and address them.
## **Process from start to finish**
1. **Research/Discuss**
- Before you start writing, take some time to research and get a solid understanding of the problem space. Look into whether and how other well-established companies are tackling similar challenges.
Talk through the problem and your initial solution with other engineers on the team; bounce ideas around and get their feedback. If you have ideas on how the system could be implemented in Infisical, think through whether it would affect any downstream features or systems.
Once you've got a general direction, you might need to test some theories. This is where quick proofs of concept (POCs) come in handy, but don't get too caught up in the details. The goal of a POC is simply to validate a core idea or concept so you can get on with the rest of your planning.
2. **Write the Doc**
- Based on your research and discussions, write the design doc and include all relevant sections. Your goal is to make a convincing case for why your plan is the correct way to solve the problem at hand.
3. **Assign Reviewers**
- Ask one or more relevant engineers to review your document. Their role is to identify blind spots, challenge assumptions, and ensure everything is clear. Once you and the reviewers are on the same page about the approach, update the document with any missing details they brought up.
4. **Team Review and Feedback**
- Invite the relevant engineers to a design doc review meeting and give them 10-15 minutes to read through the document. After everyone has had a chance to review it, open the floor for discussion. Address any feedback or concerns raised during this meeting. If significant points were overlooked during your initial planning, you may need to go back to the drawing board. Your goal is to think about the feature holistically and minimize the need for drastic changes to your design doc later on.

View File

@ -66,7 +66,8 @@
    {
      "group": "Engineering",
      "pages": [
-       "documentation/engineering/oncall"
+       "documentation/engineering/oncall",
+       "documentation/engineering/how-to-write-design-doc"
      ]
    }
  ]

View File

@ -193,6 +193,17 @@ services:
      - openldap
    profiles: [ldap]
keycloak:
image: quay.io/keycloak/keycloak:26.1.0
restart: always
environment:
- KC_BOOTSTRAP_ADMIN_PASSWORD=admin
- KC_BOOTSTRAP_ADMIN_USERNAME=admin
command: start-dev
ports:
- 8088:8080
profiles: [ sso ]
volumes:
  postgres-data:
    driver: local
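Because the keycloak service is gated behind the sso profile, it only starts when that profile is requested explicitly, for example with docker compose --profile sso up -d keycloak; the admin console is then reachable at http://localhost:8088 with the bootstrap credentials defined above (assuming this compose file is the one used for local development).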

View File

@ -20,7 +20,8 @@ SITE_URL=http://localhost:8080
# Mail/SMTP
SMTP_HOST=
SMTP_PORT=
- SMTP_NAME=
+ SMTP_FROM_ADDRESS=
+ SMTP_FROM_NAME=
SMTP_USERNAME=
SMTP_PASSWORD=

View File

@ -0,0 +1,4 @@
---
title: "Available"
openapi: "GET /api/v1/app-connections/gcp/available"
---

View File

@ -0,0 +1,10 @@
---
title: "Create"
openapi: "POST /api/v1/app-connections/gcp"
---
<Note>
Check out the configuration docs for [GCP
Connections](/integrations/app-connections/gcp) to learn how to obtain the
required credentials.
</Note>

View File

@ -0,0 +1,4 @@
---
title: "Delete"
openapi: "DELETE /api/v1/app-connections/gcp/{connectionId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Get by ID"
openapi: "GET /api/v1/app-connections/gcp/{connectionId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Get by Name"
openapi: "GET /api/v1/app-connections/gcp/connection-name/{connectionName}"
---

View File

@ -0,0 +1,4 @@
---
title: "List"
openapi: "GET /api/v1/app-connections/gcp"
---

View File

@ -0,0 +1,10 @@
---
title: "Update"
openapi: "PATCH /api/v1/app-connections/gcp/{connectionId}"
---
<Note>
Check out the configuration docs for [GCP
Connections](/integrations/app-connections/gcp) to learn how to obtain the
required credentials.
</Note>

View File

@ -0,0 +1,4 @@
---
title: "Create"
openapi: "POST /api/v1/secret-syncs/gcp-secret-manager"
---

View File

@ -0,0 +1,4 @@
---
title: "Delete"
openapi: "DELETE /api/v1/secret-syncs/gcp-secret-manager/{syncId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Get by ID"
openapi: "GET /api/v1/secret-syncs/gcp-secret-manager/{syncId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Get by Name"
openapi: "GET /api/v1/secret-syncs/gcp-secret-manager/sync-name/{syncName}"
---

View File

@ -0,0 +1,4 @@
---
title: "Import Secrets"
openapi: "POST /api/v1/secret-syncs/gcp-secret-manager/{syncId}/import-secrets"
---

View File

@ -0,0 +1,4 @@
---
title: "List"
openapi: "GET /api/v1/secret-syncs/gcp-secret-manager"
---

Some files were not shown because too many files have changed in this diff.