Compare commits

..

117 Commits

Author SHA1 Message Date
6ceeccf583 Update kubernetes.mdx 2024-07-16 05:25:30 +02:00
78f4c0f002 Update Chart.yaml 2024-07-16 04:46:32 +02:00
6cff2f0437 Update values.yaml 2024-07-16 04:46:24 +02:00
6cefb180d6 Update SDK and go mod tidy 2024-07-16 04:44:32 +02:00
59a44155c5 Azure resource 2024-07-16 04:43:53 +02:00
d0ad9c6b17 Update sample.yaml 2024-07-16 04:43:46 +02:00
58a406b114 Update secrets.infisical.com_infisicalsecrets.yaml 2024-07-16 04:43:42 +02:00
8a85695dc5 Custom azure resource 2024-07-16 04:43:38 +02:00
de67c0ad9f Merge pull request #2110 from akhilmhdh/feat/folder-improvement-tf
New folder endpoints for terraform
2024-07-15 21:40:24 -04:00
b8d11d31a6 Merge pull request #2130 from Infisical/handbook-update
updated hiring handbook
2024-07-15 21:38:33 -04:00
d630ceaffe updated hiring handbook 2024-07-15 17:19:56 -07:00
a89e60f296 Merge pull request #2129 from Infisical/maidul-sddwqdwdwqe123
Remove org read check on project fetch
2024-07-15 16:38:21 -04:00
a5d9abf1c8 remove org read check on projects fetch 2024-07-15 16:33:11 -04:00
d97dea2573 Merge pull request #2128 from Infisical/misc/removed-aws-global-config-update
misc: moved aws creds to constructor
2024-07-16 02:38:57 +08:00
bc58f6b988 misc: moved aws creds to constructor 2024-07-16 02:31:31 +08:00
ed8e3f34fb Merge pull request #2095 from akhilmhdh/feat/aws-kms-sm
aws kms support base setup
2024-07-15 12:47:30 -04:00
91315c88c3 Merge pull request #2124 from Infisical/misc/displayed-project-name-in-slack-webhook
misc: displayed project name in slack webhook
2024-07-16 00:18:42 +08:00
9267f881d6 misc: displayed project name in slack webhook 2024-07-15 16:49:06 +08:00
c90ecd336c Merge pull request #2122 from Infisical/fix-scim-groups
Fix SCIM PATCH /Groups Fn
2024-07-15 00:57:46 +07:00
d8b1da3ddd Fix SCIM patch group fn 2024-07-15 00:29:18 +07:00
58e86382fe Merge pull request #2119 from Infisical/update-used-count-on-prem
Update dynamic seat count on prem
2024-07-14 20:13:42 +07:00
2080c4419e Update dynamic seat count on prem 2024-07-14 14:25:32 +07:00
b582a4a06d Merge pull request #2117 from DDDASHXD/sebastian/ui-fix
Fix: Features card overflow
2024-07-12 21:05:56 +02:00
a5c6a864de Fix: Features card overflow 2024-07-12 18:57:08 +00:00
5082c1ba3b Merge pull request #2115 from Infisical/misc/soft-delete-shared-secrets
misc: soft delete shared secrets upon expiry
2024-07-12 12:56:26 -04:00
cceb08b1b5 Merge pull request #2109 from Infisical/misc/address-ws-vulnerability-via-package-update
misc: addressed ws vulnerability via package update
2024-07-12 12:20:15 -04:00
4c34e58945 Merge pull request #2116 from Infisical/dependabot/npm_and_yarn/frontend/axios-0.28.0
build(deps): bump axios from 0.27.2 to 0.28.0 in /frontend
2024-07-12 19:17:08 +05:30
72de1901a1 build(deps): bump axios from 0.27.2 to 0.28.0 in /frontend
Bumps [axios](https://github.com/axios/axios) from 0.27.2 to 0.28.0.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v0.28.0/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v0.27.2...v0.28.0)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-12 13:37:13 +00:00
65fefcdd87 Merge pull request #2114 from Infisical/dependabot/npm_and_yarn/frontend/postcss-8.4.39
build(deps): bump postcss from 8.4.14 to 8.4.39 in /frontend
2024-07-12 19:03:59 +05:30
8e753eda72 misc: soft delete shared secrets 2024-07-12 21:29:17 +08:00
7137c94fa2 build(deps): bump postcss from 8.4.14 to 8.4.39 in /frontend
Bumps [postcss](https://github.com/postcss/postcss) from 8.4.14 to 8.4.39.
- [Release notes](https://github.com/postcss/postcss/releases)
- [Changelog](https://github.com/postcss/postcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/postcss/postcss/compare/8.4.14...8.4.39)

---
updated-dependencies:
- dependency-name: postcss
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-12 13:24:54 +00:00
52ea7dfa61 Merge pull request #2113 from Infisical/dependabot/go_modules/cli/github.com/rs/cors-1.11.0
build(deps): bump github.com/rs/cors from 1.9.0 to 1.11.0 in /cli
2024-07-12 18:52:25 +05:30
093925ed0e build(deps): bump github.com/rs/cors from 1.9.0 to 1.11.0 in /cli
Bumps [github.com/rs/cors](https://github.com/rs/cors) from 1.9.0 to 1.11.0.
- [Commits](https://github.com/rs/cors/compare/v1.9.0...v1.11.0)

---
updated-dependencies:
- dependency-name: github.com/rs/cors
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-12 13:20:45 +00:00
4491f2d8f1 Merge pull request #2111 from Infisical/dependabot/go_modules/cli/github.com/dvsekhvalnov/jose2go-1.6.0
build(deps): bump github.com/dvsekhvalnov/jose2go from 1.5.0 to 1.6.0 in /cli
2024-07-12 14:44:32 +05:30
4a401957c7 build(deps): bump github.com/dvsekhvalnov/jose2go in /cli
Bumps [github.com/dvsekhvalnov/jose2go](https://github.com/dvsekhvalnov/jose2go) from 1.5.0 to 1.6.0.
- [Commits](https://github.com/dvsekhvalnov/jose2go/compare/v1.5...v1.6.0)

---
updated-dependencies:
- dependency-name: github.com/dvsekhvalnov/jose2go
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-12 09:11:42 +00:00
539785acae docs: updated api reference docs for the new folder endpoint 2024-07-12 14:29:19 +05:30
3c63346d3a feat: new get-by-id for folder for tf 2024-07-12 14:24:13 +05:30
0c673f6cca misc: addressed ws vulnerability via package update 2024-07-12 15:36:49 +08:00
10f4cbf11f Merge pull request #2107 from Infisical/feat/add-share-secret-hide-unhide
feat: add share-secret hide and unhide
2024-07-12 14:57:02 +08:00
a6a8c32326 Merge pull request #2106 from Infisical/docs/identity-github-oidc-auth
doc: added docs for connecting github to infisical via oidc auth
2024-07-12 10:26:08 +07:00
99a474dba7 Merge pull request #2091 from Infisical/misc/moved-admin-user-deletion-to-pro
misc: moved admin user deletion to pro plan
2024-07-11 13:28:33 -04:00
e439f4e5aa Update UserPanel.tsx 2024-07-11 13:25:48 -04:00
ae2ecf1540 Merge pull request #2100 from Infisical/misc/add-ttl-max-value-for-identities
misc: add max checks for TTL values of identities
2024-07-11 13:21:53 -04:00
10214ea5dc misc: renamed button label 2024-07-12 00:45:16 +08:00
918cd414a8 feat: add share-secret hide and unhide 2024-07-12 00:42:26 +08:00
f9a125acee misc: updated limit to 10 years 2024-07-11 23:40:45 +08:00
52415ea83e misc: removed redundant text' 2024-07-11 23:30:55 +08:00
c5ca2b6796 doc: added docs for connecting github to infisical via oidc auth 2024-07-11 23:00:45 +08:00
ef5bcac925 Merge pull request #2103 from Infisical/move-groups
Consolidate People and Groups Tabs to shared User Tab at Org / Project Level
2024-07-11 18:58:12 +07:00
6cbeb29b4e Merge remote-tracking branch 'origin/main' into misc/add-ttl-max-value-for-identities 2024-07-11 19:17:25 +08:00
fbe344c0df Fix token auth ref 2024-07-11 14:56:57 +07:00
5821f65a63 Fix token auth ref 2024-07-11 14:56:08 +07:00
3af510d487 Merge pull request #2104 from Infisical/fix-token-auth-ref
Fix Token Auth Ref in Access Token DAL
2024-07-11 14:54:35 +07:00
c15adc7df9 Fix token auth ref 2024-07-11 14:49:22 +07:00
93af7573ac Consolidate people and groups tabs to user / user groups shared tab 2024-07-11 13:35:11 +07:00
cddda1148e misc: added max ttl checks for native auths 2024-07-11 14:05:50 +08:00
9c37eeeda6 misc: finalize form validation for universal auth ttl 2024-07-11 13:48:18 +08:00
eadf5bef77 misc: add TTL max values for universal auth 2024-07-11 13:35:58 +08:00
5dff46ee3a Add missing token auth to access token findOne fn 2024-07-11 10:59:08 +07:00
8b202c2a79 Merge pull request #2099 from Infisical/identity-improvements
Identity Workflow Improvements (Table Menu Opts, Error Handling)
2024-07-11 10:45:20 +07:00
4574519a76 Update identity table opts, identity project table error handling 2024-07-11 10:36:00 +07:00
82ee77bc05 Merge pull request #2093 from Infisical/doc/add-native-auth-to-docs
doc: added native auths to api reference
2024-07-11 09:46:26 +07:00
9a861499df Merge pull request #2097 from Infisical/secret-sharing-ui-update
update phrasing
2024-07-10 18:38:32 -04:00
17c7207f9d doc: added oidc auth api reference 2024-07-11 02:04:44 +08:00
d1f3c98f21 fix posthog cross orgin calls 2024-07-10 13:46:22 -04:00
d248a6166c Merge remote-tracking branch 'origin/main' into doc/add-native-auth-to-docs 2024-07-11 01:31:50 +08:00
8fdd82a335 Add token auth to api reference 2024-07-10 23:37:36 +07:00
c501c85eb8 misc: renamed to more generic label 2024-07-11 00:14:34 +08:00
eac621db73 Merge pull request #2096 from Infisical/identity-project-provisioning
Identities Page - Project Provisioning / De-provisioning
2024-07-10 23:08:21 +07:00
ab7983973e update phrasing 2024-07-10 08:06:06 -07:00
ff43773f37 Merge pull request #2088 from Infisical/feat/move-secrets
feat: move secrets
2024-07-10 21:48:57 +08:00
68574be05b Fix merge conflicts 2024-07-10 18:18:00 +07:00
1d9966af76 Add admin to display identity table 2024-07-10 18:15:39 +07:00
4dddf764bd Finish identity page project provisioning/deprovisioning 2024-07-10 18:14:34 +07:00
2d9435457d misc: addressed typo 2024-07-10 18:58:42 +08:00
5d4c7c2cbf feat: added encrypt/decrypt with key for kms service and changed kms encrytion to hoc to avoid back to back db calls 2024-07-10 15:23:02 +05:30
8b06215366 Merge pull request #2055 from Infisical/feat/oidc-identity
feat: oidc machine identity auth method
2024-07-10 17:29:56 +08:00
08f0bf9c67 feat: fixed migration down missing orgid 2024-07-10 14:47:44 +05:30
654dd97793 feat: external kms router defined not plugged in 2024-07-10 12:48:36 +05:30
2e7baf8c89 feat: added external kms router but not connected with the server yet 2024-07-10 12:48:35 +05:30
7ca7a95070 feat: kms service changes for db change 2024-07-10 12:48:35 +05:30
71c49c8b90 feat: kms db schema changes to support external and internal kms uniformly 2024-07-10 12:48:35 +05:30
4fab746b95 misc: added description to native auth properties 2024-07-10 15:17:23 +08:00
179edd98bf misc: rolled back frontend package-lock 2024-07-10 13:45:35 +08:00
dc05b34fb1 misc: rolled back package locks 2024-07-10 13:41:33 +08:00
899757ab7c doc: added native auth to api reference 2024-07-10 13:30:06 +08:00
20f6dbfbd1 Update oidc docs image 2024-07-10 12:09:24 +07:00
8ff524a037 Move migration file to front 2024-07-10 11:51:53 +07:00
3913e2f462 Fix merge conflicts, bring oidc auth up to speed with identity ui changes 2024-07-10 10:31:48 +07:00
9832915eba add .? incase adminUserDeletion is empty 2024-07-09 21:09:55 -04:00
ebbccdb857 add better label for identity id 2024-07-09 18:13:06 -04:00
b98c8629e5 misc: moved admin user deletion to pro 2024-07-09 23:51:09 +08:00
28723e9a4e misc: updated toast 2024-07-09 23:32:26 +08:00
079e005f49 misc: added audit log and overwrite feature 2024-07-09 21:46:12 +08:00
df90e4e6f0 Update go version in k8s dockerfile 2024-07-09 09:44:52 -04:00
6e9a624697 Merge pull request #2090 from Infisical/daniel/operator-bump-sdk
fix(operator): azure auth
2024-07-09 09:32:39 -04:00
94b0cb4697 Bump helm 2024-07-09 15:31:47 +02:00
5a5226c82f Update values.yaml 2024-07-09 15:27:11 +02:00
09cfaec175 Update infisicalsecret_controller.go 2024-07-09 15:27:11 +02:00
40abc184f2 Fix: Bump SDK version 2024-07-09 15:27:11 +02:00
3879edfab7 Merge pull request #2089 from Infisical/docs-token-auth
Update identity docs for auth token and newer identity flow
2024-07-09 16:51:44 +07:00
d20ae39f32 feat: initial move secret integration 2024-07-09 17:48:39 +08:00
53c875424e Update identity docs for auth token and newer identity flow 2024-07-09 16:47:24 +07:00
05bf2e4696 made move operation transactional 2024-07-09 16:03:50 +08:00
a06dee66f8 feat: initial logic for moving secrets 2024-07-09 15:20:58 +08:00
0eab9233bb Merge pull request #2076 from Infisical/misc/redesigned-org-security-settings
misc: redesigned org security settings page
2024-07-09 15:05:34 +08:00
9bf358a57d Merge pull request #2057 from Infisical/token-auth
Token Authentication Method + Revamped Identity (Page) Workflow
2024-07-09 11:55:56 +07:00
a37987b508 misc: added proper error handling 2024-07-08 21:31:11 +08:00
d4a2f4590b misc: redesigned org security settings page 2024-07-06 00:25:27 +08:00
8bc6edd165 doc: added general docs for oidc auth 2024-07-04 22:31:03 +08:00
2497aada8a misc: added oidc auth to access token trusted Ips 2024-07-04 15:54:37 +08:00
5921f349a8 misc: removed comment 2024-07-04 12:42:36 +08:00
4927cc804a feat: added endpoint for oidc auth revocation 2024-07-04 00:53:24 +08:00
2153dd94eb feat: finished up login with identity oidc 2024-07-03 23:48:31 +08:00
08322f46f9 misc: setup audit logs for oidc identity 2024-07-03 21:38:13 +08:00
fc9326272a feat: finished up oidc auth management 2024-07-03 21:18:50 +08:00
c90e423e4a feat: initial setup 2024-07-03 02:06:02 +08:00
227 changed files with 9535 additions and 2338 deletions

backend/package-lock.json (generated, 1119 changes)
File diff suppressed because it is too large

View File

@@ -106,6 +106,7 @@
},
"dependencies": {
"@aws-sdk/client-iam": "^3.525.0",
"@aws-sdk/client-kms": "^3.609.0",
"@aws-sdk/client-secrets-manager": "^3.504.0",
"@aws-sdk/client-sts": "^3.600.0",
"@casl/ability": "^6.5.0",
@@ -125,8 +126,8 @@
"@peculiar/asn1-schema": "^2.3.8",
"@peculiar/x509": "^1.10.0",
"@serdnam/pino-cloudwatch-transport": "^1.0.4",
"@team-plain/typescript-sdk": "^4.6.1",
"@sindresorhus/slugify": "1.1.0",
"@team-plain/typescript-sdk": "^4.6.1",
"@ucast/mongo2js": "^1.3.4",
"ajv": "^8.12.0",
"argon2": "^0.31.2",
@@ -148,6 +149,7 @@
"jmespath": "^0.16.0",
"jsonwebtoken": "^9.0.2",
"jsrp": "^0.2.4",
"jwks-rsa": "^3.1.0",
"knex": "^3.0.1",
"ldapjs": "^3.0.7",
"libsodium-wrappers": "^0.7.13",

View File

@@ -9,6 +9,7 @@ import { TAuditLogStreamServiceFactory } from "@app/ee/services/audit-log-stream
import { TCertificateAuthorityCrlServiceFactory } from "@app/ee/services/certificate-authority-crl/certificate-authority-crl-service";
import { TDynamicSecretServiceFactory } from "@app/ee/services/dynamic-secret/dynamic-secret-service";
import { TDynamicSecretLeaseServiceFactory } from "@app/ee/services/dynamic-secret-lease/dynamic-secret-lease-service";
import { TExternalKmsServiceFactory } from "@app/ee/services/external-kms/external-kms-service";
import { TGroupServiceFactory } from "@app/ee/services/group/group-service";
import { TIdentityProjectAdditionalPrivilegeServiceFactory } from "@app/ee/services/identity-project-additional-privilege/identity-project-additional-privilege-service";
import { TLdapConfigServiceFactory } from "@app/ee/services/ldap-config/ldap-config-service";
@@ -41,6 +42,7 @@ import { TIdentityAwsAuthServiceFactory } from "@app/services/identity-aws-auth/
import { TIdentityAzureAuthServiceFactory } from "@app/services/identity-azure-auth/identity-azure-auth-service";
import { TIdentityGcpAuthServiceFactory } from "@app/services/identity-gcp-auth/identity-gcp-auth-service";
import { TIdentityKubernetesAuthServiceFactory } from "@app/services/identity-kubernetes-auth/identity-kubernetes-auth-service";
import { TIdentityOidcAuthServiceFactory } from "@app/services/identity-oidc-auth/identity-oidc-auth-service";
import { TIdentityProjectServiceFactory } from "@app/services/identity-project/identity-project-service";
import { TIdentityTokenAuthServiceFactory } from "@app/services/identity-token-auth/identity-token-auth-service";
import { TIdentityUaServiceFactory } from "@app/services/identity-ua/identity-ua-service";
@@ -135,6 +137,7 @@ declare module "fastify" {
identityGcpAuth: TIdentityGcpAuthServiceFactory;
identityAwsAuth: TIdentityAwsAuthServiceFactory;
identityAzureAuth: TIdentityAzureAuthServiceFactory;
identityOidcAuth: TIdentityOidcAuthServiceFactory;
accessApprovalPolicy: TAccessApprovalPolicyServiceFactory;
accessApprovalRequest: TAccessApprovalRequestServiceFactory;
secretApprovalPolicy: TSecretApprovalPolicyServiceFactory;
@@ -161,6 +164,7 @@ declare module "fastify" {
secretSharing: TSecretSharingServiceFactory;
rateLimit: TRateLimitServiceFactory;
userEngagement: TUserEngagementServiceFactory;
externalKms: TExternalKmsServiceFactory;
};
// this is exclusive use for middlewares in which we need to inject data
// everywhere else access using service layer

View File

@@ -59,6 +59,9 @@ import {
TDynamicSecrets,
TDynamicSecretsInsert,
TDynamicSecretsUpdate,
TExternalKms,
TExternalKmsInsert,
TExternalKmsUpdate,
TGitAppInstallSessions,
TGitAppInstallSessionsInsert,
TGitAppInstallSessionsUpdate,
@@ -92,6 +95,9 @@ import {
TIdentityKubernetesAuths,
TIdentityKubernetesAuthsInsert,
TIdentityKubernetesAuthsUpdate,
TIdentityOidcAuths,
TIdentityOidcAuthsInsert,
TIdentityOidcAuthsUpdate,
TIdentityOrgMemberships,
TIdentityOrgMembershipsInsert,
TIdentityOrgMembershipsUpdate,
@@ -122,6 +128,9 @@ import {
TIntegrations,
TIntegrationsInsert,
TIntegrationsUpdate,
TInternalKms,
TInternalKmsInsert,
TInternalKmsUpdate,
TKmsKeys,
TKmsKeysInsert,
TKmsKeysUpdate,
@@ -483,6 +492,11 @@ declare module "knex/types/tables" {
TIdentityAzureAuthsInsert,
TIdentityAzureAuthsUpdate
>;
[TableName.IdentityOidcAuth]: KnexOriginal.CompositeTableType<
TIdentityOidcAuths,
TIdentityOidcAuthsInsert,
TIdentityOidcAuthsUpdate
>;
[TableName.IdentityUaClientSecret]: KnexOriginal.CompositeTableType<
TIdentityUaClientSecrets,
TIdentityUaClientSecretsInsert,
@@ -648,6 +662,8 @@ declare module "knex/types/tables" {
TKmsRootConfigInsert,
TKmsRootConfigUpdate
>;
[TableName.InternalKms]: KnexOriginal.CompositeTableType<TInternalKms, TInternalKmsInsert, TInternalKmsUpdate>;
[TableName.ExternalKms]: KnexOriginal.CompositeTableType<TExternalKms, TExternalKmsInsert, TExternalKmsUpdate>;
[TableName.KmsKey]: KnexOriginal.CompositeTableType<TKmsKeys, TKmsKeysInsert, TKmsKeysUpdate>;
[TableName.KmsKeyVersion]: KnexOriginal.CompositeTableType<
TKmsKeyVersions,

View File

@@ -0,0 +1,256 @@
import slugify from "@sindresorhus/slugify";
import { Knex } from "knex";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TableName } from "../schemas";
const createInternalKmsTableAndBackfillData = async (knex: Knex) => {
const doesOldKmsKeyTableExist = await knex.schema.hasTable(TableName.KmsKey);
const doesInternalKmsTableExist = await knex.schema.hasTable(TableName.InternalKms);
// building the internal kms table by filling from old kms table
if (doesOldKmsKeyTableExist && !doesInternalKmsTableExist) {
await knex.schema.createTable(TableName.InternalKms, (tb) => {
tb.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
tb.binary("encryptedKey").notNullable();
tb.string("encryptionAlgorithm").notNullable();
tb.integer("version").defaultTo(1).notNullable();
tb.uuid("kmsKeyId").unique().notNullable();
tb.foreign("kmsKeyId").references("id").inTable(TableName.KmsKey).onDelete("CASCADE");
});
// copy the old kms and backfill
const oldKmsKey = await knex(TableName.KmsKey).select("version", "encryptedKey", "encryptionAlgorithm", "id");
if (oldKmsKey.length) {
await knex(TableName.InternalKms).insert(
oldKmsKey.map((el) => ({
encryptionAlgorithm: el.encryptionAlgorithm,
encryptedKey: el.encryptedKey,
kmsKeyId: el.id,
version: el.version
}))
);
}
}
};
const renameKmsKeyVersionTableAsInternalKmsKeyVersion = async (knex: Knex) => {
const doesOldKmsKeyVersionTableExist = await knex.schema.hasTable(TableName.KmsKeyVersion);
const doesNewKmsKeyVersionTableExist = await knex.schema.hasTable(TableName.InternalKmsKeyVersion);
if (doesOldKmsKeyVersionTableExist && !doesNewKmsKeyVersionTableExist) {
// because we haven't started using versioning for kms thus no data exist
await knex.schema.renameTable(TableName.KmsKeyVersion, TableName.InternalKmsKeyVersion);
const hasKmsKeyIdColumn = await knex.schema.hasColumn(TableName.InternalKmsKeyVersion, "kmsKeyId");
const hasInternalKmsIdColumn = await knex.schema.hasColumn(TableName.InternalKmsKeyVersion, "internalKmsId");
await knex.schema.alterTable(TableName.InternalKmsKeyVersion, (tb) => {
if (hasKmsKeyIdColumn) tb.dropColumn("kmsKeyId");
if (!hasInternalKmsIdColumn) {
tb.uuid("internalKmsId").notNullable();
tb.foreign("internalKmsId").references("id").inTable(TableName.InternalKms).onDelete("CASCADE");
}
});
}
};
const createExternalKmsKeyTable = async (knex: Knex) => {
const doesExternalKmsServiceExist = await knex.schema.hasTable(TableName.ExternalKms);
if (!doesExternalKmsServiceExist) {
await knex.schema.createTable(TableName.ExternalKms, (tb) => {
tb.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
tb.string("provider").notNullable();
tb.binary("encryptedProviderInputs").notNullable();
tb.string("status");
tb.string("statusDetails");
tb.uuid("kmsKeyId").unique().notNullable();
tb.foreign("kmsKeyId").references("id").inTable(TableName.KmsKey).onDelete("CASCADE");
});
}
};
const removeNonRequiredFieldsFromKmsKeyTableAndBackfillRequiredData = async (knex: Knex) => {
const doesOldKmsKeyTableExist = await knex.schema.hasTable(TableName.KmsKey);
// building the internal kms table by filling from old kms table
if (doesOldKmsKeyTableExist) {
const hasSlugColumn = await knex.schema.hasColumn(TableName.KmsKey, "slug");
const hasEncryptedKeyColumn = await knex.schema.hasColumn(TableName.KmsKey, "encryptedKey");
const hasEncryptionAlgorithmColumn = await knex.schema.hasColumn(TableName.KmsKey, "encryptionAlgorithm");
const hasVersionColumn = await knex.schema.hasColumn(TableName.KmsKey, "version");
const hasTimestamps = await knex.schema.hasColumn(TableName.KmsKey, "createdAt");
const hasProjectId = await knex.schema.hasColumn(TableName.KmsKey, "projectId");
const hasOrgId = await knex.schema.hasColumn(TableName.KmsKey, "orgId");
await knex.schema.alterTable(TableName.KmsKey, (tb) => {
if (!hasSlugColumn) tb.string("slug", 32);
if (hasEncryptedKeyColumn) tb.dropColumn("encryptedKey");
if (hasEncryptionAlgorithmColumn) tb.dropColumn("encryptionAlgorithm");
if (hasVersionColumn) tb.dropColumn("version");
if (!hasTimestamps) tb.timestamps(true, true, true);
});
// backfill all org id in kms key because its gonna be changed to non nullable
if (hasProjectId && hasOrgId) {
await knex(TableName.KmsKey)
.whereNull("orgId")
.update({
// eslint-disable-next-line
// @ts-ignore because generate schema happens after this
orgId: knex(TableName.Project)
.select("orgId")
.where("id", knex.raw("??", [`${TableName.KmsKey}.projectId`]))
});
}
// backfill slugs in kms
const missingSlugs = await knex(TableName.KmsKey).whereNull("slug").select("id");
if (missingSlugs.length) {
await knex(TableName.KmsKey)
// eslint-disable-next-line
// @ts-ignore because generate schema happens after this
.insert(missingSlugs.map(({ id }) => ({ id, slug: slugify(alphaNumericNanoId(8).toLowerCase()) })))
.onConflict("id")
.merge();
}
await knex.schema.alterTable(TableName.KmsKey, (tb) => {
if (hasOrgId) tb.uuid("orgId").notNullable().alter();
tb.string("slug", 32).notNullable().alter();
if (hasProjectId) tb.dropColumn("projectId");
if (hasOrgId) tb.unique(["orgId", "slug"]);
});
}
};
/*
* The goal for this migration is split the existing kms key into three table
* the kms-key table would be a container table that contains
* the internal kms key table and external kms table
*/
export async function up(knex: Knex): Promise<void> {
await createInternalKmsTableAndBackfillData(knex);
await renameKmsKeyVersionTableAsInternalKmsKeyVersion(knex);
await removeNonRequiredFieldsFromKmsKeyTableAndBackfillRequiredData(knex);
await createExternalKmsKeyTable(knex);
const doesOrgKmsKeyExist = await knex.schema.hasColumn(TableName.Organization, "kmsDefaultKeyId");
if (!doesOrgKmsKeyExist) {
await knex.schema.alterTable(TableName.Organization, (tb) => {
tb.uuid("kmsDefaultKeyId").nullable();
tb.foreign("kmsDefaultKeyId").references("id").inTable(TableName.KmsKey);
});
}
const doesProjectKmsSecretManagerKeyExist = await knex.schema.hasColumn(TableName.Project, "kmsSecretManagerKeyId");
if (!doesProjectKmsSecretManagerKeyExist) {
await knex.schema.alterTable(TableName.Project, (tb) => {
tb.uuid("kmsSecretManagerKeyId").nullable();
tb.foreign("kmsSecretManagerKeyId").references("id").inTable(TableName.KmsKey);
});
}
}
const renameInternalKmsKeyVersionBackToKmsKeyVersion = async (knex: Knex) => {
const doesInternalKmsKeyVersionTableExist = await knex.schema.hasTable(TableName.InternalKmsKeyVersion);
const doesKmsKeyVersionTableExist = await knex.schema.hasTable(TableName.KmsKeyVersion);
if (doesInternalKmsKeyVersionTableExist && !doesKmsKeyVersionTableExist) {
// because we haven't started using versioning for kms thus no data exist
await knex.schema.renameTable(TableName.InternalKmsKeyVersion, TableName.KmsKeyVersion);
const hasInternalKmsIdColumn = await knex.schema.hasColumn(TableName.KmsKeyVersion, "internalKmsId");
const hasKmsKeyIdColumn = await knex.schema.hasColumn(TableName.KmsKeyVersion, "kmsKeyId");
await knex.schema.alterTable(TableName.KmsKeyVersion, (tb) => {
if (hasInternalKmsIdColumn) tb.dropColumn("internalKmsId");
if (!hasKmsKeyIdColumn) {
tb.uuid("kmsKeyId").notNullable();
tb.foreign("kmsKeyId").references("id").inTable(TableName.KmsKey).onDelete("CASCADE");
}
});
}
};
const bringBackKmsKeyFields = async (knex: Knex) => {
const doesOldKmsKeyTableExist = await knex.schema.hasTable(TableName.KmsKey);
const doesInternalKmsTableExist = await knex.schema.hasTable(TableName.InternalKms);
if (doesOldKmsKeyTableExist && doesInternalKmsTableExist) {
const hasSlug = await knex.schema.hasColumn(TableName.KmsKey, "slug");
const hasEncryptedKeyColumn = await knex.schema.hasColumn(TableName.KmsKey, "encryptedKey");
const hasEncryptionAlgorithmColumn = await knex.schema.hasColumn(TableName.KmsKey, "encryptionAlgorithm");
const hasVersionColumn = await knex.schema.hasColumn(TableName.KmsKey, "version");
const hasNullableOrgId = await knex.schema.hasColumn(TableName.KmsKey, "orgId");
const hasProjectIdColumn = await knex.schema.hasColumn(TableName.KmsKey, "projectId");
await knex.schema.alterTable(TableName.KmsKey, (tb) => {
if (!hasEncryptedKeyColumn) tb.binary("encryptedKey");
if (!hasEncryptionAlgorithmColumn) tb.string("encryptionAlgorithm");
if (!hasVersionColumn) tb.integer("version").defaultTo(1);
if (hasNullableOrgId) tb.uuid("orgId").nullable().alter();
if (!hasProjectIdColumn) {
tb.string("projectId");
tb.foreign("projectId").references("id").inTable(TableName.Project).onDelete("CASCADE");
}
if (hasSlug) tb.dropColumn("slug");
});
}
};
const backfillKmsKeyFromInternalKmsTable = async (knex: Knex) => {
const doesOldKmsKeyTableExist = await knex.schema.hasTable(TableName.KmsKey);
const doesInternalKmsTableExist = await knex.schema.hasTable(TableName.InternalKms);
if (doesInternalKmsTableExist && doesOldKmsKeyTableExist) {
// backfill kms key with internal kms data
await knex(TableName.KmsKey).update({
// eslint-disable-next-line
// @ts-ignore because generate schema happens after this
encryptedKey: knex(TableName.InternalKms)
.select("encryptedKey")
.where("kmsKeyId", knex.raw("??", [`${TableName.KmsKey}.id`])),
// eslint-disable-next-line
// @ts-ignore because generate schema happens after this
encryptionAlgorithm: knex(TableName.InternalKms)
.select("encryptionAlgorithm")
.where("kmsKeyId", knex.raw("??", [`${TableName.KmsKey}.id`])),
// eslint-disable-next-line
// @ts-ignore because generate schema happens after this
projectId: knex(TableName.Project)
.select("id")
.where("kmsCertificateKeyId", knex.raw("??", [`${TableName.KmsKey}.id`]))
});
}
};
export async function down(knex: Knex): Promise<void> {
const doesOrgKmsKeyExist = await knex.schema.hasColumn(TableName.Organization, "kmsDefaultKeyId");
if (doesOrgKmsKeyExist) {
await knex.schema.alterTable(TableName.Organization, (tb) => {
tb.dropColumn("kmsDefaultKeyId");
});
}
const doesProjectKmsSecretManagerKeyExist = await knex.schema.hasColumn(TableName.Project, "kmsSecretManagerKeyId");
if (doesProjectKmsSecretManagerKeyExist) {
await knex.schema.alterTable(TableName.Project, (tb) => {
tb.dropColumn("kmsSecretManagerKeyId");
});
}
await renameInternalKmsKeyVersionBackToKmsKeyVersion(knex);
await bringBackKmsKeyFields(knex);
await backfillKmsKeyFromInternalKmsTable(knex);
const doesOldKmsKeyTableExist = await knex.schema.hasTable(TableName.KmsKey);
if (doesOldKmsKeyTableExist) {
await knex.schema.alterTable(TableName.KmsKey, (tb) => {
tb.binary("encryptedKey").notNullable().alter();
tb.string("encryptionAlgorithm").notNullable().alter();
});
}
const doesInternalKmsTableExist = await knex.schema.hasTable(TableName.InternalKms);
if (doesInternalKmsTableExist) await knex.schema.dropTable(TableName.InternalKms);
const doesExternalKmsServiceExist = await knex.schema.hasTable(TableName.ExternalKms);
if (doesExternalKmsServiceExist) await knex.schema.dropTable(TableName.ExternalKms);
}
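To orient readers of this migration: after the split, kms_keys becomes a container row while the actual key material lives in internal_kms (and provider configuration in external_kms), joined through kmsKeyId. Below is a minimal illustrative lookup, not part of this changeset; the knex instance, the kmsKeyId variable, and the column selection are assumptions.

// Hedged sketch: fetch a key together with its internal material after the migration.
const keyWithMaterial = await knex(TableName.KmsKey)
  .join(TableName.InternalKms, `${TableName.InternalKms}.kmsKeyId`, `${TableName.KmsKey}.id`)
  .where(`${TableName.KmsKey}.id`, kmsKeyId) // kmsKeyId is a placeholder
  .select(`${TableName.KmsKey}.*`, `${TableName.InternalKms}.encryptedKey`)
  .first();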

View File

@@ -0,0 +1,34 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.IdentityOidcAuth))) {
await knex.schema.createTable(TableName.IdentityOidcAuth, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.bigInteger("accessTokenTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenMaxTTL").defaultTo(7200).notNullable();
t.bigInteger("accessTokenNumUsesLimit").defaultTo(0).notNullable();
t.jsonb("accessTokenTrustedIps").notNullable();
t.uuid("identityId").notNullable().unique();
t.foreign("identityId").references("id").inTable(TableName.Identity).onDelete("CASCADE");
t.string("oidcDiscoveryUrl").notNullable();
t.text("encryptedCaCert").notNullable();
t.string("caCertIV").notNullable();
t.string("caCertTag").notNullable();
t.string("boundIssuer").notNullable();
t.string("boundAudiences").notNullable();
t.jsonb("boundClaims").notNullable();
t.string("boundSubject");
t.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.IdentityOidcAuth);
}
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.IdentityOidcAuth);
await dropOnUpdateTrigger(knex, TableName.IdentityOidcAuth);
}

View File

@@ -0,0 +1,23 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const ExternalKmsSchema = z.object({
id: z.string().uuid(),
provider: z.string(),
encryptedProviderInputs: zodBuffer,
status: z.string().nullable().optional(),
statusDetails: z.string().nullable().optional(),
kmsKeyId: z.string().uuid()
});
export type TExternalKms = z.infer<typeof ExternalKmsSchema>;
export type TExternalKmsInsert = Omit<z.input<typeof ExternalKmsSchema>, TImmutableDBKeys>;
export type TExternalKmsUpdate = Partial<Omit<z.input<typeof ExternalKmsSchema>, TImmutableDBKeys>>;

View File

@@ -0,0 +1,31 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const IdentityOidcAuthsSchema = z.object({
id: z.string().uuid(),
accessTokenTTL: z.coerce.number().default(7200),
accessTokenMaxTTL: z.coerce.number().default(7200),
accessTokenNumUsesLimit: z.coerce.number().default(0),
accessTokenTrustedIps: z.unknown(),
identityId: z.string().uuid(),
oidcDiscoveryUrl: z.string(),
encryptedCaCert: z.string(),
caCertIV: z.string(),
caCertTag: z.string(),
boundIssuer: z.string(),
boundAudiences: z.string(),
boundClaims: z.unknown(),
boundSubject: z.string().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TIdentityOidcAuths = z.infer<typeof IdentityOidcAuthsSchema>;
export type TIdentityOidcAuthsInsert = Omit<z.input<typeof IdentityOidcAuthsSchema>, TImmutableDBKeys>;
export type TIdentityOidcAuthsUpdate = Partial<Omit<z.input<typeof IdentityOidcAuthsSchema>, TImmutableDBKeys>>;

View File

@@ -17,6 +17,7 @@ export * from "./certificate-secrets";
export * from "./certificates";
export * from "./dynamic-secret-leases";
export * from "./dynamic-secrets";
export * from "./external-kms";
export * from "./git-app-install-sessions";
export * from "./git-app-org";
export * from "./group-project-membership-roles";
@@ -28,6 +29,7 @@ export * from "./identity-aws-auths";
export * from "./identity-azure-auths";
export * from "./identity-gcp-auths";
export * from "./identity-kubernetes-auths";
export * from "./identity-oidc-auths";
export * from "./identity-org-memberships";
export * from "./identity-project-additional-privilege";
export * from "./identity-project-membership-role";
@@ -38,6 +40,7 @@ export * from "./identity-universal-auths";
export * from "./incident-contacts";
export * from "./integration-auths";
export * from "./integrations";
export * from "./internal-kms";
export * from "./kms-key-versions";
export * from "./kms-keys";
export * from "./kms-root-config";

View File

@@ -0,0 +1,21 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const InternalKmsKeyVersionSchema = z.object({
id: z.string().uuid(),
encryptedKey: zodBuffer,
version: z.number(),
internalKmsId: z.string().uuid()
});
export type TInternalKmsKeyVersion = z.infer<typeof InternalKmsKeyVersionSchema>;
export type TInternalKmsKeyVersionInsert = Omit<z.input<typeof InternalKmsKeyVersionSchema>, TImmutableDBKeys>;
export type TInternalKmsKeyVersionUpdate = Partial<Omit<z.input<typeof InternalKmsKeyVersionSchema>, TImmutableDBKeys>>;

View File

@@ -0,0 +1,22 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const InternalKmsSchema = z.object({
id: z.string().uuid(),
encryptedKey: zodBuffer,
encryptionAlgorithm: z.string(),
version: z.number().default(1),
kmsKeyId: z.string().uuid()
});
export type TInternalKms = z.infer<typeof InternalKmsSchema>;
export type TInternalKmsInsert = Omit<z.input<typeof InternalKmsSchema>, TImmutableDBKeys>;
export type TInternalKmsUpdate = Partial<Omit<z.input<typeof InternalKmsSchema>, TImmutableDBKeys>>;

View File

@@ -5,20 +5,17 @@
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const KmsKeysSchema = z.object({
id: z.string().uuid(),
encryptedKey: zodBuffer,
encryptionAlgorithm: z.string(),
version: z.number().default(1),
description: z.string().nullable().optional(),
isDisabled: z.boolean().default(false).nullable().optional(),
isReserved: z.boolean().default(true).nullable().optional(),
projectId: z.string().nullable().optional(),
orgId: z.string().uuid().nullable().optional()
orgId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date(),
slug: z.string()
});
export type TKmsKeys = z.infer<typeof KmsKeysSchema>;

View File

@@ -60,6 +60,7 @@ export enum TableName {
IdentityAzureAuth = "identity_azure_auths",
IdentityUaClientSecret = "identity_ua_client_secrets",
IdentityAwsAuth = "identity_aws_auths",
IdentityOidcAuth = "identity_oidc_auths",
IdentityOrgMembership = "identity_org_memberships",
IdentityProjectMembership = "identity_project_memberships",
IdentityProjectMembershipRole = "identity_project_membership_role",
@@ -95,6 +96,10 @@ export enum TableName {
// KMS Service
KmsServerRootConfig = "kms_root_config",
KmsKey = "kms_keys",
ExternalKms = "external_kms",
InternalKms = "internal_kms",
InternalKmsKeyVersion = "internal_kms_key_version",
// @depreciated
KmsKeyVersion = "kms_key_versions"
}
@@ -167,5 +172,6 @@ export enum IdentityAuthMethod {
KUBERNETES_AUTH = "kubernetes-auth",
GCP_AUTH = "gcp-auth",
AWS_AUTH = "aws-auth",
AZURE_AUTH = "azure-auth"
AZURE_AUTH = "azure-auth",
OIDC_AUTH = "oidc-auth"
}

View File

@@ -15,7 +15,8 @@ export const OrganizationsSchema = z.object({
createdAt: z.date(),
updatedAt: z.date(),
authEnforced: z.boolean().default(false).nullable().optional(),
scimEnabled: z.boolean().default(false).nullable().optional()
scimEnabled: z.boolean().default(false).nullable().optional(),
kmsDefaultKeyId: z.string().uuid().nullable().optional()
});
export type TOrganizations = z.infer<typeof OrganizationsSchema>;

View File

@@ -19,7 +19,8 @@ export const ProjectsSchema = z.object({
upgradeStatus: z.string().nullable().optional(),
pitVersionLimit: z.number().default(10),
kmsCertificateKeyId: z.string().uuid().nullable().optional(),
auditLogsRetentionDays: z.number().nullable().optional()
auditLogsRetentionDays: z.number().nullable().optional(),
kmsSecretManagerKeyId: z.string().uuid().nullable().optional()
});
export type TProjects = z.infer<typeof ProjectsSchema>;

View File

@@ -0,0 +1,190 @@
import { z } from "zod";
import { ExternalKmsSchema, KmsKeysSchema } from "@app/db/schemas";
import {
ExternalKmsAwsSchema,
ExternalKmsInputSchema,
ExternalKmsInputUpdateSchema
} from "@app/ee/services/external-kms/providers/model";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
const sanitizedExternalSchema = KmsKeysSchema.extend({
external: ExternalKmsSchema.pick({
id: true,
status: true,
statusDetails: true,
provider: true
})
});
const sanitizedExternalSchemaForGetById = KmsKeysSchema.extend({
external: ExternalKmsSchema.pick({
id: true,
status: true,
statusDetails: true,
provider: true
}).extend({
providerInput: ExternalKmsAwsSchema
})
});
export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
server.route({
method: "POST",
url: "/",
config: {
rateLimit: writeLimit
},
schema: {
body: z.object({
slug: z.string().min(1).trim().optional(),
description: z.string().min(1).trim().optional(),
provider: ExternalKmsInputSchema
}),
response: {
200: z.object({
externalKms: sanitizedExternalSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const externalKms = await server.services.externalKms.create({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
slug: req.body.slug,
provider: req.body.provider,
description: req.body.description
});
return { externalKms };
}
});
server.route({
method: "PATCH",
url: "/:id",
config: {
rateLimit: writeLimit
},
schema: {
params: z.object({
id: z.string().trim().min(1)
}),
body: z.object({
slug: z.string().min(1).trim().optional(),
description: z.string().min(1).trim().optional(),
provider: ExternalKmsInputUpdateSchema
}),
response: {
200: z.object({
externalKms: sanitizedExternalSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const externalKms = await server.services.externalKms.updateById({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
slug: req.body.slug,
provider: req.body.provider,
description: req.body.description,
id: req.params.id
});
return { externalKms };
}
});
server.route({
method: "DELETE",
url: "/:id",
config: {
rateLimit: writeLimit
},
schema: {
params: z.object({
id: z.string().trim().min(1)
}),
response: {
200: z.object({
externalKms: sanitizedExternalSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const externalKms = await server.services.externalKms.deleteById({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
return { externalKms };
}
});
server.route({
method: "GET",
url: "/:id",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
id: z.string().trim().min(1)
}),
response: {
200: z.object({
externalKms: sanitizedExternalSchemaForGetById
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const externalKms = await server.services.externalKms.findById({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
return { externalKms };
}
});
server.route({
method: "GET",
url: "/slug/:slug",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
slug: z.string().trim().min(1)
}),
response: {
200: z.object({
externalKms: sanitizedExternalSchemaForGetById
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const externalKms = await server.services.externalKms.findBySlug({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
slug: req.params.slug
});
return { externalKms };
}
});
};
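As a usage sketch for the router above (not taken from this changeset), a create request to the POST / route might carry a body shaped like the following; the literal enum strings ("aws", "access-key"), the region, key id, and credentials are assumptions, since ExternalKmsInputSchema is defined in providers/model and not shown here.

// Hypothetical body for the external KMS create route registered above.
// slug and description are optional; provider is validated by ExternalKmsInputSchema.
const createExternalKmsBody = {
  slug: "prod-aws-kms",
  description: "Primary AWS KMS for production",
  provider: {
    type: "aws", // assumed string value of KmsProviders.Aws
    inputs: {
      awsRegion: "us-east-1",
      kmsKeyId: "mrk-example-key-id", // optional; the provider creates a key when omitted
      credential: {
        type: "access-key", // assumed value of KmsAwsCredentialType.AccessKey
        data: { accessKey: "AKIA...", secretKey: "<secret-access-key>" }
      }
    }
  }
};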

View File

@@ -475,10 +475,13 @@ export const registerScimRouter = async (server: FastifyZodProvider) => {
}),
z.object({
op: z.literal("add"),
value: z.object({
value: z.string().trim(),
display: z.string().trim().optional()
})
path: z.string().trim(),
value: z.array(
z.object({
value: z.string().trim(),
display: z.string().trim().optional()
})
)
})
])
)
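For reference, a group PATCH operation matching the revised "add" branch would look roughly like this (a hedged example; the path value and member identifiers are placeholders, not taken from the diff).

// Hypothetical SCIM PATCH /Groups operation accepted by the updated schema.
const scimAddMembersOp = {
  op: "add",
  path: "members", // placeholder path
  value: [
    { value: "user-id-123", display: "Jane Doe" } // display is optional
  ]
};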

View File

@@ -45,6 +45,7 @@ export enum EventType {
CREATE_SECRETS = "create-secrets",
UPDATE_SECRET = "update-secret",
UPDATE_SECRETS = "update-secrets",
MOVE_SECRETS = "move-secrets",
DELETE_SECRET = "delete-secret",
DELETE_SECRETS = "delete-secrets",
GET_WORKSPACE_KEY = "get-workspace-key",
@@ -78,6 +79,11 @@ export enum EventType {
UPDATE_IDENTITY_KUBENETES_AUTH = "update-identity-kubernetes-auth",
GET_IDENTITY_KUBERNETES_AUTH = "get-identity-kubernetes-auth",
REVOKE_IDENTITY_KUBERNETES_AUTH = "revoke-identity-kubernetes-auth",
LOGIN_IDENTITY_OIDC_AUTH = "login-identity-oidc-auth",
ADD_IDENTITY_OIDC_AUTH = "add-identity-oidc-auth",
UPDATE_IDENTITY_OIDC_AUTH = "update-identity-oidc-auth",
GET_IDENTITY_OIDC_AUTH = "get-identity-oidc-auth",
REVOKE_IDENTITY_OIDC_AUTH = "revoke-identity-oidc-auth",
CREATE_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRET = "create-identity-universal-auth-client-secret",
REVOKE_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRET = "revoke-identity-universal-auth-client-secret",
GET_IDENTITY_UNIVERSAL_AUTH_CLIENT_SECRETS = "get-identity-universal-auth-client-secret",
@@ -235,6 +241,17 @@ interface UpdateSecretBatchEvent {
};
}
interface MoveSecretsEvent {
type: EventType.MOVE_SECRETS;
metadata: {
sourceEnvironment: string;
sourceSecretPath: string;
destinationEnvironment: string;
destinationSecretPath: string;
secretIds: string[];
};
}
interface DeleteSecretEvent {
type: EventType.DELETE_SECRET;
metadata: {
@@ -749,6 +766,63 @@ interface GetIdentityAzureAuthEvent {
};
}
interface LoginIdentityOidcAuthEvent {
type: EventType.LOGIN_IDENTITY_OIDC_AUTH;
metadata: {
identityId: string;
identityOidcAuthId: string;
identityAccessTokenId: string;
};
}
interface AddIdentityOidcAuthEvent {
type: EventType.ADD_IDENTITY_OIDC_AUTH;
metadata: {
identityId: string;
oidcDiscoveryUrl: string;
caCert: string;
boundIssuer: string;
boundAudiences: string;
boundClaims: Record<string, string>;
boundSubject: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: Array<TIdentityTrustedIp>;
};
}
interface DeleteIdentityOidcAuthEvent {
type: EventType.REVOKE_IDENTITY_OIDC_AUTH;
metadata: {
identityId: string;
};
}
interface UpdateIdentityOidcAuthEvent {
type: EventType.UPDATE_IDENTITY_OIDC_AUTH;
metadata: {
identityId: string;
oidcDiscoveryUrl?: string;
caCert?: string;
boundIssuer?: string;
boundAudiences?: string;
boundClaims?: Record<string, string>;
boundSubject?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: Array<TIdentityTrustedIp>;
};
}
interface GetIdentityOidcAuthEvent {
type: EventType.GET_IDENTITY_OIDC_AUTH;
metadata: {
identityId: string;
};
}
interface CreateEnvironmentEvent {
type: EventType.CREATE_ENVIRONMENT;
metadata: {
@@ -1097,6 +1171,7 @@ export type Event =
| CreateSecretBatchEvent
| UpdateSecretEvent
| UpdateSecretBatchEvent
| MoveSecretsEvent
| DeleteSecretEvent
| DeleteSecretBatchEvent
| GetWorkspaceKeyEvent
@@ -1149,6 +1224,11 @@ export type Event =
| DeleteIdentityAzureAuthEvent
| UpdateIdentityAzureAuthEvent
| GetIdentityAzureAuthEvent
| LoginIdentityOidcAuthEvent
| AddIdentityOidcAuthEvent
| DeleteIdentityOidcAuthEvent
| UpdateIdentityOidcAuthEvent
| GetIdentityOidcAuthEvent
| CreateEnvironmentEvent
| UpdateEnvironmentEvent
| DeleteEnvironmentEvent

View File

@@ -17,7 +17,7 @@ type TCertificateAuthorityCrlServiceFactoryDep = {
certificateAuthorityDAL: Pick<TCertificateAuthorityDALFactory, "findById">;
certificateAuthorityCrlDAL: Pick<TCertificateAuthorityCrlDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "findOne" | "updateById" | "transaction">;
kmsService: Pick<TKmsServiceFactory, "decrypt" | "generateKmsKey">;
kmsService: Pick<TKmsServiceFactory, "decryptWithKmsKey" | "generateKmsKey">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
};
@@ -68,11 +68,11 @@ export const certificateAuthorityCrlServiceFactory = ({
kmsService
});
const decryptedCrl = await kmsService.decrypt({
kmsId: keyId,
cipherTextBlob: caCrl.encryptedCrl
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: keyId
});
const decryptedCrl = kmsDecryptor({ cipherTextBlob: caCrl.encryptedCrl });
const crl = new x509.X509Crl(decryptedCrl);
const base64crl = crl.toString("base64");
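This hunk reflects the move from a per-call kmsService.decrypt to the higher-order decryptWithKmsKey, in line with the earlier commit that reworked KMS encryption into higher-order functions to avoid back-to-back DB calls: the key is resolved once and the returned decryptor is reused. A minimal sketch of the pattern, with encryptedBlobs as a placeholder collection:

// Hedged sketch: resolve the KMS key once, then decrypt many blobs without repeated lookups.
const kmsDecryptor = await kmsService.decryptWithKmsKey({ kmsId: keyId });
for (const cipherTextBlob of encryptedBlobs) {
  const plainText = kmsDecryptor({ cipherTextBlob }); // synchronous call, per the diff above
}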

View File

@@ -0,0 +1,47 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName, TKmsKeys } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TExternalKmsDALFactory = ReturnType<typeof externalKmsDALFactory>;
export const externalKmsDALFactory = (db: TDbClient) => {
const externalKmsOrm = ormify(db, TableName.ExternalKms);
const find = async (filter: Partial<TKmsKeys>, tx?: Knex) => {
try {
const result = await (tx || db.replicaNode())(TableName.ExternalKms)
.join(TableName.KmsKey, `${TableName.KmsKey}.id`, `${TableName.ExternalKms}.kmsKeyId`)
.where(filter)
.select(selectAllTableCols(TableName.KmsKey))
.select(
db.ref("id").withSchema(TableName.ExternalKms).as("externalKmsId"),
db.ref("provider").withSchema(TableName.ExternalKms).as("externalKmsProvider"),
db.ref("encryptedProviderInputs").withSchema(TableName.ExternalKms).as("externalKmsEncryptedProviderInput"),
db.ref("status").withSchema(TableName.ExternalKms).as("externalKmsStatus"),
db.ref("statusDetails").withSchema(TableName.ExternalKms).as("externalKmsStatusDetails")
);
return result.map((el) => ({
id: el.id,
description: el.description,
isDisabled: el.isDisabled,
isReserved: el.isReserved,
orgId: el.orgId,
slug: el.slug,
externalKms: {
id: el.externalKmsId,
provider: el.externalKmsProvider,
status: el.externalKmsStatus,
statusDetails: el.externalKmsStatusDetails
}
}));
} catch (error) {
throw new DatabaseError({ error, name: "Find" });
}
};
return { ...externalKmsOrm, find };
};

View File

@@ -0,0 +1,309 @@
import { ForbiddenError } from "@casl/ability";
import slugify from "@sindresorhus/slugify";
import { BadRequestError } from "@app/lib/errors";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TKmsKeyDALFactory } from "@app/services/kms/kms-key-dal";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "../permission/org-permission";
import { TPermissionServiceFactory } from "../permission/permission-service";
import { TExternalKmsDALFactory } from "./external-kms-dal";
import {
TCreateExternalKmsDTO,
TDeleteExternalKmsDTO,
TGetExternalKmsByIdDTO,
TGetExternalKmsBySlugDTO,
TListExternalKmsDTO,
TUpdateExternalKmsDTO
} from "./external-kms-types";
import { AwsKmsProviderFactory } from "./providers/aws-kms";
import { ExternalKmsAwsSchema, KmsProviders } from "./providers/model";
type TExternalKmsServiceFactoryDep = {
externalKmsDAL: TExternalKmsDALFactory;
kmsService: Pick<TKmsServiceFactory, "getOrgKmsKeyId" | "encryptWithKmsKey" | "decryptWithKmsKey">;
kmsDAL: Pick<TKmsKeyDALFactory, "create" | "updateById" | "findById" | "deleteById" | "findOne">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
};
export type TExternalKmsServiceFactory = ReturnType<typeof externalKmsServiceFactory>;
export const externalKmsServiceFactory = ({
externalKmsDAL,
permissionService,
kmsService,
kmsDAL
}: TExternalKmsServiceFactoryDep) => {
const create = async ({
provider,
description,
actor,
slug,
actorId,
actorOrgId,
actorAuthMethod
}: TCreateExternalKmsDTO) => {
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const kmsSlug = slug ? slugify(slug) : slugify(alphaNumericNanoId(32));
let sanitizedProviderInput = "";
switch (provider.type) {
case KmsProviders.Aws:
{
const externalKms = await AwsKmsProviderFactory({ inputs: provider.inputs });
await externalKms.validateConnection();
// if missing kms key this generate a new kms key id and returns new provider input
const newProviderInput = await externalKms.generateInputKmsKey();
sanitizedProviderInput = JSON.stringify(newProviderInput);
}
break;
default:
throw new BadRequestError({ message: "external kms provided is invalid" });
}
const orgKmsKeyId = await kmsService.getOrgKmsKeyId(actorOrgId);
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: orgKmsKeyId
});
const { cipherTextBlob: encryptedProviderInputs } = kmsEncryptor({
plainText: Buffer.from(sanitizedProviderInput, "utf8")
});
const externalKms = await externalKmsDAL.transaction(async (tx) => {
const kms = await kmsDAL.create(
{
isReserved: false,
description,
slug: kmsSlug,
orgId: actorOrgId
},
tx
);
const externalKmsCfg = await externalKmsDAL.create(
{
provider: provider.type,
encryptedProviderInputs,
kmsKeyId: kms.id
},
tx
);
return { ...kms, external: externalKmsCfg };
});
return externalKms;
};
const updateById = async ({
provider,
description,
actor,
id: kmsId,
slug,
actorId,
actorOrgId,
actorAuthMethod
}: TUpdateExternalKmsDTO) => {
const kmsDoc = await kmsDAL.findById(kmsId);
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
kmsDoc.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const kmsSlug = slug ? slugify(slug) : undefined;
const externalKmsDoc = await externalKmsDAL.findOne({ kmsKeyId: kmsDoc.id });
if (!externalKmsDoc) throw new BadRequestError({ message: "External kms not found" });
const orgDefaultKmsId = await kmsService.getOrgKmsKeyId(kmsDoc.orgId);
let sanitizedProviderInput = "";
if (provider) {
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: orgDefaultKmsId
});
const decryptedProviderInputBlob = kmsDecryptor({
cipherTextBlob: externalKmsDoc.encryptedProviderInputs
});
switch (provider.type) {
case KmsProviders.Aws:
{
const decryptedProviderInput = await ExternalKmsAwsSchema.parseAsync(
JSON.parse(decryptedProviderInputBlob.toString("utf8"))
);
const updatedProviderInput = { ...decryptedProviderInput, ...provider.inputs };
const externalKms = await AwsKmsProviderFactory({ inputs: updatedProviderInput });
await externalKms.validateConnection();
sanitizedProviderInput = JSON.stringify(updatedProviderInput);
}
break;
default:
throw new BadRequestError({ message: "external kms provided is invalid" });
}
}
let encryptedProviderInputs: Buffer | undefined;
if (sanitizedProviderInput) {
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: orgDefaultKmsId
});
const { cipherTextBlob } = kmsEncryptor({
plainText: Buffer.from(sanitizedProviderInput, "utf8")
});
encryptedProviderInputs = cipherTextBlob;
}
const externalKms = await externalKmsDAL.transaction(async (tx) => {
const kms = await kmsDAL.updateById(
kmsDoc.id,
{
description,
slug: kmsSlug
},
tx
);
if (encryptedProviderInputs) {
const externalKmsCfg = await externalKmsDAL.updateById(
externalKmsDoc.id,
{
encryptedProviderInputs
},
tx
);
return { ...kms, external: externalKmsCfg };
}
return { ...kms, external: externalKmsDoc };
});
return externalKms;
};
const deleteById = async ({ actor, id: kmsId, actorId, actorOrgId, actorAuthMethod }: TDeleteExternalKmsDTO) => {
const kmsDoc = await kmsDAL.findById(kmsId);
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
kmsDoc.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const externalKmsDoc = await externalKmsDAL.findOne({ kmsKeyId: kmsDoc.id });
if (!externalKmsDoc) throw new BadRequestError({ message: "External kms not found" });
const externalKms = await externalKmsDAL.transaction(async (tx) => {
const kms = await kmsDAL.deleteById(kmsDoc.id, tx);
return { ...kms, external: externalKmsDoc };
});
return externalKms;
};
const list = async ({ actor, actorId, actorOrgId, actorAuthMethod }: TListExternalKmsDTO) => {
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const externalKmsDocs = await externalKmsDAL.find({ orgId: actorOrgId });
return externalKmsDocs;
};
const findById = async ({ actor, actorId, actorOrgId, actorAuthMethod, id: kmsId }: TGetExternalKmsByIdDTO) => {
const kmsDoc = await kmsDAL.findById(kmsId);
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
kmsDoc.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const externalKmsDoc = await externalKmsDAL.findOne({ kmsKeyId: kmsDoc.id });
if (!externalKmsDoc) throw new BadRequestError({ message: "External kms not found" });
const orgDefaultKmsId = await kmsService.getOrgKmsKeyId(kmsDoc.orgId);
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: orgDefaultKmsId
});
const decryptedProviderInputBlob = kmsDecryptor({
cipherTextBlob: externalKmsDoc.encryptedProviderInputs
});
switch (externalKmsDoc.provider) {
case KmsProviders.Aws: {
const decryptedProviderInput = await ExternalKmsAwsSchema.parseAsync(
JSON.parse(decryptedProviderInputBlob.toString("utf8"))
);
return { ...kmsDoc, external: { ...externalKmsDoc, providerInput: decryptedProviderInput } };
}
default:
throw new BadRequestError({ message: "external kms provided is invalid" });
}
};
const findBySlug = async ({
actor,
actorId,
actorOrgId,
actorAuthMethod,
slug: kmsSlug
}: TGetExternalKmsBySlugDTO) => {
const kmsDoc = await kmsDAL.findOne({ slug: kmsSlug, orgId: actorOrgId });
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
kmsDoc.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const externalKmsDoc = await externalKmsDAL.findOne({ kmsKeyId: kmsDoc.id });
if (!externalKmsDoc) throw new BadRequestError({ message: "External kms not found" });
const orgDefaultKmsId = await kmsService.getOrgKmsKeyId(kmsDoc.orgId);
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: orgDefaultKmsId
});
const decryptedProviderInputBlob = kmsDecryptor({
cipherTextBlob: externalKmsDoc.encryptedProviderInputs
});
switch (externalKmsDoc.provider) {
case KmsProviders.Aws: {
const decryptedProviderInput = await ExternalKmsAwsSchema.parseAsync(
JSON.parse(decryptedProviderInputBlob.toString("utf8"))
);
return { ...kmsDoc, external: { ...externalKmsDoc, providerInput: decryptedProviderInput } };
}
default:
throw new BadRequestError({ message: "external kms provided is invalid" });
}
};
return {
create,
updateById,
deleteById,
list,
findById,
findBySlug
};
};

View File

@@ -0,0 +1,30 @@
import { TOrgPermission } from "@app/lib/types";
import { TExternalKmsInputSchema, TExternalKmsInputUpdateSchema } from "./providers/model";
export type TCreateExternalKmsDTO = {
slug?: string;
description?: string;
provider: TExternalKmsInputSchema;
} & Omit<TOrgPermission, "orgId">;
export type TUpdateExternalKmsDTO = {
id: string;
slug?: string;
description?: string;
provider?: TExternalKmsInputUpdateSchema;
} & Omit<TOrgPermission, "orgId">;
export type TDeleteExternalKmsDTO = {
id: string;
} & Omit<TOrgPermission, "orgId">;
export type TListExternalKmsDTO = Omit<TOrgPermission, "orgId">;
export type TGetExternalKmsByIdDTO = {
id: string;
} & Omit<TOrgPermission, "orgId">;
export type TGetExternalKmsBySlugDTO = {
slug: string;
} & Omit<TOrgPermission, "orgId">;

View File

@ -0,0 +1,102 @@
import { CreateKeyCommand, DecryptCommand, DescribeKeyCommand, EncryptCommand, KMSClient } from "@aws-sdk/client-kms";
import { AssumeRoleCommand, STSClient } from "@aws-sdk/client-sts";
import { randomUUID } from "crypto";
import { ExternalKmsAwsSchema, KmsAwsCredentialType, TExternalKmsAwsSchema, TExternalKmsProviderFns } from "./model";
const getAwsKmsClient = async (providerInputs: TExternalKmsAwsSchema) => {
if (providerInputs.credential.type === KmsAwsCredentialType.AssumeRole) {
const awsCredential = providerInputs.credential.data;
const stsClient = new STSClient({
region: providerInputs.awsRegion
});
const command = new AssumeRoleCommand({
RoleArn: awsCredential.assumeRoleArn,
RoleSessionName: `infisical-kms-${randomUUID()}`,
DurationSeconds: 900, // 15mins
ExternalId: awsCredential.externalId
});
const response = await stsClient.send(command);
if (!response.Credentials?.AccessKeyId || !response.Credentials?.SecretAccessKey)
throw new Error("Failed to assume role");
const kmsClient = new KMSClient({
region: providerInputs.awsRegion,
credentials: {
accessKeyId: response.Credentials.AccessKeyId,
secretAccessKey: response.Credentials.SecretAccessKey,
sessionToken: response.Credentials.SessionToken,
expiration: response.Credentials.Expiration
}
});
return kmsClient;
}
const awsCredential = providerInputs.credential.data;
const kmsClient = new KMSClient({
region: providerInputs.awsRegion,
credentials: {
accessKeyId: awsCredential.accessKey,
secretAccessKey: awsCredential.secretKey
}
});
return kmsClient;
};
type AwsKmsProviderArgs = {
inputs: unknown;
};
type TAwsKmsProviderFactoryReturn = TExternalKmsProviderFns & {
generateInputKmsKey: () => Promise<TExternalKmsAwsSchema>;
};
export const AwsKmsProviderFactory = async ({ inputs }: AwsKmsProviderArgs): Promise<TAwsKmsProviderFactoryReturn> => {
const providerInputs = await ExternalKmsAwsSchema.parseAsync(inputs);
const awsClient = await getAwsKmsClient(providerInputs);
const generateInputKmsKey = async () => {
if (providerInputs.kmsKeyId) return providerInputs;
const command = new CreateKeyCommand({ Tags: [{ TagKey: "author", TagValue: "infisical" }] });
const kmsKey = await awsClient.send(command);
if (!kmsKey.KeyMetadata?.KeyId) throw new Error("Failed to generate kms key");
return { ...providerInputs, kmsKeyId: kmsKey.KeyMetadata?.KeyId };
};
const validateConnection = async () => {
const command = new DescribeKeyCommand({
KeyId: providerInputs.kmsKeyId
});
const isConnected = await awsClient.send(command).then(() => true);
return isConnected;
};
const encrypt = async (data: Buffer) => {
const command = new EncryptCommand({
KeyId: providerInputs.kmsKeyId,
Plaintext: data
});
const encryptionCommand = await awsClient.send(command);
if (!encryptionCommand.CiphertextBlob) throw new Error("encryption failed");
return { encryptedBlob: Buffer.from(encryptionCommand.CiphertextBlob) };
};
const decrypt = async (encryptedBlob: Buffer) => {
const command = new DecryptCommand({
KeyId: providerInputs.kmsKeyId,
CiphertextBlob: encryptedBlob
});
const decryptionCommand = await awsClient.send(command);
if (!decryptionCommand.Plaintext) throw new Error("decryption failed");
return { data: Buffer.from(decryptionCommand.Plaintext) };
};
return {
generateInputKmsKey,
validateConnection,
encrypt,
decrypt
};
};
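A minimal sketch of exercising the factory above with an access-key credential; the import path, region, key ID, and credential values are placeholders rather than values taken from this compare.
import { AwsKmsProviderFactory } from "./aws-kms"; // import path is an assumption

const demo = async () => {
  const provider = await AwsKmsProviderFactory({
    inputs: {
      awsRegion: "us-east-1",
      credential: {
        type: "access-key",
        data: { accessKey: "AKIA...", secretKey: "..." } // placeholder credentials
      },
      kmsKeyId: "placeholder-kms-key-id"
    }
  });
  await provider.validateConnection();
  const { encryptedBlob } = await provider.encrypt(Buffer.from("hello"));
  const { data } = await provider.decrypt(encryptedBlob);
  return data.toString("utf8"); // "hello"
};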

View File

@ -0,0 +1,61 @@
import { z } from "zod";
export enum KmsProviders {
Aws = "aws"
}
export enum KmsAwsCredentialType {
AssumeRole = "assume-role",
AccessKey = "access-key"
}
export const ExternalKmsAwsSchema = z.object({
credential: z
.discriminatedUnion("type", [
z.object({
type: z.literal(KmsAwsCredentialType.AccessKey),
data: z.object({
accessKey: z.string().trim().min(1).describe("AWS user account access key"),
secretKey: z.string().trim().min(1).describe("AWS user account secret key")
})
}),
z.object({
type: z.literal(KmsAwsCredentialType.AssumeRole),
data: z.object({
assumeRoleArn: z.string().trim().min(1).describe("ARN of the AWS role to be assumed by Infisical"),
externalId: z
.string()
.trim()
.min(1)
.optional()
.describe("AWS assume role external id for furthur security in authentication")
})
})
])
.describe("AWS credential information to connect"),
awsRegion: z.string().min(1).trim().describe("AWS region to connect to"),
kmsKeyId: z
.string()
.trim()
.optional()
.describe("A pre-existing AWS KMS key ID to be used for encryption. If not provided, a KMS key will be generated.")
});
export type TExternalKmsAwsSchema = z.infer<typeof ExternalKmsAwsSchema>;
// The root schema of the JSON
export const ExternalKmsInputSchema = z.discriminatedUnion("type", [
z.object({ type: z.literal(KmsProviders.Aws), inputs: ExternalKmsAwsSchema })
]);
export type TExternalKmsInputSchema = z.infer<typeof ExternalKmsInputSchema>;
export const ExternalKmsInputUpdateSchema = z.discriminatedUnion("type", [
z.object({ type: z.literal(KmsProviders.Aws), inputs: ExternalKmsAwsSchema.partial() })
]);
export type TExternalKmsInputUpdateSchema = z.infer<typeof ExternalKmsInputUpdateSchema>;
// generic functions shared by all providers
export type TExternalKmsProviderFns = {
validateConnection: () => Promise<boolean>;
encrypt: (data: Buffer) => Promise<{ encryptedBlob: Buffer }>;
decrypt: (encryptedBlob: Buffer) => Promise<{ data: Buffer }>;
};
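A small sketch of validating raw input against the discriminated union above; the import path, role ARN, and external ID are illustrative values only.
import { ExternalKmsInputSchema } from "./model"; // import path is an assumption

const parseExample = async () => {
  const parsed = await ExternalKmsInputSchema.parseAsync({
    type: "aws",
    inputs: {
      awsRegion: "eu-central-1",
      credential: {
        type: "assume-role",
        data: {
          assumeRoleArn: "arn:aws:iam::123456789012:role/example", // placeholder ARN
          externalId: "example-external-id" // placeholder
        }
      }
    }
  });
  // parsed.type is narrowed to "aws" and parsed.inputs matches TExternalKmsAwsSchema
  return parsed;
};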

View File

@ -38,7 +38,8 @@ export const getDefaultOnPremFeatures = (): TFeatureSet => ({
has_used_trial: true,
secretApproval: false,
secretRotation: true,
caCrl: false
caCrl: false,
instanceUserManagement: false
});
export const setupLicenceRequestWithStore = (baseURL: string, refreshUrl: string, licenseKey: string) => {

View File

@ -218,6 +218,8 @@ export const licenseServiceFactory = ({
} else if (instanceType === InstanceType.EnterpriseOnPrem) {
const usedSeats = await licenseDAL.countOfOrgMembers(null, tx);
const usedIdentitySeats = await licenseDAL.countOrgUsersAndIdentities(null, tx);
onPremFeatures.membersUsed = usedSeats;
onPremFeatures.identitiesUsed = usedIdentitySeats;
await licenseServerOnPremApi.request.patch(`/api/license/v1/license`, {
usedSeats,
usedIdentitySeats

View File

@ -30,9 +30,9 @@ export type TFeatureSet = {
workspacesUsed: 0;
dynamicSecret: false;
memberLimit: null;
membersUsed: 0;
membersUsed: number;
identityLimit: null;
identitiesUsed: 0;
identitiesUsed: number;
environmentLimit: null;
environmentsUsed: 0;
secretVersioning: true;
@ -56,6 +56,7 @@ export type TFeatureSet = {
secretApproval: false;
secretRotation: true;
caCrl: false;
instanceUserManagement: false;
};
export type TOrgPlansTableDTO = {

View File

@ -109,6 +109,9 @@ export const permissionServiceFactory = ({
authMethod: ActorAuthMethod,
userOrgId?: string
) => {
// when the token is scoped, ensure the passed org ID matches the user's org ID
if (userOrgId && userOrgId !== orgId)
throw new BadRequestError({ message: "Invalid user token. Scoped to different organization." });
const membership = await permissionDAL.getOrgPermission(userId, orgId);
if (!membership) throw new UnauthorizedError({ name: "User not in org" });
if (membership.role === OrgMembershipRole.Custom && !membership.permissions) {

View File

@ -2,7 +2,7 @@ import { ForbiddenError } from "@casl/ability";
import slugify from "@sindresorhus/slugify";
import jwt from "jsonwebtoken";
import { OrgMembershipRole, OrgMembershipStatus, TableName, TGroups, TOrgMemberships, TUsers } from "@app/db/schemas";
import { OrgMembershipRole, OrgMembershipStatus, TableName, TOrgMemberships, TUsers } from "@app/db/schemas";
import { TGroupDALFactory } from "@app/ee/services/group/group-dal";
import { addUsersToGroupByUserIds, removeUsersFromGroupByUserIds } from "@app/ee/services/group/group-fns";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
@ -66,7 +66,7 @@ type TScimServiceFactoryDep = {
projectMembershipDAL: Pick<TProjectMembershipDALFactory, "find" | "delete" | "findProjectMembershipsByUserId">;
groupDAL: Pick<
TGroupDALFactory,
"create" | "findOne" | "findAllGroupMembers" | "update" | "delete" | "findGroups" | "transaction"
"create" | "findOne" | "findAllGroupMembers" | "delete" | "findGroups" | "transaction" | "updateById" | "update"
>;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
userGroupMembershipDAL: Pick<
@ -817,7 +817,6 @@ export const scimServiceFactory = ({
});
};
// TODO: add support for add/remove op
const updateScimGroupNamePatch = async ({ groupId, orgId, operations }: TUpdateScimGroupNamePatchDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
@ -840,23 +839,45 @@ export const scimServiceFactory = ({
status: 403
});
let group: TGroups | undefined;
let group = await groupDAL.findOne({
id: groupId,
orgId
});
if (!group) {
throw new ScimRequestError({
detail: "Group Not Found",
status: 404
});
}
for await (const operation of operations) {
switch (operation.op) {
case "replace": {
await groupDAL.update(
{
id: groupId,
orgId
},
{
name: operation.value.displayName
}
);
group = await groupDAL.updateById(group.id, {
name: operation.value.displayName
});
break;
}
case "add": {
// TODO
const orgMemberships = await orgMembershipDAL.find({
$in: {
id: operation.value.map((member) => member.value)
}
});
await addUsersToGroupByUserIds({
group,
userIds: orgMemberships.map((membership) => membership.userId as string),
userDAL,
userGroupMembershipDAL,
orgDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL
});
break;
}
case "remove": {
@ -872,13 +893,6 @@ export const scimServiceFactory = ({
}
}
if (!group) {
throw new ScimRequestError({
detail: "Group Not Found",
status: 404
});
}
return buildScimGroup({
groupId: group.id,
name: group.name,

View File

@ -125,10 +125,11 @@ type TRemoveOp = {
type TAddOp = {
op: "add";
path: string;
value: {
value: string;
display?: string;
};
}[];
};
export type TDeleteScimGroupDTO = {

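For reference, a hedged illustration of an "add" operation conforming to the corrected shape above; the "members" path and the IDs are placeholders, with each value entry mapping to an org membership ID as consumed by the SCIM service earlier in this compare.
const addMembersOperation = {
  op: "add" as const,
  path: "members", // assumed SCIM path
  value: [
    { value: "org-membership-id-1", display: "user-one@example.com" }, // placeholder IDs
    { value: "org-membership-id-2" }
  ]
};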
View File

@ -70,13 +70,13 @@ export const UNIVERSAL_AUTH = {
"The maximum number of times that an access token can be used; a value of 0 implies infinite number of uses."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve."
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke."
identityId: "The ID of the identity to revoke the auth method for."
},
UPDATE: {
identityId: "The ID of the identity to update.",
identityId: "The ID of the identity to update the auth method for.",
clientSecretTrustedIps: "The new list of IPs or CIDR ranges that the Client Secret can be used from.",
accessTokenTrustedIps: "The new list of IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The new lifetime for an access token in seconds.",
@ -119,26 +119,228 @@ export const AWS_AUTH = {
"The base64-encoded body of the signed request. Most likely, the base64-encoding of Action=GetCallerIdentity&Version=2011-06-15.",
iamRequestHeaders: "The base64-encoded headers of the sts:GetCallerIdentity signed request."
},
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
allowedPrincipalArns:
"The comma-separated list of trusted IAM principal ARNs that are allowed to authenticate with Infisical.",
allowedAccountIds:
"The comma-separated list of trusted AWS account IDs that are allowed to authenticate with Infisical.",
accessTokenTTL: "The lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an acccess token in seconds.",
stsEndpoint: "The endpoint URL for the AWS STS API.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
allowedPrincipalArns:
"The new comma-separated list of trusted IAM principal ARNs that are allowed to authenticate with Infisical.",
allowedAccountIds:
"The new comma-separated list of trusted AWS account IDs that are allowed to authenticate with Infisical.",
accessTokenTTL: "The new lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an acccess token in seconds.",
stsEndpoint: "The new endpoint URL for the AWS STS API.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke."
identityId: "The ID of the identity to revoke the auth method for."
}
} as const;
export const AZURE_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login."
},
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
tenantId: "The tenant ID for the Azure AD organization.",
resource: "The resource URL for the application registered in Azure AD.",
allowedServicePrincipalIds:
"The comma-separated list of Azure AD service principal IDs that are allowed to authenticate with Infisical.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
tenantId: "The new tenant ID for the Azure AD organization.",
resource: "The new resource URL for the application registered in Azure AD.",
allowedServicePrincipalIds:
"The new comma-separated list of Azure AD service principal IDs that are allowed to authenticate with Infisical.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The new lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke."
identityId: "The ID of the identity to revoke the auth method for."
}
} as const;
export const GCP_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login."
},
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
allowedServiceAccounts:
"The comma-separated list of trusted service account emails corresponding to the GCE resource(s) allowed to authenticate with Infisical.",
allowedProjects:
"The comma-separated list of trusted GCP projects that the GCE instance must belong to authenticate with Infisical.",
allowedZones:
"The comma-separated list of trusted zones that the GCE instances must belong to authenticate with Infisical.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
allowedServiceAccounts:
"The new comma-separated list of trusted service account emails corresponding to the GCE resource(s) allowed to authenticate with Infisical.",
allowedProjects:
"The new comma-separated list of trusted GCP projects that the GCE instance must belong to authenticate with Infisical.",
allowedZones:
"The new comma-separated list of trusted zones that the GCE instances must belong to authenticate with Infisical.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The new lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke."
identityId: "The ID of the identity to revoke the auth method for."
}
} as const;
export const KUBERNETES_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login."
},
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
kubernetesHost: "The host string, host:port pair, or URL to the base of the Kubernetes API server.",
caCert: "The PEM-encoded CA cert for the Kubernetes API server.",
tokenReviewerJwt:
"The long-lived service account JWT token for Infisical to access the TokenReview API to validate other service account JWT tokens submitted by applications/pods.",
allowedNamespaces:
"The comma-separated list of trusted namespaces that service accounts must belong to authenticate with Infisical.",
allowedNames: "The comma-separated list of trusted service account names that can authenticate with Infisical.",
allowedAudience:
"The optional audience claim that the service account JWT token must have to authenticate with Infisical.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
kubernetesHost: "The new host string, host:port pair, or URL to the base of the Kubernetes API server.",
caCert: "The new PEM-encoded CA cert for the Kubernetes API server.",
tokenReviewerJwt:
"The new long-lived service account JWT token for Infisical to access the TokenReview API to validate other service account JWT tokens submitted by applications/pods.",
allowedNamespaces:
"The new comma-separated list of trusted namespaces that service accounts must belong to authenticate with Infisical.",
allowedNames: "The new comma-separated list of trusted service account names that can authenticate with Infisical.",
allowedAudience:
"The new optional audience claim that the service account JWT token must have to authenticate with Infisical.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The new lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke."
identityId: "The ID of the identity to revoke the auth method for."
}
} as const;
export const TOKEN_AUTH = {
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The new lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke the auth method for."
},
GET_TOKENS: {
identityId: "The ID of the identity to list token metadata for.",
offset: "The offset to start from. If you enter 10, it will start from the 10th token.",
limit: "The number of tokens to return"
},
CREATE_TOKEN: {
identityId: "The ID of the identity to create the token for.",
name: "The name of the token to create"
},
UPDATE_TOKEN: {
tokenId: "The ID of the token to update metadata for",
name: "The name of the token to update to"
},
REVOKE_TOKEN: {
tokenId: "The ID of the token to revoke"
}
} as const;
export const OIDC_AUTH = {
LOGIN: {
identityId: "The ID of the identity to login."
},
ATTACH: {
identityId: "The ID of the identity to attach the configuration onto.",
oidcDiscoveryUrl: "The URL used to retrieve the OpenID Connect configuration from the identity provider.",
caCert: "The PEM-encoded CA cert for establishing secure communication with the Identity Provider endpoints.",
boundIssuer: "The unique identifier of the identity provider issuing the JWT.",
boundAudiences: "The list of intended recipients.",
boundClaims: "The attributes that should be present in the JWT for it to be valid.",
boundSubject: "The expected principal that is the subject of the JWT.",
accessTokenTrustedIps: "The IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The maximum number of times that an access token can be used."
},
UPDATE: {
identityId: "The ID of the identity to update the auth method for.",
oidcDiscoveryUrl: "The new URL used to retrieve the OpenID Connect configuration from the identity provider.",
caCert: "The new PEM-encoded CA cert for establishing secure communication with the Identity Provider endpoints.",
boundIssuer: "The new unique identifier of the identity provider issuing the JWT.",
boundAudiences: "The new list of intended recipients.",
boundClaims: "The new attributes that should be present in the JWT for it to be valid.",
boundSubject: "The new expected principal that is the subject of the JWT.",
accessTokenTrustedIps: "The new IPs or CIDR ranges that access tokens can be used from.",
accessTokenTTL: "The new lifetime for an acccess token in seconds.",
accessTokenMaxTTL: "The new maximum lifetime for an acccess token in seconds.",
accessTokenNumUsesLimit: "The new maximum number of times that an access token can be used."
},
RETRIEVE: {
identityId: "The ID of the identity to retrieve the auth method for."
},
REVOKE: {
identityId: "The ID of the identity to revoke the auth method for."
}
} as const;
@ -313,6 +515,9 @@ export const FOLDERS = {
path: "The path to list folders from.",
directory: "The directory to list folders from. (Deprecated in favor of path)"
},
GET_BY_ID: {
folderId: "The id of the folder to get details."
},
CREATE: {
workspaceId: "The ID of the project to create the folder in.",
environment: "The slug of the environment to create the folder in.",

View File

@ -22,6 +22,8 @@ import { buildDynamicSecretProviders } from "@app/ee/services/dynamic-secret/pro
import { dynamicSecretLeaseDALFactory } from "@app/ee/services/dynamic-secret-lease/dynamic-secret-lease-dal";
import { dynamicSecretLeaseQueueServiceFactory } from "@app/ee/services/dynamic-secret-lease/dynamic-secret-lease-queue";
import { dynamicSecretLeaseServiceFactory } from "@app/ee/services/dynamic-secret-lease/dynamic-secret-lease-service";
import { externalKmsDALFactory } from "@app/ee/services/external-kms/external-kms-dal";
import { externalKmsServiceFactory } from "@app/ee/services/external-kms/external-kms-service";
import { groupDALFactory } from "@app/ee/services/group/group-dal";
import { groupServiceFactory } from "@app/ee/services/group/group-service";
import { userGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
@ -102,6 +104,8 @@ import { identityGcpAuthDALFactory } from "@app/services/identity-gcp-auth/ident
import { identityGcpAuthServiceFactory } from "@app/services/identity-gcp-auth/identity-gcp-auth-service";
import { identityKubernetesAuthDALFactory } from "@app/services/identity-kubernetes-auth/identity-kubernetes-auth-dal";
import { identityKubernetesAuthServiceFactory } from "@app/services/identity-kubernetes-auth/identity-kubernetes-auth-service";
import { identityOidcAuthDALFactory } from "@app/services/identity-oidc-auth/identity-oidc-auth-dal";
import { identityOidcAuthServiceFactory } from "@app/services/identity-oidc-auth/identity-oidc-auth-service";
import { identityProjectDALFactory } from "@app/services/identity-project/identity-project-dal";
import { identityProjectMembershipRoleDALFactory } from "@app/services/identity-project/identity-project-membership-role-dal";
import { identityProjectServiceFactory } from "@app/services/identity-project/identity-project-service";
@ -114,7 +118,8 @@ import { integrationDALFactory } from "@app/services/integration/integration-dal
import { integrationServiceFactory } from "@app/services/integration/integration-service";
import { integrationAuthDALFactory } from "@app/services/integration-auth/integration-auth-dal";
import { integrationAuthServiceFactory } from "@app/services/integration-auth/integration-auth-service";
import { kmsDALFactory } from "@app/services/kms/kms-dal";
import { internalKmsDALFactory } from "@app/services/kms/internal-kms-dal";
import { kmskeyDALFactory } from "@app/services/kms/kms-key-dal";
import { kmsRootConfigDALFactory } from "@app/services/kms/kms-root-config-dal";
import { kmsServiceFactory } from "@app/services/kms/kms-service";
import { incidentContactDALFactory } from "@app/services/org/incident-contacts-dal";
@ -242,6 +247,7 @@ export const registerRoutes = async (
const identityUaClientSecretDAL = identityUaClientSecretDALFactory(db);
const identityAwsAuthDAL = identityAwsAuthDALFactory(db);
const identityGcpAuthDAL = identityGcpAuthDALFactory(db);
const identityOidcAuthDAL = identityOidcAuthDALFactory(db);
const identityAzureAuthDAL = identityAzureAuthDALFactory(db);
const auditLogDAL = auditLogDALFactory(db);
@ -285,7 +291,9 @@ export const registerRoutes = async (
const dynamicSecretDAL = dynamicSecretDALFactory(db);
const dynamicSecretLeaseDAL = dynamicSecretLeaseDALFactory(db);
const kmsDAL = kmsDALFactory(db);
const kmsDAL = kmskeyDALFactory(db);
const internalKmsDAL = internalKmsDALFactory(db);
const externalKmsDAL = externalKmsDALFactory(db);
const kmsRootConfigDAL = kmsRootConfigDALFactory(db);
const permissionService = permissionServiceFactory({
@ -299,7 +307,16 @@ export const registerRoutes = async (
const kmsService = kmsServiceFactory({
kmsRootConfigDAL,
keyStore,
kmsDAL
kmsDAL,
internalKmsDAL,
orgDAL,
projectDAL
});
const externalKmsService = externalKmsServiceFactory({
kmsDAL,
kmsService,
permissionService,
externalKmsDAL
});
const trustedIpService = trustedIpServiceFactory({
@ -466,7 +483,8 @@ export const registerRoutes = async (
authService: loginService,
serverCfgDAL: superAdminDAL,
orgService,
keyStore
keyStore,
licenseService
});
const rateLimitService = rateLimitServiceFactory({
rateLimitDAL,
@ -641,7 +659,8 @@ export const registerRoutes = async (
const webhookService = webhookServiceFactory({
permissionService,
webhookDAL,
projectEnvDAL
projectEnvDAL,
projectDAL
});
const secretTagService = secretTagServiceFactory({ secretTagDAL, permissionService });
@ -709,7 +728,10 @@ export const registerRoutes = async (
secretQueueService,
secretImportDAL,
projectEnvDAL,
projectBotService
projectBotService,
secretApprovalPolicyService,
secretApprovalRequestDAL,
secretApprovalRequestSecretDAL
});
const secretSharingService = secretSharingServiceFactory({
@ -885,6 +907,16 @@ export const registerRoutes = async (
licenseService
});
const identityOidcAuthService = identityOidcAuthServiceFactory({
identityOidcAuthDAL,
identityOrgMembershipDAL,
identityAccessTokenDAL,
identityDAL,
permissionService,
licenseService,
orgBotDAL
});
const dynamicSecretProviders = buildDynamicSecretProviders();
const dynamicSecretQueueService = dynamicSecretLeaseQueueServiceFactory({
queueService,
@ -988,6 +1020,7 @@ export const registerRoutes = async (
identityGcpAuth: identityGcpAuthService,
identityAwsAuth: identityAwsAuthService,
identityAzureAuth: identityAzureAuthService,
identityOidcAuth: identityOidcAuthService,
accessApprovalPolicy: accessApprovalPolicyService,
accessApprovalRequest: accessApprovalRequestService,
secretApprovalPolicy: secretApprovalPolicyService,
@ -1012,7 +1045,8 @@ export const registerRoutes = async (
projectUserAdditionalPrivilege: projectUserAdditionalPrivilegeService,
identityProjectAdditionalPrivilege: identityProjectAdditionalPrivilegeService,
secretSharing: secretSharingService,
userEngagement: userEngagementService
userEngagement: userEngagementService,
externalKms: externalKmsService
});
const cronJobs: CronJob[] = [];

View File

@ -77,35 +77,45 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(AWS_AUTH.ATTACH.identityId)
}),
body: z.object({
stsEndpoint: z.string().trim().min(1).default("https://sts.amazonaws.com/"),
allowedPrincipalArns: validatePrincipalArns,
allowedAccountIds: validateAccountIds,
stsEndpoint: z
.string()
.trim()
.min(1)
.default("https://sts.amazonaws.com/")
.describe(AWS_AUTH.ATTACH.stsEndpoint),
allowedPrincipalArns: validatePrincipalArns.describe(AWS_AUTH.ATTACH.allowedPrincipalArns),
allowedAccountIds: validateAccountIds.describe(AWS_AUTH.ATTACH.allowedAccountIds),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(AWS_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
.default(2592000)
.describe(AWS_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
.default(2592000)
.describe(AWS_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(AWS_AUTH.ATTACH.accessTokenNumUsesLimit)
}),
response: {
200: z.object({
@ -160,28 +170,31 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(AWS_AUTH.UPDATE.identityId)
}),
body: z.object({
stsEndpoint: z.string().trim().min(1).optional(),
allowedPrincipalArns: validatePrincipalArns,
allowedAccountIds: validateAccountIds,
stsEndpoint: z.string().trim().min(1).optional().describe(AWS_AUTH.UPDATE.stsEndpoint),
allowedPrincipalArns: validatePrincipalArns.describe(AWS_AUTH.UPDATE.allowedPrincipalArns),
allowedAccountIds: validateAccountIds.describe(AWS_AUTH.UPDATE.allowedAccountIds),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
.optional()
.describe(AWS_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(AWS_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z.number().int().min(0).optional().describe(AWS_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
.describe(AWS_AUTH.UPDATE.accessTokenMaxTTL)
}),
response: {
200: z.object({
@ -236,7 +249,7 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(AWS_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({

View File

@ -19,7 +19,7 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
schema: {
description: "Login with Azure Auth",
body: z.object({
identityId: z.string(),
identityId: z.string().describe(AZURE_AUTH.LOGIN.identityId),
jwt: z.string()
}),
response: {
@ -72,35 +72,40 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(AZURE_AUTH.LOGIN.identityId)
}),
body: z.object({
tenantId: z.string().trim(),
resource: z.string().trim(),
allowedServicePrincipalIds: validateAzureAuthField,
tenantId: z.string().trim().describe(AZURE_AUTH.ATTACH.tenantId),
resource: z.string().trim().describe(AZURE_AUTH.ATTACH.resource),
allowedServicePrincipalIds: validateAzureAuthField.describe(AZURE_AUTH.ATTACH.allowedServicePrincipalIds),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(AZURE_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
.default(2592000)
.describe(AZURE_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
.default(2592000)
.describe(AZURE_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(AZURE_AUTH.ATTACH.accessTokenNumUsesLimit)
}),
response: {
200: z.object({
@ -154,28 +159,33 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(AZURE_AUTH.UPDATE.identityId)
}),
body: z.object({
tenantId: z.string().trim().optional(),
resource: z.string().trim().optional(),
allowedServicePrincipalIds: validateAzureAuthField.optional(),
tenantId: z.string().trim().optional().describe(AZURE_AUTH.UPDATE.tenantId),
resource: z.string().trim().optional().describe(AZURE_AUTH.UPDATE.resource),
allowedServicePrincipalIds: validateAzureAuthField
.optional()
.describe(AZURE_AUTH.UPDATE.allowedServicePrincipalIds),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
.optional()
.describe(AZURE_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(AZURE_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z.number().int().min(0).optional().describe(AZURE_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
.describe(AZURE_AUTH.UPDATE.accessTokenMaxTTL)
}),
response: {
200: z.object({
@ -229,7 +239,7 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(AZURE_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({

View File

@ -19,7 +19,7 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
schema: {
description: "Login with GCP Auth",
body: z.object({
identityId: z.string(),
identityId: z.string().describe(GCP_AUTH.LOGIN.identityId),
jwt: z.string()
}),
response: {
@ -72,36 +72,41 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(GCP_AUTH.ATTACH.identityId)
}),
body: z.object({
type: z.enum(["iam", "gce"]),
allowedServiceAccounts: validateGcpAuthField,
allowedProjects: validateGcpAuthField,
allowedZones: validateGcpAuthField,
allowedServiceAccounts: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedServiceAccounts),
allowedProjects: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedProjects),
allowedZones: validateGcpAuthField.describe(GCP_AUTH.ATTACH.allowedZones),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(GCP_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
.default(2592000)
.describe(GCP_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
.default(2592000)
.describe(GCP_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(GCP_AUTH.ATTACH.accessTokenNumUsesLimit)
}),
response: {
200: z.object({
@ -157,29 +162,32 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(GCP_AUTH.UPDATE.identityId)
}),
body: z.object({
type: z.enum(["iam", "gce"]).optional(),
allowedServiceAccounts: validateGcpAuthField.optional(),
allowedProjects: validateGcpAuthField.optional(),
allowedZones: validateGcpAuthField.optional(),
allowedServiceAccounts: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedServiceAccounts),
allowedProjects: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedProjects),
allowedZones: validateGcpAuthField.optional().describe(GCP_AUTH.UPDATE.allowedZones),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
.optional()
.describe(GCP_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(GCP_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z.number().int().min(0).optional().describe(GCP_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
.describe(GCP_AUTH.UPDATE.accessTokenMaxTTL)
}),
response: {
200: z.object({
@ -235,7 +243,7 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(GCP_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({

View File

@ -30,7 +30,7 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
schema: {
description: "Login with Kubernetes Auth",
body: z.object({
identityId: z.string().trim(),
identityId: z.string().trim().describe(KUBERNETES_AUTH.LOGIN.identityId),
jwt: z.string().trim()
}),
response: {
@ -85,38 +85,48 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(KUBERNETES_AUTH.ATTACH.identityId)
}),
body: z.object({
kubernetesHost: z.string().trim().min(1),
caCert: z.string().trim().default(""),
tokenReviewerJwt: z.string().trim().min(1),
allowedNamespaces: z.string(), // TODO: validation
allowedNames: z.string(),
allowedAudience: z.string(),
kubernetesHost: z.string().trim().min(1).describe(KUBERNETES_AUTH.ATTACH.kubernetesHost),
caCert: z.string().trim().default("").describe(KUBERNETES_AUTH.ATTACH.caCert),
tokenReviewerJwt: z.string().trim().min(1).describe(KUBERNETES_AUTH.ATTACH.tokenReviewerJwt),
allowedNamespaces: z.string().describe(KUBERNETES_AUTH.ATTACH.allowedNamespaces), // TODO: validation
allowedNames: z.string().describe(KUBERNETES_AUTH.ATTACH.allowedNames),
allowedAudience: z.string().describe(KUBERNETES_AUTH.ATTACH.allowedAudience),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(KUBERNETES_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
.default(2592000)
.describe(KUBERNETES_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
.default(2592000)
.describe(KUBERNETES_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z
.number()
.int()
.min(0)
.default(0)
.describe(KUBERNETES_AUTH.ATTACH.accessTokenNumUsesLimit)
}),
response: {
200: z.object({
@ -171,31 +181,45 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(KUBERNETES_AUTH.UPDATE.identityId)
}),
body: z.object({
kubernetesHost: z.string().trim().min(1).optional(),
caCert: z.string().trim().optional(),
tokenReviewerJwt: z.string().trim().min(1).optional(),
allowedNamespaces: z.string().optional(), // TODO: validation
allowedNames: z.string().optional(),
allowedAudience: z.string().optional(),
kubernetesHost: z.string().trim().min(1).optional().describe(KUBERNETES_AUTH.UPDATE.kubernetesHost),
caCert: z.string().trim().optional().describe(KUBERNETES_AUTH.UPDATE.caCert),
tokenReviewerJwt: z.string().trim().min(1).optional().describe(KUBERNETES_AUTH.UPDATE.tokenReviewerJwt),
allowedNamespaces: z.string().optional().describe(KUBERNETES_AUTH.UPDATE.allowedNamespaces), // TODO: validation
allowedNames: z.string().optional().describe(KUBERNETES_AUTH.UPDATE.allowedNames),
allowedAudience: z.string().optional().describe(KUBERNETES_AUTH.UPDATE.allowedAudience),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
.optional()
.describe(KUBERNETES_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(0)
.max(315360000)
.optional()
.describe(KUBERNETES_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z
.number()
.int()
.min(0)
.optional()
.describe(KUBERNETES_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
.describe(KUBERNETES_AUTH.UPDATE.accessTokenMaxTTL)
}),
response: {
200: z.object({
@ -250,7 +274,7 @@ export const registerIdentityKubernetesRouter = async (server: FastifyZodProvide
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(KUBERNETES_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({

View File

@ -0,0 +1,361 @@
import { z } from "zod";
import { IdentityOidcAuthsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { OIDC_AUTH } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { TIdentityTrustedIp } from "@app/services/identity/identity-types";
import {
validateOidcAuthAudiencesField,
validateOidcBoundClaimsField
} from "@app/services/identity-oidc-auth/identity-oidc-auth-validators";
const IdentityOidcAuthResponseSchema = IdentityOidcAuthsSchema.omit({
encryptedCaCert: true,
caCertIV: true,
caCertTag: true
}).extend({
caCert: z.string()
});
export const registerIdentityOidcAuthRouter = async (server: FastifyZodProvider) => {
server.route({
method: "POST",
url: "/oidc-auth/login",
config: {
rateLimit: writeLimit
},
schema: {
description: "Login with OIDC Auth",
body: z.object({
identityId: z.string().trim().describe(OIDC_AUTH.LOGIN.identityId),
jwt: z.string().trim()
}),
response: {
200: z.object({
accessToken: z.string(),
expiresIn: z.coerce.number(),
accessTokenMaxTTL: z.coerce.number(),
tokenType: z.literal("Bearer")
})
}
},
handler: async (req) => {
const { identityOidcAuth, accessToken, identityAccessToken, identityMembershipOrg } =
await server.services.identityOidcAuth.login({
identityId: req.body.identityId,
jwt: req.body.jwt
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityMembershipOrg?.orgId,
event: {
type: EventType.LOGIN_IDENTITY_OIDC_AUTH,
metadata: {
identityId: identityOidcAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
identityOidcAuthId: identityOidcAuth.id
}
}
});
return {
accessToken,
tokenType: "Bearer" as const,
expiresIn: identityOidcAuth.accessTokenTTL,
accessTokenMaxTTL: identityOidcAuth.accessTokenMaxTTL
};
}
});
server.route({
method: "POST",
url: "/oidc-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Attach OIDC Auth configuration onto identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim().describe(OIDC_AUTH.ATTACH.identityId)
}),
body: z.object({
oidcDiscoveryUrl: z.string().url().min(1).describe(OIDC_AUTH.ATTACH.oidcDiscoveryUrl),
caCert: z.string().trim().default("").describe(OIDC_AUTH.ATTACH.caCert),
boundIssuer: z.string().min(1).describe(OIDC_AUTH.ATTACH.boundIssuer),
boundAudiences: validateOidcAuthAudiencesField.describe(OIDC_AUTH.ATTACH.boundAudiences),
boundClaims: validateOidcBoundClaimsField.describe(OIDC_AUTH.ATTACH.boundClaims),
boundSubject: z.string().optional().default("").describe(OIDC_AUTH.ATTACH.boundSubject),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(OIDC_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000)
.describe(OIDC_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000)
.describe(OIDC_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(OIDC_AUTH.ATTACH.accessTokenNumUsesLimit)
}),
response: {
200: z.object({
identityOidcAuth: IdentityOidcAuthResponseSchema
})
}
},
handler: async (req) => {
const identityOidcAuth = await server.services.identityOidcAuth.attachOidcAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityOidcAuth.orgId,
event: {
type: EventType.ADD_IDENTITY_OIDC_AUTH,
metadata: {
identityId: identityOidcAuth.identityId,
oidcDiscoveryUrl: identityOidcAuth.oidcDiscoveryUrl,
caCert: identityOidcAuth.caCert,
boundIssuer: identityOidcAuth.boundIssuer,
boundAudiences: identityOidcAuth.boundAudiences,
boundClaims: identityOidcAuth.boundClaims as Record<string, string>,
boundSubject: identityOidcAuth.boundSubject as string,
accessTokenTTL: identityOidcAuth.accessTokenTTL,
accessTokenMaxTTL: identityOidcAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityOidcAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityOidcAuth.accessTokenNumUsesLimit
}
}
});
return {
identityOidcAuth
};
}
});
server.route({
method: "PATCH",
url: "/oidc-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Update OIDC Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().trim().describe(OIDC_AUTH.UPDATE.identityId)
}),
body: z
.object({
oidcDiscoveryUrl: z.string().url().min(1).describe(OIDC_AUTH.UPDATE.oidcDiscoveryUrl),
caCert: z.string().trim().default("").describe(OIDC_AUTH.UPDATE.caCert),
boundIssuer: z.string().min(1).describe(OIDC_AUTH.UPDATE.boundIssuer),
boundAudiences: validateOidcAuthAudiencesField.describe(OIDC_AUTH.UPDATE.boundAudiences),
boundClaims: validateOidcBoundClaimsField.describe(OIDC_AUTH.UPDATE.boundClaims),
boundSubject: z.string().optional().default("").describe(OIDC_AUTH.UPDATE.boundSubject),
accessTokenTrustedIps: z
.object({
ipAddress: z.string().trim()
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(OIDC_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000)
.describe(OIDC_AUTH.UPDATE.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000)
.describe(OIDC_AUTH.UPDATE.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(OIDC_AUTH.UPDATE.accessTokenNumUsesLimit)
})
.partial(),
response: {
200: z.object({
identityOidcAuth: IdentityOidcAuthResponseSchema
})
}
},
handler: async (req) => {
const identityOidcAuth = await server.services.identityOidcAuth.updateOidcAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
...req.body,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityOidcAuth.orgId,
event: {
type: EventType.UPDATE_IDENTITY_OIDC_AUTH,
metadata: {
identityId: identityOidcAuth.identityId,
oidcDiscoveryUrl: identityOidcAuth.oidcDiscoveryUrl,
caCert: identityOidcAuth.caCert,
boundIssuer: identityOidcAuth.boundIssuer,
boundAudiences: identityOidcAuth.boundAudiences,
boundClaims: identityOidcAuth.boundClaims as Record<string, string>,
boundSubject: identityOidcAuth.boundSubject as string,
accessTokenTTL: identityOidcAuth.accessTokenTTL,
accessTokenMaxTTL: identityOidcAuth.accessTokenMaxTTL,
accessTokenTrustedIps: identityOidcAuth.accessTokenTrustedIps as TIdentityTrustedIp[],
accessTokenNumUsesLimit: identityOidcAuth.accessTokenNumUsesLimit
}
}
});
return { identityOidcAuth };
}
});
server.route({
method: "GET",
url: "/oidc-auth/identities/:identityId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Retrieve OIDC Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().describe(OIDC_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({
identityOidcAuth: IdentityOidcAuthResponseSchema
})
}
},
handler: async (req) => {
const identityOidcAuth = await server.services.identityOidcAuth.getOidcAuth({
identityId: req.params.identityId,
actor: req.permission.type,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityOidcAuth.orgId,
event: {
type: EventType.GET_IDENTITY_OIDC_AUTH,
metadata: {
identityId: identityOidcAuth.identityId
}
}
});
return { identityOidcAuth };
}
});
server.route({
method: "DELETE",
url: "/oidc-auth/identities/:identityId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Delete OIDC Auth configuration on identity",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string().describe(OIDC_AUTH.REVOKE.identityId)
}),
response: {
200: z.object({
identityOidcAuth: IdentityOidcAuthResponseSchema.omit({
caCert: true
})
})
}
},
handler: async (req) => {
const identityOidcAuth = await server.services.identityOidcAuth.revokeOidcAuth({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
identityId: req.params.identityId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: identityOidcAuth.orgId,
event: {
type: EventType.REVOKE_IDENTITY_OIDC_AUTH,
metadata: {
identityId: identityOidcAuth.identityId
}
}
});
return { identityOidcAuth };
}
});
};
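A hedged sketch of calling the login route registered above from a client; the base URL and the /api/v1/auth prefix are assumptions, the identity ID and JWT are placeholders, and only the "/oidc-auth/login" suffix and the response fields come from the router itself.
const loginWithOidc = async (identityId: string, oidcJwt: string) => {
  // route prefix is an assumption; the "/oidc-auth/login" suffix matches the router above
  const res = await fetch("https://app.infisical.com/api/v1/auth/oidc-auth/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ identityId, jwt: oidcJwt })
  });
  // response fields follow the 200 schema of the login route above
  const { accessToken, expiresIn, accessTokenMaxTTL, tokenType } = await res.json();
  return { accessToken, expiresIn, accessTokenMaxTTL, tokenType };
};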

View File

@ -2,6 +2,7 @@ import { z } from "zod";
import { IdentityAccessTokensSchema, IdentityTokenAuthsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { TOKEN_AUTH } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
@ -23,7 +24,7 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(TOKEN_AUTH.ATTACH.identityId)
}),
body: z.object({
accessTokenTrustedIps: z
@ -32,23 +33,28 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
})
.array()
.min(1)
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }]),
.default([{ ipAddress: "0.0.0.0/0" }, { ipAddress: "::/0" }])
.describe(TOKEN_AUTH.ATTACH.accessTokenTrustedIps),
accessTokenTTL: z
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
.default(2592000),
.default(2592000)
.describe(TOKEN_AUTH.ATTACH.accessTokenTTL),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.default(2592000),
accessTokenNumUsesLimit: z.number().int().min(0).default(0)
.default(2592000)
.describe(TOKEN_AUTH.ATTACH.accessTokenMaxTTL),
accessTokenNumUsesLimit: z.number().int().min(0).default(0).describe(TOKEN_AUTH.ATTACH.accessTokenNumUsesLimit)
}),
response: {
200: z.object({
@ -102,7 +108,7 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string().trim()
identityId: z.string().trim().describe(TOKEN_AUTH.UPDATE.identityId)
}),
body: z.object({
accessTokenTrustedIps: z
@ -111,16 +117,19 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
})
.array()
.min(1)
.optional(),
accessTokenTTL: z.number().int().min(0).optional(),
accessTokenNumUsesLimit: z.number().int().min(0).optional(),
.optional()
.describe(TOKEN_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z.number().int().min(0).max(315360000).optional().describe(TOKEN_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z.number().int().min(0).optional().describe(TOKEN_AUTH.UPDATE.accessTokenNumUsesLimit),
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
.optional()
.describe(TOKEN_AUTH.UPDATE.accessTokenMaxTTL)
}),
response: {
200: z.object({
@ -174,7 +183,7 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(TOKEN_AUTH.RETRIEVE.identityId)
}),
response: {
200: z.object({
@ -221,7 +230,7 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(TOKEN_AUTH.REVOKE.identityId)
}),
response: {
200: z.object({
@ -253,15 +262,6 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
}
});
// proposed
// update token by id: PATCH /token-auth/tokens/:tokenId
// revoke token by id: POST /token-auth/tokens/:tokenId/revoke
// current
// revoke token by id: POST /token/revoke-by-id
// token-auth/identities/:identityId/tokens
server.route({
method: "POST",
url: "/token-auth/identities/:identityId/tokens",
@ -270,17 +270,17 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Create token for identity with Token Auth configured",
description: "Create token for identity with Token Auth",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(TOKEN_AUTH.CREATE_TOKEN.identityId)
}),
body: z.object({
name: z.string().optional()
name: z.string().optional().describe(TOKEN_AUTH.CREATE_TOKEN.name)
}),
response: {
200: z.object({
@ -331,18 +331,18 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Get tokens for identity with Token Auth configured",
description: "Get tokens for identity with Token Auth",
security: [
{
bearerAuth: []
}
],
params: z.object({
identityId: z.string()
identityId: z.string().describe(TOKEN_AUTH.GET_TOKENS.identityId)
}),
querystring: z.object({
offset: z.coerce.number().min(0).max(100).default(0),
limit: z.coerce.number().min(1).max(100).default(20)
offset: z.coerce.number().min(0).max(100).default(0).describe(TOKEN_AUTH.GET_TOKENS.offset),
limit: z.coerce.number().min(1).max(100).default(20).describe(TOKEN_AUTH.GET_TOKENS.limit)
}),
response: {
200: z.object({
@ -383,17 +383,17 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Update token for identity with Token Auth configured",
description: "Update token for identity with Token Auth",
security: [
{
bearerAuth: []
}
],
params: z.object({
tokenId: z.string()
tokenId: z.string().describe(TOKEN_AUTH.UPDATE_TOKEN.tokenId)
}),
body: z.object({
name: z.string().optional()
name: z.string().optional().describe(TOKEN_AUTH.UPDATE_TOKEN.name)
}),
response: {
200: z.object({
@ -436,14 +436,14 @@ export const registerIdentityTokenAuthRouter = async (server: FastifyZodProvider
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Revoke token for identity with Token Auth configured",
description: "Revoke token for identity with Token Auth",
security: [
{
bearerAuth: []
}
],
params: z.object({
tokenId: z.string()
tokenId: z.string().describe(TOKEN_AUTH.REVOKE_TOKEN.tokenId)
}),
response: {
200: z.object({

View File

@ -107,6 +107,7 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
.number()
.int()
.min(1)
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenTTL must have a non zero number"
})
@ -115,6 +116,7 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
@ -196,7 +198,13 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
.min(1)
.optional()
.describe(UNIVERSAL_AUTH.UPDATE.accessTokenTrustedIps),
accessTokenTTL: z.number().int().min(0).optional().describe(UNIVERSAL_AUTH.UPDATE.accessTokenTTL),
accessTokenTTL: z
.number()
.int()
.min(0)
.max(315360000)
.optional()
.describe(UNIVERSAL_AUTH.UPDATE.accessTokenTTL),
accessTokenNumUsesLimit: z
.number()
.int()
@ -206,6 +214,7 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
accessTokenMaxTTL: z
.number()
.int()
.max(315360000)
.refine((value) => value !== 0, {
message: "accessTokenMaxTTL must have a non zero number"
})
@ -362,7 +371,7 @@ export const registerIdentityUaRouter = async (server: FastifyZodProvider) => {
body: z.object({
description: z.string().trim().default("").describe(UNIVERSAL_AUTH.CREATE_CLIENT_SECRET.description),
numUsesLimit: z.number().min(0).default(0).describe(UNIVERSAL_AUTH.CREATE_CLIENT_SECRET.numUsesLimit),
ttl: z.number().min(0).default(0).describe(UNIVERSAL_AUTH.CREATE_CLIENT_SECRET.ttl)
ttl: z.number().min(0).max(315360000).default(0).describe(UNIVERSAL_AUTH.CREATE_CLIENT_SECRET.ttl)
}),
response: {
200: z.object({

View File

@ -8,6 +8,7 @@ import { registerIdentityAwsAuthRouter } from "./identity-aws-iam-auth-router";
import { registerIdentityAzureAuthRouter } from "./identity-azure-auth-router";
import { registerIdentityGcpAuthRouter } from "./identity-gcp-auth-router";
import { registerIdentityKubernetesRouter } from "./identity-kubernetes-auth-router";
import { registerIdentityOidcAuthRouter } from "./identity-oidc-auth-router";
import { registerIdentityRouter } from "./identity-router";
import { registerIdentityTokenAuthRouter } from "./identity-token-auth-router";
import { registerIdentityUaRouter } from "./identity-universal-auth-router";
@ -42,6 +43,7 @@ export const registerV1Routes = async (server: FastifyZodProvider) => {
await authRouter.register(registerIdentityAccessTokenRouter);
await authRouter.register(registerIdentityAwsAuthRouter);
await authRouter.register(registerIdentityAzureAuthRouter);
await authRouter.register(registerIdentityOidcAuthRouter);
},
{ prefix: "/auth" }
);

View File

@ -292,4 +292,39 @@ export const registerSecretFolderRouter = async (server: FastifyZodProvider) =>
return { folders };
}
});
server.route({
method: "GET",
url: "/:id",
config: {
rateLimit: readLimit
},
schema: {
description: "Get folder by id",
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string().trim().describe(FOLDERS.GET_BY_ID.folderId)
}),
response: {
200: z.object({
folder: SecretFoldersSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const folder = await server.services.folder.getFolderById({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
return { folder };
}
});
};
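A minimal client-side sketch for the new folder-by-id lookup; the mount prefix and base URL are assumptions (the router above only defines the relative "/:id" route), and the response follows the 200 schema shown:

// Hypothetical mount path; baseUrl, accessToken and folderId are placeholders.
const res = await fetch(`${baseUrl}/api/v1/folders/${folderId}`, {
  headers: { Authorization: `Bearer ${accessToken}` }
});
const { folder } = await res.json(); // folder matches SecretFoldersSchema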

View File

@ -5,6 +5,7 @@ import {
IdentitiesSchema,
IdentityProjectMembershipsSchema,
ProjectMembershipRole,
ProjectsSchema,
ProjectUserMembershipRolesSchema
} from "@app/db/schemas";
import { PROJECT_IDENTITIES } from "@app/lib/api-docs";
@ -234,7 +235,8 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
temporaryAccessEndTime: z.date().nullable().optional()
})
),
identity: IdentitiesSchema.pick({ name: true, id: true, authMethod: true })
identity: IdentitiesSchema.pick({ name: true, id: true, authMethod: true }),
project: ProjectsSchema.pick({ name: true, id: true })
})
.array()
})
@ -291,7 +293,8 @@ export const registerIdentityProjectRouter = async (server: FastifyZodProvider)
temporaryAccessEndTime: z.date().nullable().optional()
})
),
identity: IdentitiesSchema.pick({ name: true, id: true, authMethod: true })
identity: IdentitiesSchema.pick({ name: true, id: true, authMethod: true }),
project: ProjectsSchema.pick({ name: true, id: true })
})
})
}

View File

@ -1325,6 +1325,61 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.route({
method: "POST",
url: "/move",
config: {
rateLimit: secretsLimit
},
schema: {
body: z.object({
projectSlug: z.string().trim(),
sourceEnvironment: z.string().trim(),
sourceSecretPath: z.string().trim().default("/").transform(removeTrailingSlash),
destinationEnvironment: z.string().trim(),
destinationSecretPath: z.string().trim().default("/").transform(removeTrailingSlash),
secretIds: z.string().array(),
shouldOverwrite: z.boolean().default(false)
}),
response: {
200: z.object({
isSourceUpdated: z.boolean(),
isDestinationUpdated: z.boolean()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { projectId, isSourceUpdated, isDestinationUpdated } = await server.services.secret.moveSecrets({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.body
});
await server.services.auditLog.createAuditLog({
projectId,
...req.auditLogInfo,
event: {
type: EventType.MOVE_SECRETS,
metadata: {
sourceEnvironment: req.body.sourceEnvironment,
sourceSecretPath: req.body.sourceSecretPath,
destinationEnvironment: req.body.destinationEnvironment,
destinationSecretPath: req.body.destinationSecretPath,
secretIds: req.body.secretIds
}
}
});
return {
isSourceUpdated,
isDestinationUpdated
};
}
});
server.route({
method: "POST",
url: "/batch",

View File

@ -75,8 +75,10 @@ export const getCaCredentials = async ({
kmsService
});
const decryptedPrivateKey = await kmsService.decrypt({
kmsId: keyId,
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: keyId
});
const decryptedPrivateKey = kmsDecryptor({
cipherTextBlob: caSecret.encryptedPrivateKey
});
@ -123,15 +125,17 @@ export const getCaCertChain = async ({
kmsService
});
const decryptedCaCert = await kmsService.decrypt({
kmsId: keyId,
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: keyId
});
const decryptedCaCert = kmsDecryptor({
cipherTextBlob: caCert.encryptedCertificate
});
const caCertObj = new x509.X509Certificate(decryptedCaCert);
const decryptedChain = await kmsService.decrypt({
kmsId: keyId,
const decryptedChain = kmsDecryptor({
cipherTextBlob: caCert.encryptedCertificateChain
});
@ -168,8 +172,11 @@ export const rebuildCaCrl = async ({
kmsService
});
const privateKey = await kmsService.decrypt({
kmsId: keyId,
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: keyId
});
const privateKey = kmsDecryptor({
cipherTextBlob: caSecret.encryptedPrivateKey
});
@ -200,8 +207,10 @@ export const rebuildCaCrl = async ({
signingKey: sk
});
const { cipherTextBlob: encryptedCrl } = await kmsService.encrypt({
kmsId: keyId,
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: keyId
});
const { cipherTextBlob: encryptedCrl } = kmsEncryptor({
plainText: Buffer.from(new Uint8Array(crl.rawData))
});

View File

@ -25,7 +25,7 @@ type TCertificateAuthorityQueueFactoryDep = {
certificateAuthoritySecretDAL: TCertificateAuthoritySecretDALFactory;
certificateDAL: TCertificateDALFactory;
projectDAL: Pick<TProjectDALFactory, "findProjectBySlug" | "findOne" | "updateById" | "findById" | "transaction">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "encrypt" | "decrypt">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "encryptWithKmsKey" | "decryptWithKmsKey">;
queueService: TQueueServiceFactory;
};
export type TCertificateAuthorityQueueFactory = ReturnType<typeof certificateAuthorityQueueFactory>;
@ -88,8 +88,10 @@ export const certificateAuthorityQueueFactory = ({
kmsService
});
const privateKey = await kmsService.decrypt({
kmsId: keyId,
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: keyId
});
const privateKey = kmsDecryptor({
cipherTextBlob: caSecret.encryptedPrivateKey
});
@ -120,8 +122,10 @@ export const certificateAuthorityQueueFactory = ({
signingKey: sk
});
const { cipherTextBlob: encryptedCrl } = await kmsService.encrypt({
kmsId: keyId,
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: keyId
});
const { cipherTextBlob: encryptedCrl } = kmsEncryptor({
plainText: Buffer.from(new Uint8Array(crl.rawData))
});

View File

@ -53,7 +53,7 @@ type TCertificateAuthorityServiceFactoryDep = {
certificateDAL: Pick<TCertificateDALFactory, "transaction" | "create" | "find">;
certificateBodyDAL: Pick<TCertificateBodyDALFactory, "create">;
projectDAL: Pick<TProjectDALFactory, "findProjectBySlug" | "findOne" | "updateById" | "findById" | "transaction">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "encrypt" | "decrypt">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "encryptWithKmsKey" | "decryptWithKmsKey">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission">;
};
@ -154,11 +154,14 @@ export const certificateAuthorityServiceFactory = ({
tx
);
const keyId = await getProjectKmsCertificateKeyId({
const certificateManagerKmsId = await getProjectKmsCertificateKeyId({
projectId: project.id,
projectDAL,
kmsService
});
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: certificateManagerKmsId
});
if (type === CaType.ROOT) {
// note: create self-signed cert only applicable for root CA
@ -178,13 +181,11 @@ export const certificateAuthorityServiceFactory = ({
]
});
const { cipherTextBlob: encryptedCertificate } = await kmsService.encrypt({
kmsId: keyId,
const { cipherTextBlob: encryptedCertificate } = kmsEncryptor({
plainText: Buffer.from(new Uint8Array(cert.rawData))
});
const { cipherTextBlob: encryptedCertificateChain } = await kmsService.encrypt({
kmsId: keyId,
const { cipherTextBlob: encryptedCertificateChain } = kmsEncryptor({
plainText: Buffer.alloc(0)
});
@ -208,8 +209,7 @@ export const certificateAuthorityServiceFactory = ({
signingKey: keys.privateKey
});
const { cipherTextBlob: encryptedCrl } = await kmsService.encrypt({
kmsId: keyId,
const { cipherTextBlob: encryptedCrl } = kmsEncryptor({
plainText: Buffer.from(new Uint8Array(crl.rawData))
});
@ -224,8 +224,7 @@ export const certificateAuthorityServiceFactory = ({
// https://nodejs.org/api/crypto.html#static-method-keyobjectfromkey
const skObj = KeyObject.from(keys.privateKey);
const { cipherTextBlob: encryptedPrivateKey } = await kmsService.encrypt({
kmsId: keyId,
const { cipherTextBlob: encryptedPrivateKey } = kmsEncryptor({
plainText: skObj.export({
type: "pkcs8",
format: "der"
@ -449,15 +448,17 @@ export const certificateAuthorityServiceFactory = ({
const alg = keyAlgorithmToAlgCfg(ca.keyAlgorithm as CertKeyAlgorithm);
const keyId = await getProjectKmsCertificateKeyId({
const certificateManagerKmsId = await getProjectKmsCertificateKeyId({
projectId: ca.projectId,
projectDAL,
kmsService
});
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: certificateManagerKmsId
});
const caCert = await certificateAuthorityCertDAL.findOne({ caId: ca.id });
const decryptedCaCert = await kmsService.decrypt({
kmsId: keyId,
const decryptedCaCert = kmsDecryptor({
cipherTextBlob: caCert.encryptedCertificate
});
@ -605,19 +606,20 @@ export const certificateAuthorityServiceFactory = ({
dn: parentCertSubject
});
const keyId = await getProjectKmsCertificateKeyId({
const certificateManagerKmsId = await getProjectKmsCertificateKeyId({
projectId: ca.projectId,
projectDAL,
kmsService
});
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: certificateManagerKmsId
});
const { cipherTextBlob: encryptedCertificate } = await kmsService.encrypt({
kmsId: keyId,
const { cipherTextBlob: encryptedCertificate } = kmsEncryptor({
plainText: Buffer.from(new Uint8Array(certObj.rawData))
});
const { cipherTextBlob: encryptedCertificateChain } = await kmsService.encrypt({
kmsId: keyId,
const { cipherTextBlob: encryptedCertificateChain } = kmsEncryptor({
plainText: Buffer.from(certificateChain)
});
@ -682,14 +684,16 @@ export const certificateAuthorityServiceFactory = ({
const caCert = await certificateAuthorityCertDAL.findOne({ caId: ca.id });
if (!caCert) throw new BadRequestError({ message: "CA does not have a certificate installed" });
const keyId = await getProjectKmsCertificateKeyId({
const certificateManagerKmsId = await getProjectKmsCertificateKeyId({
projectId: ca.projectId,
projectDAL,
kmsService
});
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: certificateManagerKmsId
});
const decryptedCaCert = await kmsService.decrypt({
kmsId: keyId,
const decryptedCaCert = kmsDecryptor({
cipherTextBlob: caCert.encryptedCertificate
});
@ -796,8 +800,10 @@ export const certificateAuthorityServiceFactory = ({
const skLeafObj = KeyObject.from(leafKeys.privateKey);
const skLeaf = skLeafObj.export({ format: "pem", type: "pkcs8" }) as string;
const { cipherTextBlob: encryptedCertificate } = await kmsService.encrypt({
kmsId: keyId,
const kmsEncryptor = await kmsService.encryptWithKmsKey({
kmsId: certificateManagerKmsId
});
const { cipherTextBlob: encryptedCertificate } = kmsEncryptor({
plainText: Buffer.from(new Uint8Array(leafCert.rawData))
});

View File

@ -95,7 +95,7 @@ export type TGetCaCredentialsDTO = {
certificateAuthorityDAL: Pick<TCertificateAuthorityDALFactory, "findById">;
certificateAuthoritySecretDAL: Pick<TCertificateAuthoritySecretDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "findOne" | "updateById" | "transaction">;
kmsService: Pick<TKmsServiceFactory, "decrypt" | "generateKmsKey">;
kmsService: Pick<TKmsServiceFactory, "decryptWithKmsKey" | "generateKmsKey">;
};
export type TGetCaCertChainDTO = {
@ -103,7 +103,7 @@ export type TGetCaCertChainDTO = {
certificateAuthorityDAL: Pick<TCertificateAuthorityDALFactory, "findById">;
certificateAuthorityCertDAL: Pick<TCertificateAuthorityCertDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "findOne" | "updateById" | "transaction">;
kmsService: Pick<TKmsServiceFactory, "decrypt" | "generateKmsKey">;
kmsService: Pick<TKmsServiceFactory, "decryptWithKmsKey" | "generateKmsKey">;
};
export type TRebuildCaCrlDTO = {
@ -113,7 +113,7 @@ export type TRebuildCaCrlDTO = {
certificateAuthoritySecretDAL: Pick<TCertificateAuthoritySecretDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "findOne" | "updateById" | "transaction">;
certificateDAL: Pick<TCertificateDALFactory, "find">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "decrypt" | "encrypt">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "decryptWithKmsKey" | "encryptWithKmsKey">;
};
export type TRotateCaCrlTriggerDTO = {

View File

@ -25,7 +25,7 @@ type TCertificateServiceFactoryDep = {
certificateAuthorityCrlDAL: Pick<TCertificateAuthorityCrlDALFactory, "update">;
certificateAuthoritySecretDAL: Pick<TCertificateAuthoritySecretDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "findOne" | "updateById" | "findById" | "transaction">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "encrypt" | "decrypt">;
kmsService: Pick<TKmsServiceFactory, "generateKmsKey" | "encryptWithKmsKey" | "decryptWithKmsKey">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission">;
};
@ -164,14 +164,16 @@ export const certificateServiceFactory = ({
const certBody = await certificateBodyDAL.findOne({ certId: cert.id });
const keyId = await getProjectKmsCertificateKeyId({
const certificateManagerKeyId = await getProjectKmsCertificateKeyId({
projectId: ca.projectId,
projectDAL,
kmsService
});
const decryptedCert = await kmsService.decrypt({
kmsId: keyId,
const kmsDecryptor = await kmsService.decryptWithKmsKey({
kmsId: certificateManagerKeyId
});
const decryptedCert = kmsDecryptor({
cipherTextBlob: certBody.encryptedCertificate
});

View File

@ -51,6 +51,18 @@ export const identityAccessTokenDALFactory = (db: TDbClient) => {
`${TableName.IdentityKubernetesAuth}.identityId`
);
})
.leftJoin(TableName.IdentityOidcAuth, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.OIDC_AUTH])).andOn(
`${TableName.Identity}.id`,
`${TableName.IdentityOidcAuth}.identityId`
);
})
.leftJoin(TableName.IdentityTokenAuth, (qb) => {
qb.on(`${TableName.Identity}.authMethod`, db.raw("?", [IdentityAuthMethod.TOKEN_AUTH])).andOn(
`${TableName.Identity}.id`,
`${TableName.IdentityTokenAuth}.identityId`
);
})
.select(selectAllTableCols(TableName.IdentityAccessToken))
.select(
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityUniversalAuth).as("accessTokenTrustedIpsUa"),
@ -58,6 +70,8 @@ export const identityAccessTokenDALFactory = (db: TDbClient) => {
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityAwsAuth).as("accessTokenTrustedIpsAws"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityAzureAuth).as("accessTokenTrustedIpsAzure"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityKubernetesAuth).as("accessTokenTrustedIpsK8s"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityOidcAuth).as("accessTokenTrustedIpsOidc"),
db.ref("accessTokenTrustedIps").withSchema(TableName.IdentityTokenAuth).as("accessTokenTrustedIpsToken"),
db.ref("name").withSchema(TableName.Identity)
)
.first();
@ -71,7 +85,9 @@ export const identityAccessTokenDALFactory = (db: TDbClient) => {
doc.accessTokenTrustedIpsGcp ||
doc.accessTokenTrustedIpsAws ||
doc.accessTokenTrustedIpsAzure ||
doc.accessTokenTrustedIpsK8s
doc.accessTokenTrustedIpsK8s ||
doc.accessTokenTrustedIpsOidc ||
doc.accessTokenTrustedIpsToken
};
} catch (error) {
throw new DatabaseError({ error, name: "IdAccessTokenFindOne" });

View File

@ -0,0 +1,10 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TIdentityOidcAuthDALFactory = ReturnType<typeof identityOidcAuthDALFactory>;
export const identityOidcAuthDALFactory = (db: TDbClient) => {
const oidcAuthOrm = ormify(db, TableName.IdentityOidcAuth);
return oidcAuthOrm;
};

View File

@ -0,0 +1,534 @@
import { ForbiddenError } from "@casl/ability";
import axios from "axios";
import https from "https";
import jwt from "jsonwebtoken";
import { JwksClient } from "jwks-rsa";
import { IdentityAuthMethod, SecretKeyEncoding, TIdentityOidcAuthsUpdate } from "@app/db/schemas";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { isAtLeastAsPrivileged } from "@app/lib/casl";
import { getConfig } from "@app/lib/config/env";
import { generateAsymmetricKeyPair } from "@app/lib/crypto";
import {
decryptSymmetric,
encryptSymmetric,
generateSymmetricKey,
infisicalSymmetricDecrypt,
infisicalSymmetricEncypt
} from "@app/lib/crypto/encryption";
import { BadRequestError, ForbiddenRequestError, UnauthorizedError } from "@app/lib/errors";
import { extractIPDetails, isValidIpOrCidr } from "@app/lib/ip";
import { ActorType, AuthTokenType } from "../auth/auth-type";
import { TIdentityDALFactory } from "../identity/identity-dal";
import { TIdentityOrgDALFactory } from "../identity/identity-org-dal";
import { TIdentityAccessTokenDALFactory } from "../identity-access-token/identity-access-token-dal";
import { TIdentityAccessTokenJwtPayload } from "../identity-access-token/identity-access-token-types";
import { TOrgBotDALFactory } from "../org/org-bot-dal";
import { TIdentityOidcAuthDALFactory } from "./identity-oidc-auth-dal";
import {
TAttachOidcAuthDTO,
TGetOidcAuthDTO,
TLoginOidcAuthDTO,
TRevokeOidcAuthDTO,
TUpdateOidcAuthDTO
} from "./identity-oidc-auth-types";
type TIdentityOidcAuthServiceFactoryDep = {
identityOidcAuthDAL: TIdentityOidcAuthDALFactory;
identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
identityDAL: Pick<TIdentityDALFactory, "updateById">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
orgBotDAL: Pick<TOrgBotDALFactory, "findOne" | "transaction" | "create">;
};
export type TIdentityOidcAuthServiceFactory = ReturnType<typeof identityOidcAuthServiceFactory>;
export const identityOidcAuthServiceFactory = ({
identityOidcAuthDAL,
identityOrgMembershipDAL,
identityDAL,
permissionService,
licenseService,
identityAccessTokenDAL,
orgBotDAL
}: TIdentityOidcAuthServiceFactoryDep) => {
const login = async ({ identityId, jwt: oidcJwt }: TLoginOidcAuthDTO) => {
const identityOidcAuth = await identityOidcAuthDAL.findOne({ identityId });
if (!identityOidcAuth) {
throw new UnauthorizedError();
}
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({
identityId: identityOidcAuth.identityId
});
if (!identityMembershipOrg) {
throw new BadRequestError({ message: "Failed to find identity" });
}
const orgBot = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId });
if (!orgBot) {
throw new BadRequestError({ message: "Org bot not found", name: "OrgBotNotFound" });
}
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
const { encryptedCaCert, caCertIV, caCertTag } = identityOidcAuth;
let caCert = "";
if (encryptedCaCert && caCertIV && caCertTag) {
caCert = decryptSymmetric({
ciphertext: encryptedCaCert,
iv: caCertIV,
tag: caCertTag,
key
});
}
const requestAgent = new https.Agent({ ca: caCert, rejectUnauthorized: !!caCert });
const { data: discoveryDoc } = await axios.get<{ jwks_uri: string }>(
`${identityOidcAuth.oidcDiscoveryUrl}/.well-known/openid-configuration`,
{
httpsAgent: requestAgent
}
);
const jwksUri = discoveryDoc.jwks_uri;
const decodedToken = jwt.decode(oidcJwt, { complete: true });
if (!decodedToken) {
throw new BadRequestError({
message: "Invalid JWT"
});
}
const client = new JwksClient({
jwksUri,
requestAgent
});
const { kid } = decodedToken.header;
const oidcSigningKey = await client.getSigningKey(kid);
const tokenData = jwt.verify(oidcJwt, oidcSigningKey.getPublicKey(), {
issuer: identityOidcAuth.boundIssuer
}) as Record<string, string>;
if (identityOidcAuth.boundSubject) {
if (tokenData.sub !== identityOidcAuth.boundSubject) {
throw new UnauthorizedError();
}
}
if (identityOidcAuth.boundAudiences) {
if (!identityOidcAuth.boundAudiences.split(", ").includes(tokenData.aud)) {
throw new UnauthorizedError();
}
}
if (identityOidcAuth.boundClaims) {
Object.keys(identityOidcAuth.boundClaims).forEach((claimKey) => {
const claimValue = (identityOidcAuth.boundClaims as Record<string, string>)[claimKey];
// handle both single and multi-valued claims
if (!claimValue.split(", ").some((claimEntry) => tokenData[claimKey] === claimEntry)) {
throw new UnauthorizedError();
}
});
}
const identityAccessToken = await identityOidcAuthDAL.transaction(async (tx) => {
const newToken = await identityAccessTokenDAL.create(
{
identityId: identityOidcAuth.identityId,
isAccessTokenRevoked: false,
accessTokenTTL: identityOidcAuth.accessTokenTTL,
accessTokenMaxTTL: identityOidcAuth.accessTokenMaxTTL,
accessTokenNumUses: 0,
accessTokenNumUsesLimit: identityOidcAuth.accessTokenNumUsesLimit
},
tx
);
return newToken;
});
const appCfg = getConfig();
const accessToken = jwt.sign(
{
identityId: identityOidcAuth.identityId,
identityAccessTokenId: identityAccessToken.id,
authTokenType: AuthTokenType.IDENTITY_ACCESS_TOKEN
} as TIdentityAccessTokenJwtPayload,
appCfg.AUTH_SECRET,
{
expiresIn:
Number(identityAccessToken.accessTokenMaxTTL) === 0
? undefined
: Number(identityAccessToken.accessTokenMaxTTL)
}
);
return { accessToken, identityOidcAuth, identityAccessToken, identityMembershipOrg };
};
const attachOidcAuth = async ({
identityId,
oidcDiscoveryUrl,
caCert,
boundIssuer,
boundAudiences,
boundClaims,
boundSubject,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TAttachOidcAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) {
throw new BadRequestError({ message: "Failed to find identity" });
}
if (identityMembershipOrg.identity.authMethod)
throw new BadRequestError({
message: "Failed to add OIDC Auth to already configured identity"
});
if (accessTokenMaxTTL > 0 && accessTokenTTL > accessTokenMaxTTL) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const orgBot = await orgBotDAL.transaction(async (tx) => {
const doc = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId }, tx);
if (doc) return doc;
const { privateKey, publicKey } = generateAsymmetricKeyPair();
const key = generateSymmetricKey();
const {
ciphertext: encryptedPrivateKey,
iv: privateKeyIV,
tag: privateKeyTag,
encoding: privateKeyKeyEncoding,
algorithm: privateKeyAlgorithm
} = infisicalSymmetricEncypt(privateKey);
const {
ciphertext: encryptedSymmetricKey,
iv: symmetricKeyIV,
tag: symmetricKeyTag,
encoding: symmetricKeyKeyEncoding,
algorithm: symmetricKeyAlgorithm
} = infisicalSymmetricEncypt(key);
return orgBotDAL.create(
{
name: "Infisical org bot",
publicKey,
privateKeyIV,
encryptedPrivateKey,
symmetricKeyIV,
symmetricKeyTag,
encryptedSymmetricKey,
symmetricKeyAlgorithm,
orgId: identityMembershipOrg.orgId,
privateKeyTag,
privateKeyAlgorithm,
privateKeyKeyEncoding,
symmetricKeyKeyEncoding
},
tx
);
});
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
const { ciphertext: encryptedCaCert, iv: caCertIV, tag: caCertTag } = encryptSymmetric(caCert, key);
const identityOidcAuth = await identityOidcAuthDAL.transaction(async (tx) => {
const doc = await identityOidcAuthDAL.create(
{
identityId: identityMembershipOrg.identityId,
oidcDiscoveryUrl,
encryptedCaCert,
caCertIV,
caCertTag,
boundIssuer,
boundAudiences,
boundClaims,
boundSubject,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: JSON.stringify(reformattedAccessTokenTrustedIps)
},
tx
);
await identityDAL.updateById(
identityMembershipOrg.identityId,
{
authMethod: IdentityAuthMethod.OIDC_AUTH
},
tx
);
return doc;
});
return { ...identityOidcAuth, orgId: identityMembershipOrg.orgId, caCert };
};
const updateOidcAuth = async ({
identityId,
oidcDiscoveryUrl,
caCert,
boundIssuer,
boundAudiences,
boundClaims,
boundSubject,
accessTokenTTL,
accessTokenMaxTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TUpdateOidcAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) {
throw new BadRequestError({ message: "Failed to find identity" });
}
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.OIDC_AUTH) {
throw new BadRequestError({
message: "Failed to update OIDC Auth"
});
}
const identityOidcAuth = await identityOidcAuthDAL.findOne({ identityId });
if (
(accessTokenMaxTTL || identityOidcAuth.accessTokenMaxTTL) > 0 &&
(accessTokenTTL || identityOidcAuth.accessTokenMaxTTL) > (accessTokenMaxTTL || identityOidcAuth.accessTokenMaxTTL)
) {
throw new BadRequestError({ message: "Access token TTL cannot be greater than max TTL" });
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Identity);
const plan = await licenseService.getPlan(identityMembershipOrg.orgId);
const reformattedAccessTokenTrustedIps = accessTokenTrustedIps?.map((accessTokenTrustedIp) => {
if (
!plan.ipAllowlisting &&
accessTokenTrustedIp.ipAddress !== "0.0.0.0/0" &&
accessTokenTrustedIp.ipAddress !== "::/0"
)
throw new BadRequestError({
message:
"Failed to add IP access range to access token due to plan restriction. Upgrade plan to add IP access range."
});
if (!isValidIpOrCidr(accessTokenTrustedIp.ipAddress))
throw new BadRequestError({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
return extractIPDetails(accessTokenTrustedIp.ipAddress);
});
const updateQuery: TIdentityOidcAuthsUpdate = {
oidcDiscoveryUrl,
boundIssuer,
boundAudiences,
boundClaims,
boundSubject,
accessTokenMaxTTL,
accessTokenTTL,
accessTokenNumUsesLimit,
accessTokenTrustedIps: reformattedAccessTokenTrustedIps
? JSON.stringify(reformattedAccessTokenTrustedIps)
: undefined
};
const orgBot = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId });
if (!orgBot) {
throw new BadRequestError({ message: "Org bot not found", name: "OrgBotNotFound" });
}
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
if (caCert !== undefined) {
const { ciphertext: encryptedCACert, iv: caCertIV, tag: caCertTag } = encryptSymmetric(caCert, key);
updateQuery.encryptedCaCert = encryptedCACert;
updateQuery.caCertIV = caCertIV;
updateQuery.caCertTag = caCertTag;
}
const updatedOidcAuth = await identityOidcAuthDAL.updateById(identityOidcAuth.id, updateQuery);
const updatedCACert =
updatedOidcAuth.encryptedCaCert && updatedOidcAuth.caCertIV && updatedOidcAuth.caCertTag
? decryptSymmetric({
ciphertext: updatedOidcAuth.encryptedCaCert,
iv: updatedOidcAuth.caCertIV,
tag: updatedOidcAuth.caCertTag,
key
})
: "";
return {
...updatedOidcAuth,
orgId: identityMembershipOrg.orgId,
caCert: updatedCACert
};
};
const getOidcAuth = async ({ identityId, actorId, actor, actorAuthMethod, actorOrgId }: TGetOidcAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) {
throw new BadRequestError({ message: "Failed to find identity" });
}
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.OIDC_AUTH) {
throw new BadRequestError({
message: "The identity does not have OIDC Auth attached"
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Identity);
const identityOidcAuth = await identityOidcAuthDAL.findOne({ identityId });
const orgBot = await orgBotDAL.findOne({ orgId: identityMembershipOrg.orgId });
if (!orgBot) {
throw new BadRequestError({ message: "Org bot not found", name: "OrgBotNotFound" });
}
const key = infisicalSymmetricDecrypt({
ciphertext: orgBot.encryptedSymmetricKey,
iv: orgBot.symmetricKeyIV,
tag: orgBot.symmetricKeyTag,
keyEncoding: orgBot.symmetricKeyKeyEncoding as SecretKeyEncoding
});
const caCert = decryptSymmetric({
ciphertext: identityOidcAuth.encryptedCaCert,
iv: identityOidcAuth.caCertIV,
tag: identityOidcAuth.caCertTag,
key
});
return { ...identityOidcAuth, orgId: identityMembershipOrg.orgId, caCert };
};
const revokeOidcAuth = async ({ identityId, actorId, actor, actorAuthMethod, actorOrgId }: TRevokeOidcAuthDTO) => {
const identityMembershipOrg = await identityOrgMembershipDAL.findOne({ identityId });
if (!identityMembershipOrg) {
throw new BadRequestError({ message: "Failed to find identity" });
}
if (identityMembershipOrg.identity?.authMethod !== IdentityAuthMethod.OIDC_AUTH) {
throw new BadRequestError({
message: "The identity does not have OIDC auth"
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Identity);
const { permission: rolePermission } = await permissionService.getOrgPermission(
ActorType.IDENTITY,
identityMembershipOrg.identityId,
identityMembershipOrg.orgId,
actorAuthMethod,
actorOrgId
);
const hasPriviledge = isAtLeastAsPrivileged(permission, rolePermission);
if (!hasPriviledge) {
throw new ForbiddenRequestError({
message: "Failed to revoke OIDC auth of identity with more privileged role"
});
}
const revokedIdentityOidcAuth = await identityOidcAuthDAL.transaction(async (tx) => {
const deletedOidcAuth = await identityOidcAuthDAL.delete({ identityId }, tx);
await identityDAL.updateById(identityId, { authMethod: null }, tx);
return { ...deletedOidcAuth?.[0], orgId: identityMembershipOrg.orgId };
});
return revokedIdentityOidcAuth;
};
return {
attachOidcAuth,
updateOidcAuth,
getOidcAuth,
revokeOidcAuth,
login
};
};
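A minimal sketch of exercising the new OIDC login flow from a caller's perspective; the service variable name is illustrative, identityId is the machine identity's id, and oidcJwt is a token issued by the identity's configured provider:

const { accessToken, identityAccessToken } = await identityOidcAuthService.login({
  identityId: "<machine-identity-id>",
  jwt: oidcJwt
});
// accessToken is the signed Infisical access token; when accessTokenMaxTTL is 0 it carries no expiry.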

View File

@ -0,0 +1,42 @@
import { TProjectPermission } from "@app/lib/types";
export type TAttachOidcAuthDTO = {
identityId: string;
oidcDiscoveryUrl: string;
caCert: string;
boundIssuer: string;
boundAudiences: string;
boundClaims: Record<string, string>;
boundSubject: string;
accessTokenTTL: number;
accessTokenMaxTTL: number;
accessTokenNumUsesLimit: number;
accessTokenTrustedIps: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TUpdateOidcAuthDTO = {
identityId: string;
oidcDiscoveryUrl?: string;
caCert?: string;
boundIssuer?: string;
boundAudiences?: string;
boundClaims?: Record<string, string>;
boundSubject?: string;
accessTokenTTL?: number;
accessTokenMaxTTL?: number;
accessTokenNumUsesLimit?: number;
accessTokenTrustedIps?: { ipAddress: string }[];
} & Omit<TProjectPermission, "projectId">;
export type TGetOidcAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;
export type TLoginOidcAuthDTO = {
identityId: string;
jwt: string;
};
export type TRevokeOidcAuthDTO = {
identityId: string;
} & Omit<TProjectPermission, "projectId">;

View File

@ -0,0 +1,25 @@
import { z } from "zod";
export const validateOidcAuthAudiencesField = z
.string()
.trim()
.default("")
.transform((data) => {
if (data === "") return "";
return data
.split(",")
.map((id) => id.trim())
.join(", ");
});
export const validateOidcBoundClaimsField = z.record(z.string()).transform((data) => {
const formattedClaims: Record<string, string> = {};
Object.keys(data).forEach((key) => {
formattedClaims[key] = data[key]
.split(",")
.map((id) => id.trim())
.join(", ");
});
return formattedClaims;
});
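A quick sketch of what these transforms produce (input values are illustrative):

validateOidcAuthAudiencesField.parse(" aud-one,aud-two ,aud-three");
// => "aud-one, aud-two, aud-three"

validateOidcBoundClaimsField.parse({ group: "admins,developers" });
// => { group: "admins, developers" }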

View File

@ -111,6 +111,7 @@ export const identityProjectDALFactory = (db: TDbClient) => {
try {
const docs = await (tx || db.replicaNode())(TableName.IdentityProjectMembership)
.where(`${TableName.IdentityProjectMembership}.projectId`, projectId)
.join(TableName.Project, `${TableName.IdentityProjectMembership}.projectId`, `${TableName.Project}.id`)
.join(TableName.Identity, `${TableName.IdentityProjectMembership}.identityId`, `${TableName.Identity}.id`)
.where((qb) => {
if (filter.identityId) {
@ -149,12 +150,13 @@ export const identityProjectDALFactory = (db: TDbClient) => {
db.ref("isTemporary").withSchema(TableName.IdentityProjectMembershipRole),
db.ref("temporaryRange").withSchema(TableName.IdentityProjectMembershipRole),
db.ref("temporaryAccessStartTime").withSchema(TableName.IdentityProjectMembershipRole),
db.ref("temporaryAccessEndTime").withSchema(TableName.IdentityProjectMembershipRole)
db.ref("temporaryAccessEndTime").withSchema(TableName.IdentityProjectMembershipRole),
db.ref("name").as("projectName").withSchema(TableName.Project)
);
const members = sqlNestRelationships({
data: docs,
parentMapper: ({ identityId, identityName, identityAuthMethod, id, createdAt, updatedAt }) => ({
parentMapper: ({ identityId, identityName, identityAuthMethod, id, createdAt, updatedAt, projectName }) => ({
id,
identityId,
createdAt,
@ -163,6 +165,10 @@ export const identityProjectDALFactory = (db: TDbClient) => {
id: identityId,
name: identityName,
authMethod: identityAuthMethod
},
project: {
id: projectId,
name: projectName
}
}),
key: "id",

View File

@ -574,14 +574,14 @@ export const integrationAuthServiceFactory = ({
const botKey = await projectBotService.getBotKey(integrationAuth.projectId);
const { accessId, accessToken } = await getIntegrationAccessToken(integrationAuth, botKey);
AWS.config.update({
const kms = new AWS.KMS({
region,
credentials: {
accessKeyId: String(accessId),
secretAccessKey: accessToken
}
});
const kms = new AWS.KMS();
const aliases = await kms.listAliases({}).promise();
const keyAliases = aliases.Aliases!.filter((alias) => {

View File

@ -0,0 +1,10 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TInternalKmsDALFactory = ReturnType<typeof internalKmsDALFactory>;
export const internalKmsDALFactory = (db: TDbClient) => {
const internalKmsOrm = ormify(db, TableName.InternalKms);
return internalKmsOrm;
};

View File

@ -1,10 +0,0 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TKmsDALFactory = ReturnType<typeof kmsDALFactory>;
export const kmsDALFactory = (db: TDbClient) => {
const kmsOrm = ormify(db, TableName.KmsKey);
return kmsOrm;
};

View File

@ -0,0 +1,64 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { KmsKeysSchema, TableName } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TKmsKeyDALFactory = ReturnType<typeof kmskeyDALFactory>;
export const kmskeyDALFactory = (db: TDbClient) => {
const kmsOrm = ormify(db, TableName.KmsKey);
const findByIdWithAssociatedKms = async (id: string, tx?: Knex) => {
try {
const result = await (tx || db.replicaNode())(TableName.KmsKey)
.where({ [`${TableName.KmsKey}.id` as "id"]: id })
.leftJoin(TableName.InternalKms, `${TableName.KmsKey}.id`, `${TableName.InternalKms}.kmsKeyId`)
.leftJoin(TableName.ExternalKms, `${TableName.KmsKey}.id`, `${TableName.ExternalKms}.kmsKeyId`)
.first()
.select(selectAllTableCols(TableName.KmsKey))
.select(
db.ref("id").withSchema(TableName.InternalKms).as("internalKmsId"),
db.ref("encryptedKey").withSchema(TableName.InternalKms).as("internalKmsEncryptedKey"),
db.ref("encryptionAlgorithm").withSchema(TableName.InternalKms).as("internalKmsEncryptionAlgorithm"),
db.ref("version").withSchema(TableName.InternalKms).as("internalKmsVersion"),
db.ref("id").withSchema(TableName.InternalKms).as("internalKmsId")
)
.select(
db.ref("id").withSchema(TableName.ExternalKms).as("externalKmsId"),
db.ref("provider").withSchema(TableName.ExternalKms).as("externalKmsProvider"),
db.ref("encryptedProviderInputs").withSchema(TableName.ExternalKms).as("externalKmsEncryptedProviderInput"),
db.ref("status").withSchema(TableName.ExternalKms).as("externalKmsStatus"),
db.ref("statusDetails").withSchema(TableName.ExternalKms).as("externalKmsStatusDetails")
);
const data = {
...KmsKeysSchema.parse(result),
isExternal: Boolean(result?.externalKmsId),
externalKms: result?.externalKmsId
? {
id: result.externalKmsId,
provider: result.externalKmsProvider,
encryptedProviderInput: result.externalKmsEncryptedProviderInput,
status: result.externalKmsStatus,
statusDetails: result.externalKmsStatusDetails
}
: undefined,
internalKms: result?.internalKmsId
? {
id: result.internalKmsId,
encryptedKey: result.internalKmsEncryptedKey,
encryptionAlgorithm: result.internalKmsEncryptionAlgorithm,
version: result.internalKmsVersion
}
: undefined
};
return data;
} catch (error) {
throw new DatabaseError({ error, name: "Find by id" });
}
};
return { ...kmsOrm, findByIdWithAssociatedKms };
};
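The helper above merges the KMS key row with whichever backing configuration exists; a sketch of the shape a caller can expect (the DAL variable name is illustrative):

const kmsKey = await kmsKeyDAL.findByIdWithAssociatedKms(kmsKeyId);
if (kmsKey.isExternal) {
  // kmsKey.externalKms: { id, provider, encryptedProviderInput, status, statusDetails }
} else {
  // kmsKey.internalKms: { id, encryptedKey, encryptionAlgorithm, version }
}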

View File

@ -1,18 +1,34 @@
import slugify from "@sindresorhus/slugify";
import { Knex } from "knex";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { randomSecureBytes } from "@app/lib/crypto";
import { symmetricCipherService, SymmetricEncryption } from "@app/lib/crypto/cipher";
import { BadRequestError } from "@app/lib/errors";
import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TKmsDALFactory } from "./kms-dal";
import { TOrgDALFactory } from "../org/org-dal";
import { TProjectDALFactory } from "../project/project-dal";
import { TInternalKmsDALFactory } from "./internal-kms-dal";
import { TKmsKeyDALFactory } from "./kms-key-dal";
import { TKmsRootConfigDALFactory } from "./kms-root-config-dal";
import { TDecryptWithKmsDTO, TEncryptWithKmsDTO, TGenerateKMSDTO } from "./kms-types";
import {
TDecryptWithKeyDTO,
TDecryptWithKmsDTO,
TEncryptionWithKeyDTO,
TEncryptWithKmsDTO,
TGenerateKMSDTO
} from "./kms-types";
type TKmsServiceFactoryDep = {
kmsDAL: TKmsDALFactory;
kmsDAL: TKmsKeyDALFactory;
projectDAL: Pick<TProjectDALFactory, "findById" | "updateById" | "transaction">;
orgDAL: Pick<TOrgDALFactory, "findById" | "updateById" | "transaction">;
kmsRootConfigDAL: Pick<TKmsRootConfigDALFactory, "findById" | "create">;
keyStore: Pick<TKeyStoreFactory, "acquireLock" | "waitTillReady" | "setItemWithExpiry">;
internalKmsDAL: Pick<TInternalKmsDALFactory, "create">;
};
export type TKmsServiceFactory = ReturnType<typeof kmsServiceFactory>;
@ -25,54 +41,158 @@ const KMS_ROOT_CREATION_WAIT_TIME = 10;
// akhilmhdh: Don't edit this value. This is measured for blob concatenation in kms
const KMS_VERSION = "v01";
const KMS_VERSION_BLOB_LENGTH = 3;
export const kmsServiceFactory = ({ kmsDAL, kmsRootConfigDAL, keyStore }: TKmsServiceFactoryDep) => {
export const kmsServiceFactory = ({
kmsDAL,
kmsRootConfigDAL,
keyStore,
internalKmsDAL,
orgDAL,
projectDAL
}: TKmsServiceFactoryDep) => {
let ROOT_ENCRYPTION_KEY = Buffer.alloc(0);
// this is used for symmetric encryption
const generateKmsKey = async ({ scopeId, scopeType, isReserved = true, tx }: TGenerateKMSDTO) => {
const generateKmsKey = async ({ orgId, isReserved = true, tx, slug }: TGenerateKMSDTO) => {
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const kmsKeyMaterial = randomSecureBytes(32);
const encryptedKeyMaterial = cipher.encrypt(kmsKeyMaterial, ROOT_ENCRYPTION_KEY);
const sanitizedSlug = slug ? slugify(slug) : slugify(alphaNumericNanoId(32));
const dbQuery = async (db: Knex) => {
const kmsDoc = await kmsDAL.create({
slug: sanitizedSlug,
orgId,
isReserved
});
const { encryptedKey, ...doc } = await kmsDAL.create(
{
version: 1,
encryptedKey: encryptedKeyMaterial,
encryptionAlgorithm: SymmetricEncryption.AES_GCM_256,
isReserved,
orgId: scopeType === "org" ? scopeId : undefined,
projectId: scopeType === "project" ? scopeId : undefined
},
tx
);
const { encryptedKey, ...doc } = await internalKmsDAL.create(
{
version: 1,
encryptedKey: encryptedKeyMaterial,
encryptionAlgorithm: SymmetricEncryption.AES_GCM_256,
kmsKeyId: kmsDoc.id
},
db
);
return doc;
};
if (tx) return dbQuery(tx);
const doc = await kmsDAL.transaction(async (tx2) => dbQuery(tx2));
return doc;
};
const encrypt = async ({ kmsId, plainText }: TEncryptWithKmsDTO) => {
const kmsDoc = await kmsDAL.findById(kmsId);
const encryptWithKmsKey = async ({ kmsId }: Omit<TEncryptWithKmsDTO, "plainText">) => {
const kmsDoc = await kmsDAL.findByIdWithAssociatedKms(kmsId);
if (!kmsDoc) throw new BadRequestError({ message: "KMS ID not found" });
// akhilmhdh: as more encryption are added do a check here on kmsDoc.encryptionAlgorithm
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
return ({ plainText }: Pick<TEncryptWithKmsDTO, "plainText">) => {
const kmsKey = cipher.decrypt(kmsDoc.internalKms?.encryptedKey as Buffer, ROOT_ENCRYPTION_KEY);
const encryptedPlainTextBlob = cipher.encrypt(plainText, kmsKey);
const kmsKey = cipher.decrypt(kmsDoc.encryptedKey, ROOT_ENCRYPTION_KEY);
const encryptedPlainTextBlob = cipher.encrypt(plainText, kmsKey);
// Buffer#1 encrypted text + Buffer#2 version number
const versionBlob = Buffer.from(KMS_VERSION, "utf8"); // length is 3
const cipherTextBlob = Buffer.concat([encryptedPlainTextBlob, versionBlob]);
return { cipherTextBlob };
// Buffer#1 encrypted text + Buffer#2 version number
const versionBlob = Buffer.from(KMS_VERSION, "utf8"); // length is 3
const cipherTextBlob = Buffer.concat([encryptedPlainTextBlob, versionBlob]);
return { cipherTextBlob };
};
};
const decrypt = async ({ cipherTextBlob: versionedCipherTextBlob, kmsId }: TDecryptWithKmsDTO) => {
const kmsDoc = await kmsDAL.findById(kmsId);
if (!kmsDoc) throw new BadRequestError({ message: "KMS ID not found" });
const encryptWithInputKey = async ({ key }: Omit<TEncryptionWithKeyDTO, "plainText">) => {
// akhilmhdh: as more encryption are added do a check here on kmsDoc.encryptionAlgorithm
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const kmsKey = cipher.decrypt(kmsDoc.encryptedKey, ROOT_ENCRYPTION_KEY);
return ({ plainText }: Pick<TEncryptWithKmsDTO, "plainText">) => {
const encryptedPlainTextBlob = cipher.encrypt(plainText, key);
// Buffer#1 encrypted text + Buffer#2 version number
const versionBlob = Buffer.from(KMS_VERSION, "utf8"); // length is 3
const cipherTextBlob = Buffer.concat([encryptedPlainTextBlob, versionBlob]);
return { cipherTextBlob };
};
};
const cipherTextBlob = versionedCipherTextBlob.subarray(0, -KMS_VERSION_BLOB_LENGTH);
const decryptedBlob = cipher.decrypt(cipherTextBlob, kmsKey);
return decryptedBlob;
const decryptWithKmsKey = async ({ kmsId }: Omit<TDecryptWithKmsDTO, "cipherTextBlob">) => {
const kmsDoc = await kmsDAL.findByIdWithAssociatedKms(kmsId);
if (!kmsDoc) throw new BadRequestError({ message: "KMS ID not found" });
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const kmsKey = cipher.decrypt(kmsDoc.internalKms?.encryptedKey as Buffer, ROOT_ENCRYPTION_KEY);
return ({ cipherTextBlob: versionedCipherTextBlob }: Pick<TDecryptWithKmsDTO, "cipherTextBlob">) => {
const cipherTextBlob = versionedCipherTextBlob.subarray(0, -KMS_VERSION_BLOB_LENGTH);
const decryptedBlob = cipher.decrypt(cipherTextBlob, kmsKey);
return decryptedBlob;
};
};
const decryptWithInputKey = async ({ key }: Omit<TDecryptWithKeyDTO, "cipherTextBlob">) => {
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
return ({ cipherTextBlob: versionedCipherTextBlob }: Pick<TDecryptWithKeyDTO, "cipherTextBlob">) => {
const cipherTextBlob = versionedCipherTextBlob.subarray(0, -KMS_VERSION_BLOB_LENGTH);
const decryptedBlob = cipher.decrypt(cipherTextBlob, key);
return decryptedBlob;
};
};
const getOrgKmsKeyId = async (orgId: string) => {
const keyId = await orgDAL.transaction(async (tx) => {
const org = await orgDAL.findById(orgId, tx);
if (!org) {
throw new BadRequestError({ message: "Org not found" });
}
if (!org.kmsDefaultKeyId) {
// create default kms key for certificate service
const key = await generateKmsKey({
isReserved: true,
orgId: org.id,
tx
});
await orgDAL.updateById(
org.id,
{
kmsDefaultKeyId: key.id
},
tx
);
return key.id;
}
return org.kmsDefaultKeyId;
});
return keyId;
};
const getProjectSecretManagerKmsKeyId = async (projectId: string) => {
const keyId = await projectDAL.transaction(async (tx) => {
const project = await projectDAL.findById(projectId, tx);
if (!project) {
throw new BadRequestError({ message: "Project not found" });
}
if (!project.kmsSecretManagerKeyId) {
// create default kms key for certificate service
const key = await generateKmsKey({
isReserved: true,
orgId: project.orgId,
tx
});
await projectDAL.updateById(
projectId,
{
kmsSecretManagerKeyId: key.id
},
tx
);
return key.id;
}
return project.kmsSecretManagerKeyId;
});
return keyId;
};
const startService = async () => {
@ -123,7 +243,11 @@ export const kmsServiceFactory = ({ kmsDAL, kmsRootConfigDAL, keyStore }: TKmsSe
return {
startService,
generateKmsKey,
encrypt,
decrypt
encryptWithKmsKey,
encryptWithInputKey,
decryptWithKmsKey,
decryptWithInputKey,
getOrgKmsKeyId,
getProjectSecretManagerKmsKeyId
};
};
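The refactor above replaces the one-shot encrypt/decrypt calls with factories that resolve the KMS key once and return reusable functions; a minimal usage sketch, assuming kmsService and kmsKeyId are already in scope:

// Resolve the key material once, then encrypt/decrypt any number of payloads.
const kmsEncryptor = await kmsService.encryptWithKmsKey({ kmsId: kmsKeyId });
const { cipherTextBlob } = kmsEncryptor({ plainText: Buffer.from("secret material") });

const kmsDecryptor = await kmsService.decryptWithKmsKey({ kmsId: kmsKeyId });
const plainText = kmsDecryptor({ cipherTextBlob }); // returns the original Buffer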

View File

@ -1,9 +1,9 @@
import { Knex } from "knex";
export type TGenerateKMSDTO = {
scopeType: "project" | "org";
scopeId: string;
orgId: string;
isReserved?: boolean;
slug?: string;
tx?: Knex;
};
@ -12,7 +12,17 @@ export type TEncryptWithKmsDTO = {
plainText: Buffer;
};
export type TEncryptionWithKeyDTO = {
key: Buffer;
plainText: Buffer;
};
export type TDecryptWithKmsDTO = {
kmsId: string;
cipherTextBlob: Buffer;
};
export type TDecryptWithKeyDTO = {
key: Buffer;
cipherTextBlob: Buffer;
};

View File

@ -207,9 +207,9 @@ export const orgDALFactory = (db: TDbClient) => {
}
};
const updateById = async (orgId: string, data: Partial<TOrganizations>) => {
const updateById = async (orgId: string, data: Partial<TOrganizations>, tx?: Knex) => {
try {
const [org] = await db(TableName.Organization)
const [org] = await (tx || db)(TableName.Organization)
.where({ id: orgId })
.update({ ...data })
.returning("*");

View File

@ -144,10 +144,7 @@ export const orgServiceFactory = ({
return members;
};
const findAllWorkspaces = async ({ actor, actorId, actorOrgId, actorAuthMethod, orgId }: TFindAllWorkspacesDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Workspace);
const findAllWorkspaces = async ({ actor, actorId, orgId }: TFindAllWorkspacesDTO) => {
const organizationWorkspaceIds = new Set((await projectDAL.find({ orgId })).map((workspace) => workspace.id));
let workspaces: (TProjects & { organization: string } & {

View File

@ -71,9 +71,8 @@ export const getProjectKmsCertificateKeyId = async ({
if (!project.kmsCertificateKeyId) {
// create default kms key for certificate service
const key = await kmsService.generateKmsKey({
scopeId: projectId,
scopeType: "project",
isReserved: true,
orgId: project.orgId,
tx
});

View File

@ -322,7 +322,7 @@ export const secretFolderDALFactory = (db: TDbClient) => {
.first();
if (folder) {
const { envId, envName, envSlug, ...el } = folder;
return { ...el, environment: { envId, envName, envSlug } };
return { ...el, environment: { envId, envName, envSlug }, envId };
}
} catch (error) {
throw new DatabaseError({ error, name: "Find by id" });

View File

@ -6,7 +6,7 @@ import { TSecretFoldersInsert } from "@app/db/schemas";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { TSecretSnapshotServiceFactory } from "@app/ee/services/secret-snapshot/secret-snapshot-service";
import { BadRequestError } from "@app/lib/errors";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { TProjectDALFactory } from "../project/project-dal";
import { TProjectEnvDALFactory } from "../project-env/project-env-dal";
@ -14,6 +14,7 @@ import { TSecretFolderDALFactory } from "./secret-folder-dal";
import {
TCreateFolderDTO,
TDeleteFolderDTO,
TGetFolderByIdDTO,
TGetFolderDTO,
TUpdateFolderDTO,
TUpdateManyFoldersDTO
@ -368,11 +369,22 @@ export const secretFolderServiceFactory = ({
return folders;
};
const getFolderById = async ({ actor, actorId, actorOrgId, actorAuthMethod, id }: TGetFolderByIdDTO) => {
const folder = await folderDAL.findById(id);
if (!folder) throw new NotFoundError({ message: "folder not found" });
// folder list is allowed to be read by anyone
// permission check to verify the user has access
await permissionService.getProjectPermission(actor, actorId, folder.projectId, actorAuthMethod, actorOrgId);
return folder;
};
return {
createFolder,
updateFolder,
updateManyFolders,
deleteFolder,
getFolders
getFolders,
getFolderById
};
};

View File

@ -37,3 +37,7 @@ export type TGetFolderDTO = {
environment: string;
path: string;
} & TProjectPermission;
export type TGetFolderByIdDTO = {
id: string;
} & Omit<TProjectPermission, "projectId">;

View File

@ -1,9 +1,9 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { TableName, TSecretSharing } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify } from "@app/lib/knex";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TSecretSharingDALFactory = ReturnType<typeof secretSharingDALFactory>;
@ -13,15 +13,58 @@ export const secretSharingDALFactory = (db: TDbClient) => {
const pruneExpiredSharedSecrets = async (tx?: Knex) => {
try {
const today = new Date();
const docs = await (tx || db)(TableName.SecretSharing).where("expiresAt", "<", today).del();
const docs = await (tx || db)(TableName.SecretSharing)
.where("expiresAt", "<", today)
.andWhere("encryptedValue", "<>", "")
.update({
encryptedValue: "",
tag: "",
iv: "",
hashedHex: ""
});
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "pruneExpiredSharedSecrets" });
}
};
const findActiveSharedSecrets = async (filters: Partial<TSecretSharing>, tx?: Knex) => {
try {
const now = new Date();
return await (tx || db)(TableName.SecretSharing)
.where(filters)
.andWhere("expiresAt", ">", now)
.andWhere("encryptedValue", "<>", "")
.select(selectAllTableCols(TableName.SecretSharing))
.orderBy("expiresAt", "asc");
} catch (error) {
throw new DatabaseError({
error,
name: "Find Active Shared Secrets"
});
}
};
const softDeleteById = async (id: string) => {
try {
await sharedSecretOrm.updateById(id, {
encryptedValue: "",
iv: "",
tag: "",
hashedHex: ""
});
} catch (error) {
throw new DatabaseError({
error,
name: "Soft Delete Shared Secret"
});
}
};
return {
...sharedSecretOrm,
pruneExpiredSharedSecrets
pruneExpiredSharedSecrets,
softDeleteById,
findActiveSharedSecrets
};
};
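Because expiry now blanks the payload instead of deleting the row, callers can tell an expired shared secret apart from one that never existed; this is also why findActiveSharedSecrets filters on encryptedValue <> '' in addition to the expiry date. A rough sketch of that distinction, with isSoftDeleted being a hypothetical helper name:

// sketch: a soft-deleted shared secret keeps its row, but its payload fields are blanked
const isSoftDeleted = (secret: TSecretSharing) =>
  secret.encryptedValue === "" && secret.iv === "" && secret.tag === "";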

View File

@ -101,7 +101,7 @@ export const secretSharingServiceFactory = ({
const { actor, actorId, orgId, actorAuthMethod, actorOrgId } = getSharedSecretsInput;
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
if (!permission) throw new UnauthorizedError({ name: "User not in org" });
const userSharedSecrets = await secretSharingDAL.find({ userId: actorId, orgId }, { sort: [["expiresAt", "asc"]] });
const userSharedSecrets = await secretSharingDAL.findActiveSharedSecrets({ userId: actorId, orgId });
return userSharedSecrets;
};
@ -113,7 +113,7 @@ export const secretSharingServiceFactory = ({
}
if (sharedSecret.expiresAfterViews != null && sharedSecret.expiresAfterViews >= 0) {
if (sharedSecret.expiresAfterViews === 0) {
await secretSharingDAL.deleteById(sharedSecretId);
await secretSharingDAL.softDeleteById(sharedSecretId);
return;
}
await secretSharingDAL.updateById(sharedSecretId, { $decr: { expiresAfterViews: 1 } });

View File

@ -642,7 +642,7 @@ export const secretQueueFactory = ({
});
queueService.start(QueueName.SecretWebhook, async (job) => {
await fnTriggerWebhook({ ...job.data, projectEnvDAL, webhookDAL });
await fnTriggerWebhook({ ...job.data, projectEnvDAL, webhookDAL, projectDAL });
});
return {

View File

@ -11,6 +11,9 @@ import {
} from "@app/db/schemas";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { TSecretApprovalPolicyServiceFactory } from "@app/ee/services/secret-approval-policy/secret-approval-policy-service";
import { TSecretApprovalRequestDALFactory } from "@app/ee/services/secret-approval-request/secret-approval-request-dal";
import { TSecretApprovalRequestSecretDALFactory } from "@app/ee/services/secret-approval-request/secret-approval-request-secret-dal";
import { TSecretSnapshotServiceFactory } from "@app/ee/services/secret-snapshot/secret-snapshot-service";
import { getConfig } from "@app/lib/config/env";
import {
@ -18,9 +21,10 @@ import {
decryptSymmetric128BitHexKeyUTF8,
encryptSymmetric128BitHexKeyUTF8
} from "@app/lib/crypto";
import { BadRequestError } from "@app/lib/errors";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { groupBy, pick } from "@app/lib/fn";
import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { ActorType } from "../auth/auth-type";
import { TProjectDALFactory } from "../project/project-dal";
@ -44,6 +48,7 @@ import {
} from "./secret-fns";
import { TSecretQueueFactory } from "./secret-queue";
import {
SecretOperations,
TAttachSecretTagsDTO,
TBackFillSecretReferencesDTO,
TCreateBulkSecretDTO,
@ -59,6 +64,7 @@ import {
TGetSecretsDTO,
TGetSecretsRawDTO,
TGetSecretVersionsDTO,
TMoveSecretsDTO,
TUpdateBulkSecretDTO,
TUpdateManySecretRawDTO,
TUpdateSecretDTO,
@ -84,6 +90,12 @@ type TSecretServiceFactoryDep = {
projectBotService: Pick<TProjectBotServiceFactory, "getBotKey">;
secretImportDAL: Pick<TSecretImportDALFactory, "find" | "findByFolderIds">;
secretVersionTagDAL: Pick<TSecretVersionTagDALFactory, "insertMany">;
secretApprovalPolicyService: Pick<TSecretApprovalPolicyServiceFactory, "getSecretApprovalPolicy">;
secretApprovalRequestDAL: Pick<TSecretApprovalRequestDALFactory, "create" | "transaction">;
secretApprovalRequestSecretDAL: Pick<
TSecretApprovalRequestSecretDALFactory,
"insertMany" | "insertApprovalSecretTags"
>;
};
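A short sketch of how the wiring site would change to satisfy the expanded dependency type; existingDeps and the registration location are assumptions for illustration:

// sketch: the secret service now also needs the approval policy service and approval request DALs
const secretService = secretServiceFactory({
  ...existingDeps, // hypothetical: the dependencies that were already wired in
  secretApprovalPolicyService,
  secretApprovalRequestDAL,
  secretApprovalRequestSecretDAL
});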
export type TSecretServiceFactory = ReturnType<typeof secretServiceFactory>;
@ -100,7 +112,10 @@ export const secretServiceFactory = ({
projectDAL,
projectBotService,
secretImportDAL,
secretVersionTagDAL
secretVersionTagDAL,
secretApprovalPolicyService,
secretApprovalRequestDAL,
secretApprovalRequestSecretDAL
}: TSecretServiceFactoryDep) => {
const getSecretReference = async (projectId: string) => {
// if the bot key is missing, the project still uses end-to-end encryption
@ -1683,6 +1698,393 @@ export const secretServiceFactory = ({
return { message: "Successfully backfilled secret references" };
};
const moveSecrets = async ({
sourceEnvironment,
sourceSecretPath,
destinationEnvironment,
destinationSecretPath,
secretIds,
projectSlug,
shouldOverwrite,
actor,
actorId,
actorAuthMethod,
actorOrgId
}: TMoveSecretsDTO) => {
const project = await projectDAL.findProjectBySlug(projectSlug, actorOrgId);
if (!project) {
throw new NotFoundError({
message: "Project not found."
});
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
project.id,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Delete,
subject(ProjectPermissionSub.Secrets, { environment: sourceEnvironment, secretPath: sourceSecretPath })
);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Create,
subject(ProjectPermissionSub.Secrets, { environment: destinationEnvironment, secretPath: destinationSecretPath })
);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Edit,
subject(ProjectPermissionSub.Secrets, { environment: destinationEnvironment, secretPath: destinationSecretPath })
);
const botKey = await projectBotService.getBotKey(project.id);
if (!botKey) {
throw new BadRequestError({ message: "Project bot not found", name: "bot_not_found_error" });
}
const sourceFolder = await folderDAL.findBySecretPath(project.id, sourceEnvironment, sourceSecretPath);
if (!sourceFolder) {
throw new NotFoundError({
message: "Source path does not exist."
});
}
const destinationFolder = await folderDAL.findBySecretPath(
project.id,
destinationEnvironment,
destinationSecretPath
);
if (!destinationFolder) {
throw new NotFoundError({
message: "Destination path does not exist."
});
}
const sourceSecrets = await secretDAL.find({
type: SecretType.Shared,
$in: {
id: secretIds
}
});
if (sourceSecrets.length !== secretIds.length) {
throw new BadRequestError({
message: "Invalid secrets"
});
}
const decryptedSourceSecrets = sourceSecrets.map((secret) => ({
...secret,
secretKey: decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretKeyCiphertext,
iv: secret.secretKeyIV,
tag: secret.secretKeyTag,
key: botKey
}),
secretValue: decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretValueCiphertext,
iv: secret.secretValueIV,
tag: secret.secretValueTag,
key: botKey
})
}));
let isSourceUpdated = false;
let isDestinationUpdated = false;
// Moving secrets is a two-step process.
await secretDAL.transaction(async (tx) => {
// The first step is to create or update the secrets in the destination:
const destinationSecretsFromDB = await secretDAL.find(
{
folderId: destinationFolder.id
},
{ tx }
);
const decryptedDestinationSecrets = destinationSecretsFromDB.map((secret) => {
return {
...secret,
secretKey: decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretKeyCiphertext,
iv: secret.secretKeyIV,
tag: secret.secretKeyTag,
key: botKey
}),
secretValue: decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretValueCiphertext,
iv: secret.secretValueIV,
tag: secret.secretValueTag,
key: botKey
})
};
});
const destinationSecretsGroupedByBlindIndex = groupBy(
decryptedDestinationSecrets.filter(({ secretBlindIndex }) => Boolean(secretBlindIndex)),
(i) => i.secretBlindIndex as string
);
const locallyCreatedSecrets = decryptedSourceSecrets
.filter(({ secretBlindIndex }) => !destinationSecretsGroupedByBlindIndex[secretBlindIndex as string]?.[0])
.map((el) => ({ ...el, operation: SecretOperations.Create }));
const locallyUpdatedSecrets = decryptedSourceSecrets
.filter(
({ secretBlindIndex, secretKey, secretValue }) =>
destinationSecretsGroupedByBlindIndex[secretBlindIndex as string]?.[0] &&
// if key or value changed
(destinationSecretsGroupedByBlindIndex[secretBlindIndex as string]?.[0]?.secretKey !== secretKey ||
destinationSecretsGroupedByBlindIndex[secretBlindIndex as string]?.[0]?.secretValue !== secretValue)
)
.map((el) => ({ ...el, operation: SecretOperations.Update }));
if (locallyUpdatedSecrets.length > 0 && !shouldOverwrite) {
const existingKeys = locallyUpdatedSecrets.map((s) => s.secretKey);
throw new BadRequestError({
message: `Failed to move secrets. The following secrets already exist in the destination: ${existingKeys.join(
","
)}`
});
}
const isEmpty = locallyCreatedSecrets.length + locallyUpdatedSecrets.length === 0;
if (isEmpty) {
throw new BadRequestError({
message: "Selected secrets already exist in the destination."
});
}
const destinationFolderPolicy = await secretApprovalPolicyService.getSecretApprovalPolicy(
project.id,
destinationFolder.environment.slug,
destinationFolder.path
);
if (destinationFolderPolicy && actor === ActorType.USER) {
// if a secret approval policy exists for the destination, we create a secret approval request
const localSecretsIds = decryptedDestinationSecrets.map(({ id }) => id);
const latestSecretVersions = await secretVersionDAL.findLatestVersionMany(
destinationFolder.id,
localSecretsIds,
tx
);
const approvalRequestDoc = await secretApprovalRequestDAL.create(
{
folderId: destinationFolder.id,
slug: alphaNumericNanoId(),
policyId: destinationFolderPolicy.id,
status: "open",
hasMerged: false,
committerUserId: actorId
},
tx
);
const commits = locallyCreatedSecrets.concat(locallyUpdatedSecrets).map((doc) => {
const { operation } = doc;
const localSecret = destinationSecretsGroupedByBlindIndex[doc.secretBlindIndex as string]?.[0];
return {
op: operation,
keyEncoding: doc.keyEncoding,
algorithm: doc.algorithm,
requestId: approvalRequestDoc.id,
metadata: doc.metadata,
secretKeyIV: doc.secretKeyIV,
secretKeyTag: doc.secretKeyTag,
secretKeyCiphertext: doc.secretKeyCiphertext,
secretValueIV: doc.secretValueIV,
secretValueTag: doc.secretValueTag,
secretValueCiphertext: doc.secretValueCiphertext,
secretBlindIndex: doc.secretBlindIndex,
secretCommentIV: doc.secretCommentIV,
secretCommentTag: doc.secretCommentTag,
secretCommentCiphertext: doc.secretCommentCiphertext,
skipMultilineEncoding: doc.skipMultilineEncoding,
// operations other than create also need the secret id and version id
...(operation !== SecretOperations.Create
? { secretId: localSecret.id, secretVersion: latestSecretVersions[localSecret.id].id }
: {})
};
});
await secretApprovalRequestSecretDAL.insertMany(commits, tx);
} else {
// apply changes directly
if (locallyCreatedSecrets.length) {
await fnSecretBulkInsert({
folderId: destinationFolder.id,
secretVersionDAL,
secretDAL,
tx,
secretTagDAL,
secretVersionTagDAL,
inputSecrets: locallyCreatedSecrets.map((doc) => {
return {
keyEncoding: doc.keyEncoding,
algorithm: doc.algorithm,
type: doc.type,
metadata: doc.metadata,
secretKeyIV: doc.secretKeyIV,
secretKeyTag: doc.secretKeyTag,
secretKeyCiphertext: doc.secretKeyCiphertext,
secretValueIV: doc.secretValueIV,
secretValueTag: doc.secretValueTag,
secretValueCiphertext: doc.secretValueCiphertext,
secretBlindIndex: doc.secretBlindIndex,
secretCommentIV: doc.secretCommentIV,
secretCommentTag: doc.secretCommentTag,
secretCommentCiphertext: doc.secretCommentCiphertext,
skipMultilineEncoding: doc.skipMultilineEncoding
};
})
});
}
if (locallyUpdatedSecrets.length) {
await fnSecretBulkUpdate({
projectId: project.id,
folderId: destinationFolder.id,
secretVersionDAL,
secretDAL,
tx,
secretTagDAL,
secretVersionTagDAL,
inputSecrets: locallyUpdatedSecrets.map((doc) => {
return {
filter: {
folderId: destinationFolder.id,
id: destinationSecretsGroupedByBlindIndex[doc.secretBlindIndex as string][0].id
},
data: {
keyEncoding: doc.keyEncoding,
algorithm: doc.algorithm,
type: doc.type,
metadata: doc.metadata,
secretKeyIV: doc.secretKeyIV,
secretKeyTag: doc.secretKeyTag,
secretKeyCiphertext: doc.secretKeyCiphertext,
secretValueIV: doc.secretValueIV,
secretValueTag: doc.secretValueTag,
secretValueCiphertext: doc.secretValueCiphertext,
secretBlindIndex: doc.secretBlindIndex,
secretCommentIV: doc.secretCommentIV,
secretCommentTag: doc.secretCommentTag,
secretCommentCiphertext: doc.secretCommentCiphertext,
skipMultilineEncoding: doc.skipMultilineEncoding
}
};
})
});
}
isDestinationUpdated = true;
}
// The next step is to delete the secrets from the source folder:
const sourceSecretsGroupByBlindIndex = groupBy(sourceSecrets, (i) => i.secretBlindIndex as string);
const locallyDeletedSecrets = decryptedSourceSecrets.map((el) => ({ ...el, operation: SecretOperations.Delete }));
const sourceFolderPolicy = await secretApprovalPolicyService.getSecretApprovalPolicy(
project.id,
sourceFolder.environment.slug,
sourceFolder.path
);
if (sourceFolderPolicy && actor === ActorType.USER) {
// if a secret approval policy exists for the source, we create a secret approval request
const localSecretsIds = decryptedSourceSecrets.map(({ id }) => id);
const latestSecretVersions = await secretVersionDAL.findLatestVersionMany(sourceFolder.id, localSecretsIds, tx);
const approvalRequestDoc = await secretApprovalRequestDAL.create(
{
folderId: sourceFolder.id,
slug: alphaNumericNanoId(),
policyId: sourceFolderPolicy.id,
status: "open",
hasMerged: false,
committerUserId: actorId
},
tx
);
const commits = locallyDeletedSecrets.map((doc) => {
const { operation } = doc;
const localSecret = sourceSecretsGroupByBlindIndex[doc.secretBlindIndex as string]?.[0];
return {
op: operation,
keyEncoding: doc.keyEncoding,
algorithm: doc.algorithm,
requestId: approvalRequestDoc.id,
metadata: doc.metadata,
secretKeyIV: doc.secretKeyIV,
secretKeyTag: doc.secretKeyTag,
secretKeyCiphertext: doc.secretKeyCiphertext,
secretValueIV: doc.secretValueIV,
secretValueTag: doc.secretValueTag,
secretValueCiphertext: doc.secretValueCiphertext,
secretBlindIndex: doc.secretBlindIndex,
secretCommentIV: doc.secretCommentIV,
secretCommentTag: doc.secretCommentTag,
secretCommentCiphertext: doc.secretCommentCiphertext,
skipMultilineEncoding: doc.skipMultilineEncoding,
secretId: localSecret.id,
secretVersion: latestSecretVersions[localSecret.id].id
};
});
await secretApprovalRequestSecretDAL.insertMany(commits, tx);
} else {
// if no secret approval policy is present, we delete directly.
await secretDAL.delete(
{
$in: {
id: locallyDeletedSecrets.map(({ id }) => id)
},
folderId: sourceFolder.id
},
tx
);
isSourceUpdated = true;
}
});
if (isDestinationUpdated) {
await snapshotService.performSnapshot(destinationFolder.id);
await secretQueueService.syncSecrets({
projectId: project.id,
secretPath: destinationFolder.path,
environmentSlug: destinationFolder.environment.slug,
actorId,
actor
});
}
if (isSourceUpdated) {
await snapshotService.performSnapshot(sourceFolder.id);
await secretQueueService.syncSecrets({
projectId: project.id,
secretPath: sourceFolder.path,
environmentSlug: sourceFolder.environment.slug,
actorId,
actor
});
}
return {
projectId: project.id,
isSourceUpdated,
isDestinationUpdated
};
};
return {
attachTags,
detachTags,
@ -1703,6 +2105,7 @@ export const secretServiceFactory = ({
updateManySecretsRaw,
deleteManySecretsRaw,
getSecretVersions,
backfillSecretReferences
backfillSecretReferences,
moveSecrets
};
};
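Taken as a whole, moveSecrets either applies the move directly or routes each side through a secret approval request, then snapshots and syncs whichever folders actually changed. A minimal sketch of invoking it, with every identifier value a placeholder rather than real data:

// sketch: move selected shared secrets from dev:/ to staging:/app without overwriting
const result = await secretService.moveSecrets({
  projectSlug: "my-project", // placeholder slug
  sourceEnvironment: "dev",
  sourceSecretPath: "/",
  destinationEnvironment: "staging",
  destinationSecretPath: "/app",
  secretIds: ["<secret-id>"], // ids of the shared secrets to move
  shouldOverwrite: false, // reject the move if a differing secret already exists at the destination
  actor,
  actorId,
  actorAuthMethod,
  actorOrgId
});
// result.isSourceUpdated and result.isDestinationUpdated stay false when the corresponding
// side was routed into a secret approval request instead of being applied directly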

View File

@ -397,3 +397,13 @@ export type TSyncSecretsDTO<T extends boolean = false> = {
// used for import creation to trigger replication
pickOnlyImportIds?: string[];
});
export type TMoveSecretsDTO = {
projectSlug: string;
sourceEnvironment: string;
sourceSecretPath: string;
destinationEnvironment: string;
destinationSecretPath: string;
secretIds: string[];
shouldOverwrite: boolean;
} & Omit<TProjectPermission, "projectId">;

View File

@ -1,6 +1,7 @@
import bcrypt from "bcrypt";
import { TSuperAdmin, TSuperAdminUpdate } from "@app/db/schemas";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
@ -20,6 +21,7 @@ type TSuperAdminServiceFactoryDep = {
authService: Pick<TAuthLoginFactory, "generateUserTokens">;
orgService: Pick<TOrgServiceFactory, "createOrganization">;
keyStore: Pick<TKeyStoreFactory, "getItem" | "setItemWithExpiry" | "deleteItem">;
licenseService: Pick<TLicenseServiceFactory, "onPremFeatures">;
};
export type TSuperAdminServiceFactory = ReturnType<typeof superAdminServiceFactory>;
@ -36,7 +38,8 @@ export const superAdminServiceFactory = ({
userDAL,
authService,
orgService,
keyStore
keyStore,
licenseService
}: TSuperAdminServiceFactoryDep) => {
const initServerCfg = async () => {
// TODO(akhilmhdh): bad pattern kept due to time constraints; revisit this later (self-assigned)
@ -219,6 +222,12 @@ export const superAdminServiceFactory = ({
};
const deleteUser = async (userId: string) => {
if (!licenseService.onPremFeatures?.instanceUserManagement) {
throw new BadRequestError({
message: "Failed to delete user due to plan restriction. Upgrade to Infisical's Pro plan."
});
}
const user = await userDAL.deleteById(userId);
return user;
};

View File

@ -9,6 +9,7 @@ import { infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { BadRequestError } from "@app/lib/errors";
import { logger } from "@app/lib/logger";
import { TProjectDALFactory } from "../project/project-dal";
import { TProjectEnvDALFactory } from "../project-env/project-env-dal";
import { TWebhookDALFactory } from "./webhook-dal";
import { WebhookType } from "./webhook-types";
@ -66,11 +67,16 @@ export const triggerWebhookRequest = async (webhook: TWebhooks, data: Record<str
export const getWebhookPayload = (
eventName: string,
workspaceId: string,
environment: string,
secretPath?: string,
type?: string | null
details: {
workspaceName: string;
workspaceId: string;
environment: string;
secretPath?: string;
type?: string | null;
}
) => {
const { workspaceName, workspaceId, environment, secretPath, type } = details;
switch (type) {
case WebhookType.SLACK:
return {
@ -80,8 +86,8 @@ export const getWebhookPayload = (
color: "#E7F256",
fields: [
{
title: "Workspace ID",
value: workspaceId,
title: "Project",
value: workspaceName,
short: false
},
{
@ -117,7 +123,9 @@ export type TFnTriggerWebhookDTO = {
environment: string;
webhookDAL: Pick<TWebhookDALFactory, "findAllWebhooks" | "transaction" | "update" | "bulkUpdate">;
projectEnvDAL: Pick<TProjectEnvDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "findById">;
};
// this is a reusable function
// used by the secret queue to trigger webhooks and update their status when secrets change
export const fnTriggerWebhook = async ({
@ -125,7 +133,8 @@ export const fnTriggerWebhook = async ({
secretPath,
projectId,
webhookDAL,
projectEnvDAL
projectEnvDAL,
projectDAL
}: TFnTriggerWebhookDTO) => {
const webhooks = await webhookDAL.findAllWebhooks(projectId, environment);
const toBeTriggeredHooks = webhooks.filter(
@ -134,9 +143,19 @@ export const fnTriggerWebhook = async ({
);
if (!toBeTriggeredHooks.length) return;
logger.info("Secret webhook job started", { environment, secretPath, projectId });
const project = await projectDAL.findById(projectId);
const webhooksTriggered = await Promise.allSettled(
toBeTriggeredHooks.map((hook) =>
triggerWebhookRequest(hook, getWebhookPayload("secrets.modified", projectId, environment, secretPath, hook.type))
triggerWebhookRequest(
hook,
getWebhookPayload("secrets.modified", {
workspaceName: project.name,
workspaceId: projectId,
environment,
secretPath,
type: hook.type
})
)
)
);
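With the new details object, the Slack variant of the payload now surfaces the project name rather than the raw workspace id. A hedged sketch of building that payload outside the queue, with placeholder values:

// sketch: the Slack fields array now leads with a "Project" entry holding the project name
const slackPayload = getWebhookPayload("secrets.modified", {
  workspaceName: project.name, // fetched via projectDAL.findById(projectId) as shown above
  workspaceId: projectId,
  environment,
  secretPath,
  type: WebhookType.SLACK
});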

View File

@ -6,6 +6,7 @@ import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services
import { infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { BadRequestError } from "@app/lib/errors";
import { TProjectDALFactory } from "../project/project-dal";
import { TProjectEnvDALFactory } from "../project-env/project-env-dal";
import { TWebhookDALFactory } from "./webhook-dal";
import { decryptWebhookDetails, getWebhookPayload, triggerWebhookRequest } from "./webhook-fns";
@ -20,12 +21,18 @@ import {
type TWebhookServiceFactoryDep = {
webhookDAL: TWebhookDALFactory;
projectEnvDAL: TProjectEnvDALFactory;
projectDAL: Pick<TProjectDALFactory, "findById">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission">;
};
export type TWebhookServiceFactory = ReturnType<typeof webhookServiceFactory>;
export const webhookServiceFactory = ({ webhookDAL, projectEnvDAL, permissionService }: TWebhookServiceFactoryDep) => {
export const webhookServiceFactory = ({
webhookDAL,
projectEnvDAL,
permissionService,
projectDAL
}: TWebhookServiceFactoryDep) => {
const createWebhook = async ({
actor,
actorId,
@ -124,13 +131,21 @@ export const webhookServiceFactory = ({ webhookDAL, projectEnvDAL, permissionSer
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Webhooks);
const project = await projectDAL.findById(webhook.projectId);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Webhooks);
let webhookError: string | undefined;
try {
await triggerWebhookRequest(
webhook,
getWebhookPayload("test", webhook.projectId, webhook.environment.slug, webhook.secretPath, webhook.type)
getWebhookPayload("test", {
workspaceName: project.name,
workspaceId: webhook.projectId,
environment: webhook.environment.slug,
secretPath: webhook.secretPath,
type: webhook.type
})
);
} catch (err) {
webhookError = (err as Error).message;

View File

@ -19,7 +19,7 @@ require (
github.com/petar-dambovaliev/aho-corasick v0.0.0-20211021192214-5ab2d9280aa9
github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8
github.com/posthog/posthog-go v0.0.0-20221221115252-24dfed35d71a
github.com/rs/cors v1.9.0
github.com/rs/cors v1.11.0
github.com/rs/zerolog v1.26.1
github.com/spf13/cobra v1.6.1
github.com/spf13/viper v1.8.1
@ -52,7 +52,7 @@ require (
github.com/chzyer/readline v1.5.1 // indirect
github.com/danieljoos/wincred v1.2.0 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/dvsekhvalnov/jose2go v1.5.0 // indirect
github.com/dvsekhvalnov/jose2go v1.6.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/fsnotify/fsnotify v1.4.9 // indirect
github.com/go-logr/logr v1.4.1 // indirect

View File

@ -117,8 +117,8 @@ github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/denisbrodbeck/machineid v1.0.1 h1:geKr9qtkB876mXguW2X6TU4ZynleN6ezuMSRhl4D7AQ=
github.com/denisbrodbeck/machineid v1.0.1/go.mod h1:dJUwb7PTidGDeYyUBmXZ2GphQBbjJCrnectwCyxcUSI=
github.com/dvsekhvalnov/jose2go v1.5.0 h1:3j8ya4Z4kMCwT5nXIKFSV84YS+HdqSSO0VsTQxaLAeM=
github.com/dvsekhvalnov/jose2go v1.5.0/go.mod h1:QsHjhyTlD/lAVqn/NSbVZmSCGeDehTB/mPZadG+mhXU=
github.com/dvsekhvalnov/jose2go v1.6.0 h1:Y9gnSnP4qEI0+/uQkHvFXeD2PLPJeXEL+ySMEA2EjTY=
github.com/dvsekhvalnov/jose2go v1.6.0/go.mod h1:QsHjhyTlD/lAVqn/NSbVZmSCGeDehTB/mPZadG+mhXU=
github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98=
@ -356,8 +356,8 @@ github.com/rivo/uniseg v0.2.0 h1:S1pD9weZBuJdFmowNwbpi7BJ8TNftyUImj/0WQi72jY=
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rogpeppe/fastuuid v1.2.0/go.mod h1:jVj6XXZzXRy/MSR5jhDC/2q6DgLz+nrA6LYCDYWNEvQ=
github.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rs/cors v1.9.0 h1:l9HGsTsHJcvW14Nk7J9KFz8bzeAWXn3CG6bgt7LsrAE=
github.com/rs/cors v1.9.0/go.mod h1:XyqrcTp5zjWr1wsJ8PIRZssZ8b/WMcMf71DJnit4EMU=
github.com/rs/cors v1.11.0 h1:0B9GE/r9Bc2UxRMMtymBkHTenPkHDv0CW4Y98GBY+po=
github.com/rs/cors v1.11.0/go.mod h1:XyqrcTp5zjWr1wsJ8PIRZssZ8b/WMcMf71DJnit4EMU=
github.com/rs/xid v1.3.0/go.mod h1:trrq9SKmegXys3aeAKXMUTdJsYXVwGY3RLcfgqegfbg=
github.com/rs/zerolog v1.26.1 h1:/ihwxqH+4z8UxyI70wM1z9yCvkWcfz/a3mj48k/Zngc=
github.com/rs/zerolog v1.26.1/go.mod h1:/wSSJWX7lVrsOwlbyTRSOJvqRlc+WjWlfes+CiJ+tmc=

View File

@ -0,0 +1,50 @@
---
title: "Hiring"
sidebarTitle: "Hiring"
description: "The guide to hiring at Infisical."
---
Infisical is actively growing, and we are hiring for many positions at any given time. This page describes the details of our hiring process.
## Strategy
Infisical's recruitment strategy relies on 100% inbound interest by default. Many of our team members have previously used Infisical or contributed to our [open source project](https://github.com/Infisical/infisical). This allows us to hire the best candidates, who are also the most interested in working at Infisical.
## Geography
Infisical is a remote-first company with team members across the globe. That said, there are legal and accounting constraints that we need to abide by. As a result, we are currently only able to hire from the following countries:
- Australia
- Austria
- Belgium
- Brazil
- Canada
- Chile
- Costa Rica
- Denmark
- Finland
- France
- Germany
- India
- Ireland
- Israel
- Italy
- Japan
- Kenya
- Latvia
- Luxembourg
- Mexico
- Netherlands
- New Zealand
- Philippines
- Poland
- Portugal
- Singapore
- South Africa
- South Korea
- Spain
- Sweden
- Switzerland
- UAE
- United Kingdom
- United States

View File

@ -58,7 +58,8 @@
"pages": [
"handbook/onboarding",
"handbook/spending-money",
"handbook/time-off"
"handbook/time-off",
"handbook/hiring"
]
}
],

View File

@ -0,0 +1,4 @@
---
title: "Attach"
openapi: "POST /api/v1/auth/aws-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Login"
openapi: "POST /api/v1/auth/aws-auth/login"
---

View File

@ -0,0 +1,4 @@
---
title: "Retrieve"
openapi: "GET /api/v1/auth/aws-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Revoke"
openapi: "DELETE /api/v1/auth/aws-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Update"
openapi: "PATCH /api/v1/auth/aws-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Attach"
openapi: "POST /api/v1/auth/azure-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Login"
openapi: "POST /api/v1/auth/azure-auth/login"
---

View File

@ -0,0 +1,4 @@
---
title: "Retrieve"
openapi: "GET /api/v1/auth/azure-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Revoke"
openapi: "DELETE /api/v1/auth/azure-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Update"
openapi: "PATCH /api/v1/auth/azure-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Get by ID"
openapi: "GET /api/v1/folders/{id}"
---
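This stub documents the folder-by-id endpoint added in this diff. A hedged sketch of calling it over HTTP, assuming a standard bearer token and using placeholder values throughout:

// sketch: fetch a single folder by its id (placeholder host, token, and folder id)
const res = await fetch("https://app.infisical.com/api/v1/folders/<folder-id>", {
  headers: { Authorization: "Bearer <access-token>" }
});
const body = await res.json(); // the exact response envelope is not shown in this diff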

View File

@ -0,0 +1,4 @@
---
title: "Attach"
openapi: "POST /api/v1/auth/gcp-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Login"
openapi: "POST /api/v1/auth/gcp-auth/login"
---

View File

@ -0,0 +1,4 @@
---
title: "Retrieve"
openapi: "GET /api/v1/auth/gcp-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Revoke"
openapi: "DELETE /api/v1/auth/gcp-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Update"
openapi: "PATCH /api/v1/auth/gcp-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Attach"
openapi: "POST /api/v1/auth/kubernetes-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Login"
openapi: "POST /api/v1/auth/kubernetes-auth/login"
---

View File

@ -0,0 +1,4 @@
---
title: "Retrieve"
openapi: "GET /api/v1/auth/kubernetes-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Revoke"
openapi: "DELETE /api/v1/auth/kubernetes-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Update"
openapi: "PATCH /api/v1/auth/kubernetes-auth/identities/{identityId}"
---

View File

@ -0,0 +1,4 @@
---
title: "Attach"
openapi: "POST /api/v1/auth/oidc-auth/identities/{identityId}"
---

Some files were not shown because too many files have changed in this diff.