Compare commits

..

104 Commits

Author SHA1 Message Date
carlosmonastyrski
9ea9f90928 PIT: add envID to rollback endpoint 2025-05-20 09:34:43 -03:00
carlosmonastyrski
6319f53802 PIT: UI views 2025-05-20 08:22:14 -03:00
carlosmonastyrski
8bfd3913da PIT: add backend logic for deep PIT and rollback 2025-05-14 10:26:41 -03:00
carlosmonastyrski
9e1d38a27b Add PIT rollback 2025-05-09 16:03:50 -03:00
carlosmonastyrski
78d5bc823d PIT: Add folder reconstruction functions 2025-05-09 09:20:17 -03:00
carlosmonastyrski
e8d424bbb0 PIT: Add initialization and checkpoint logic 2025-05-08 09:41:01 -03:00
carlosmonastyrski
e58dbe853e Minor improvements on commits code quality 2025-05-07 08:38:19 -03:00
carlosmonastyrski
f493a617b1 Add new commit logic on every folder/secret operation 2025-05-06 18:57:25 -03:00
carlosmonastyrski
32a3e1d200 commit 2025-05-06 08:11:50 -03:00
carlosmonastyrski
c6e56f0380 Stop removing secret/folder versions on projects with version >= 3 2025-05-05 16:43:58 -03:00
Daniel Hougaard
95b6676976 Merge pull request #3539 from Infisical/daniel/gateway-helm-docs
docs(gateway-helm): helm deployment
2025-05-05 17:45:36 +04:00
Maidul Islam
15c0834d56 Merge pull request #3530 from Infisical/email-revamp
improvemet(email-templates): migrate email templates to react email
2025-05-04 23:04:38 -04:00
Daniel Hougaard
edd415aed8 Update overview.mdx 2025-05-05 00:40:49 +04:00
Daniel Hougaard
c816cbc9a9 docs(gateway-helm): helm deployment 2025-05-05 00:09:59 +04:00
Daniel Hougaard
416811d594 Merge pull request #3524 from Infisical/daniel/gateway-helm
feat(helm): infisical helm
2025-05-04 23:52:19 +04:00
Maidul Islam
80a9d2bba9 Merge pull request #3538 from Infisical/doc/add-auto-deployment-ref-for-daemonsets-and-statefulsets
doc: added daemonset and statefulset auto-redeploy example
2025-05-04 14:16:41 -04:00
Sheen
f5e34ea59e doc: added daemonset and statefulset auto-redeploy example 2025-05-04 15:28:12 +00:00
Scott Wilson
bec3cec040 fix: correct secret-scanning link 2025-05-02 15:52:13 -07:00
Maidul Islam
d1122886fd Merge pull request #3532 from Infisical/add-missing-identity-specific-privilege-v2-docs-api
Add identity-specific-privilege v2 API to docs
2025-05-02 16:46:45 -04:00
BlackMagiq
3757f190f0 Merge pull request #3522 from Infisical/host-groups
Infisical SSH - Add Support for Host Groups
2025-05-02 13:46:02 -07:00
Maidul Islam
fec55bc9f8 fix greptile recs 2025-05-02 16:40:56 -04:00
Tuan Dang
a285a14fff Fix SshHostsTable component 2025-05-02 13:38:21 -07:00
Tuan Dang
9ec7d0d03e Update login mapping rendering on ssh hosts 2025-05-02 13:37:39 -07:00
Tuan Dang
d5246c2891 Update rendering on login mappings on hosts table 2025-05-02 13:30:48 -07:00
Daniel Hougaard
dcb7215b7d requested changes 2025-05-03 00:20:25 +04:00
x032205
c0f383ce1d Merge pull request #3536 from Infisical/vite-allowed-hosts
feat(vite.config): Allowed Hosts Defined Through Env Variable
2025-05-02 16:16:40 -04:00
Tuan Dang
0dcb223f80 Fix merge conflicts 2025-05-02 13:06:18 -07:00
Scott Wilson
f9f098af86 fix: try updating tsup.config to account for .tsx 2025-05-02 12:20:17 -07:00
Tuan Dang
6a5748150a Revise PR based on review 2025-05-02 12:16:51 -07:00
Scott Wilson
3ef053f255 fix: test adding explicity .tsx path 2025-05-02 12:13:23 -07:00
carlosmonastyrski
ed914d49ee Merge pull request #3531 from Infisical/feat/githubSsoDefaultOrganizationSetting
Add Github SSO users to default organization on signup
2025-05-02 15:59:33 -03:00
Scott Wilson
8f7a652741 fix: correct imports 2025-05-02 11:57:18 -07:00
x
e43f583eb6 feat(vite.config): Allowed Hosts Defined Through Env Variable 2025-05-02 14:45:44 -04:00
Scott Wilson
717c947e53 fix: try removing jsx usage 2025-05-02 11:42:20 -07:00
Scott Wilson
8ad334b3ab fix: try reverting ts jsx type 2025-05-02 11:34:18 -07:00
Scott Wilson
c7e707f20a improvement: address feedback 2025-05-02 11:08:41 -07:00
carlosmonastyrski
46755f724c Improve /complete-account/signup body schema 2025-05-02 13:06:45 -03:00
carlosmonastyrski
e12f4ad253 Add cloud check on github add user to default org 2025-05-02 12:58:36 -03:00
Daniel Hougaard
5dbded60f4 Delete Dockerfile.gateway 2025-05-02 16:38:31 +04:00
Daniel Hougaard
a80d5f10e5 fix(gateway-helm): requested changes 2025-05-02 16:38:02 +04:00
Sheen
0faa8f4bb0 Merge pull request #3533 from Infisical/doc/add-mention-of-pkce-and-eddsa-alg
doc: add mentions of PKCE and eddsa alg for oidc
2025-05-02 19:42:33 +08:00
carlosmonastyrski
365b4b975e Add minor improvements to Github SSO users added to default organization on signup 2025-05-02 08:22:47 -03:00
Sheen
fbf634f7da doc: add mentions of PKCE and eddsa alg for oidc 2025-05-02 07:57:37 +00:00
Maidul Islam
47bb3c10fa Add identity-specific-privilege v2 API to docs
Add identity-specific-privilege v2 API to docs
2025-05-02 00:32:17 -04:00
x032205
1f3e7da3b7 Merge pull request #3487 from Infisical/ENG-2633
feat(secret-sync): Hashicorp Vault App Connection & Secret Sync
2025-05-01 20:31:18 -04:00
x032205
81396f6b51 Small docs change 2025-05-01 20:23:29 -04:00
carlosmonastyrski
63279280fd Add Github SSO users to default organization on signup 2025-05-01 20:41:30 -03:00
Scott Wilson
66fbcc6806 improvemet(email-templates): migrate email templates to react email 2025-05-01 14:57:24 -07:00
Daniel Hougaard
f2d9593660 Merge pull request #3486 from Infisical/daniel/ms-teams-integration
feat(workflow-integrations): microsoft teams
2025-05-02 00:46:19 +04:00
Daniel Hougaard
219964a242 fix: query invalidation 2025-05-02 00:41:46 +04:00
Daniel Hougaard
240f558231 fix: added empty state 2025-05-01 23:49:18 +04:00
Daniel Hougaard
f3b3df1010 Update MicrosoftTeamsIntegrationForm.tsx 2025-05-01 20:43:23 +04:00
Daniel Hougaard
1fd6cd4787 Update MicrosoftTeamsIntegrationForm.tsx 2025-05-01 20:34:09 +04:00
Daniel Hougaard
a7d715ed08 Update MicrosoftTeamsIntegrationForm.tsx 2025-05-01 20:26:47 +04:00
x
a758503f40 new paths get created 2025-05-01 11:53:41 -04:00
Daniel Hougaard
550cb2b5ec smaller ui improvements 2025-05-01 19:50:50 +04:00
Daniel Hougaard
75cb259c51 add description tooltip 2025-05-01 19:15:29 +04:00
x
be2c5a9e57 merge conflicts 2025-05-01 10:48:33 -04:00
Daniel Hougaard
a077a9d6f2 Update OauthCallbackPage.tsx 2025-05-01 18:27:29 +04:00
Daniel Hougaard
835b2fba9c requested changes 2025-05-01 18:02:27 +04:00
Daniel Hougaard
82c7dad6c8 feat(helm): infisical helm 2025-05-01 06:45:40 +04:00
Tuan Dang
83df0850ce Fix frontend lint issues 2025-04-30 19:44:56 -07:00
Tuan Dang
ae43435509 Revise PR based on coderabbit, greptile review 2025-04-30 19:39:02 -07:00
Tuan Dang
7811178261 Fix merge conflicts 2025-04-30 18:32:56 -07:00
Tuan Dang
b21b0b340b Complete preliminary ssh host group feature 2025-04-30 18:14:31 -07:00
Daniel Hougaard
9e56790886 Update OauthCallbackPage.tsx 2025-05-01 04:13:46 +04:00
Daniel Hougaard
e08c5f265e fix: improve auth step to avoid takeovers 2025-05-01 04:08:58 +04:00
Tuan Dang
b06eeb0d40 Add add/remove/list hosts in ssh host groups functionality 2025-04-29 23:31:57 -07:00
x
5d366687a5 review fixes 2025-04-30 01:16:40 -04:00
x
4720914839 Merge branch 'main' into ENG-2633 2025-04-30 00:54:37 -04:00
Daniel Hougaard
aedc6e16ad Update .infisicalignore 2025-04-30 02:51:48 +04:00
Daniel Hougaard
1ec7c67212 Merge branch 'heads/main' into daniel/ms-teams-integration 2025-04-30 02:39:08 +04:00
Daniel Hougaard
ff0ff622a6 requested changes 2025-04-30 02:35:07 +04:00
Tuan Dang
a9a16c9bd1 Begin work on ssh host groups 2025-04-29 13:39:24 -07:00
Daniel Hougaard
929434d17f docs: improved ms teams workflow integration self-hosting docs 2025-04-30 00:15:58 +04:00
x
ee2e2246da solved merge conflicts 2025-04-28 18:51:20 -04:00
x
e30d400afa Support for namespaces (for HCP) 2025-04-28 18:34:33 -04:00
Daniel Hougaard
f35cd2d6a6 Update project-service.ts 2025-04-28 05:20:56 +04:00
Daniel Hougaard
b259428075 fix: secret scanning & route mismatch 2025-04-28 05:16:33 +04:00
Daniel Hougaard
f54a10f626 Merge branch 'heads/main' into daniel/ms-teams-integration 2025-04-28 05:08:04 +04:00
Daniel Hougaard
63a3ce2dba feat(workflow-integrations): ms-teams audit logs and pagination support 2025-04-28 04:58:25 +04:00
Daniel Hougaard
9aabc3ced7 better error logs 2025-04-28 04:07:32 +04:00
Daniel Hougaard
fe9ec6b030 docs(workflow-integrations): microsoft teams 2025-04-28 04:07:01 +04:00
Daniel Hougaard
bef55043f7 Update OrgWorkflowIntegrationTab.tsx 2025-04-26 10:16:30 +04:00
Daniel Hougaard
0323d152da feat(microsoft-teams): better authentication flow and doc references 2025-04-26 09:46:23 +04:00
x
b6566943c6 solve merge conflicts 2025-04-25 19:11:00 -04:00
Daniel Hougaard
8987938642 fix(microsoft-teams-integration): bug fixes 2025-04-25 11:03:31 +04:00
x
3f00359459 implemented blockLocalAndPrivateIpAddresses 2025-04-24 23:53:20 -04:00
x
a5b5b90ca1 nit: make docs a bit more future proof and descriptive 2025-04-24 23:48:09 -04:00
x
fd0a00023b nit: organize docs sidebar alphabetically 2025-04-24 23:40:47 -04:00
x
dd112b3850 review fixes 2025-04-24 22:39:20 -04:00
x
c01c58fdcb small nit for consistency 2025-04-24 22:10:21 -04:00
x
4bba207552 fix UpdateHCVaultConnectionSchema only supporting AccessToken 2025-04-24 22:09:06 -04:00
Daniel Hougaard
8563eb850b fix: ts errors 2025-04-25 05:40:28 +04:00
x
4225bf6e0e Merge branch 'main' into ENG-2633 2025-04-24 21:23:38 -04:00
x
fab385fdd9 feat(docs): Hashicorp Vault App Connection & Secret Sync Docs 2025-04-24 21:22:44 -04:00
Daniel Hougaard
a204629bef Merge branch 'heads/main' into daniel/ms-teams-integration 2025-04-25 05:22:35 +04:00
Daniel Hougaard
50679ba29d fix: requested changes 2025-04-25 05:22:17 +04:00
Daniel Hougaard
f5fa57d6c5 fix: further cleanup 2025-04-25 04:53:00 +04:00
Daniel Hougaard
6088ae09ab fix: cleanup 2025-04-25 04:40:46 +04:00
Daniel Hougaard
0de15bf70c fix: remove logs 2025-04-25 04:37:22 +04:00
Daniel Hougaard
b5d229a7c5 feat(native-integrations): microsoft teams 2025-04-25 04:35:40 +04:00
x
92084ccd47 feat(secrey-sync): Hashicorp Vault Secret Sync (and minor app connection fixes) 2025-04-24 18:54:05 -04:00
x
418ac20f91 feat(app-connections): Hashicorp Vault App Connection 2025-04-24 15:41:21 -04:00
453 changed files with 24796 additions and 2388 deletions

View File

@@ -0,0 +1,27 @@
name: Release Gateway Helm Chart
on:
  workflow_dispatch:
jobs:
  release-helm:
    name: Release Helm Chart
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Install Helm
        uses: azure/setup-helm@v3
        with:
          version: v3.10.0
      - name: Install python
        uses: actions/setup-python@v4
      - name: Install Cloudsmith CLI
        run: pip install --upgrade cloudsmith-cli
      - name: Build and push helm package to CloudSmith
        run: cd helm-charts && sh upload-gateway-cloudsmith.sh
        env:
          CLOUDSMITH_API_KEY: ${{ secrets.CLOUDSMITH_API_KEY }}

View File

@@ -24,3 +24,7 @@ frontend/src/hooks/api/secretRotationsV2/types/index.ts:generic-api-key:65
frontend/src/pages/secret-manager/SecretDashboardPage/components/SecretRotationListView/SecretRotationItem.tsx:generic-api-key:26
docs/documentation/platform/kms/overview.mdx:generic-api-key:281
docs/documentation/platform/kms/overview.mdx:generic-api-key:344
frontend/src/pages/secret-manager/OverviewPage/components/SecretOverviewTableRow/SecretOverviewTableRow.tsx:generic-api-key:85
docs/cli/commands/user.mdx:generic-api-key:51
frontend/src/pages/secret-manager/OverviewPage/components/SecretOverviewTableRow/SecretOverviewTableRow.tsx:generic-api-key:76
docs/integrations/app-connections/hashicorp-vault.mdx:generic-api-key:188

View File

@@ -69,6 +69,15 @@ module.exports = {
["^\\."]
]
}
],
"import/extensions": [
"error",
"ignorePackages",
{
"": "never", // this is required to get the .tsx to work...
ts: "never",
tsx: "never"
}
]
}
};

backend/package-lock.json (generated, 3168 changed lines)

File diff suppressed because it is too large

View File

@@ -72,7 +72,8 @@
"seed:new": "tsx ./scripts/create-seed-file.ts",
"seed": "knex --knexfile ./dist/db/knexfile.ts --client pg seed:run",
"seed-dev": "knex --knexfile ./src/db/knexfile.ts --client pg seed:run",
"db:reset": "npm run migration:rollback -- --all && npm run migration:latest"
"db:reset": "npm run migration:rollback -- --all && npm run migration:latest",
"email:dev": "email dev --dir src/services/smtp/emails"
},
"keywords": [],
"author": "",
@@ -96,6 +97,7 @@
"@types/picomatch": "^2.3.3",
"@types/pkcs11js": "^1.0.4",
"@types/prompt-sync": "^4.2.3",
"@types/react": "^19.1.2",
"@types/resolve": "^1.20.6",
"@types/safe-regex": "^1.1.6",
"@types/sjcl": "^1.0.34",
@@ -115,6 +117,7 @@
"nodemon": "^3.0.2",
"pino-pretty": "^10.2.3",
"prompt-sync": "^4.2.0",
"react-email": "4.0.7",
"rimraf": "^5.0.5",
"ts-node": "^10.9.2",
"tsc-alias": "^1.8.8",
@@ -164,6 +167,7 @@
"@opentelemetry/semantic-conventions": "^1.27.0",
"@peculiar/asn1-schema": "^2.3.8",
"@peculiar/x509": "^1.12.1",
"@react-email/components": "0.0.36",
"@serdnam/pino-cloudwatch-transport": "^1.0.4",
"@sindresorhus/slugify": "1.1.0",
"@slack/oauth": "^3.0.2",
@@ -175,6 +179,7 @@
"axios": "^1.6.7",
"axios-retry": "^4.0.0",
"bcrypt": "^5.1.1",
"botbuilder": "^4.23.2",
"bullmq": "^5.4.2",
"cassandra-driver": "^4.7.2",
"connect-redis": "^7.1.1",
@@ -222,6 +227,8 @@
"posthog-node": "^3.6.2",
"probot": "^13.3.8",
"re2": "^1.21.4",
"react": "19.1.0",
"react-dom": "19.1.0",
"safe-regex": "^2.1.1",
"scim-patch": "^0.8.3",
"scim2-parse-filter": "^0.2.10",

View File

@@ -41,6 +41,7 @@ import { TSecretSnapshotServiceFactory } from "@app/ee/services/secret-snapshot/
import { TSshCertificateAuthorityServiceFactory } from "@app/ee/services/ssh/ssh-certificate-authority-service";
import { TSshCertificateTemplateServiceFactory } from "@app/ee/services/ssh-certificate-template/ssh-certificate-template-service";
import { TSshHostServiceFactory } from "@app/ee/services/ssh-host/ssh-host-service";
import { TSshHostGroupServiceFactory } from "@app/ee/services/ssh-host-group/ssh-host-group-service";
import { TTrustedIpServiceFactory } from "@app/ee/services/trusted-ip/trusted-ip-service";
import { TAuthMode } from "@app/server/plugins/auth/inject-identity";
import { TApiKeyServiceFactory } from "@app/services/api-key/api-key-service";
@@ -56,6 +57,7 @@ import { TCertificateTemplateServiceFactory } from "@app/services/certificate-te
import { TCmekServiceFactory } from "@app/services/cmek/cmek-service";
import { TExternalGroupOrgRoleMappingServiceFactory } from "@app/services/external-group-org-role-mapping/external-group-org-role-mapping-service";
import { TExternalMigrationServiceFactory } from "@app/services/external-migration/external-migration-service";
import { TFolderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { TGroupProjectServiceFactory } from "@app/services/group-project/group-project-service";
import { THsmServiceFactory } from "@app/services/hsm/hsm-service";
import { TIdentityServiceFactory } from "@app/services/identity/identity-service";
@@ -71,6 +73,7 @@ import { TIdentityTokenAuthServiceFactory } from "@app/services/identity-token-a
import { TIdentityUaServiceFactory } from "@app/services/identity-ua/identity-ua-service";
import { TIntegrationServiceFactory } from "@app/services/integration/integration-service";
import { TIntegrationAuthServiceFactory } from "@app/services/integration-auth/integration-auth-service";
import { TMicrosoftTeamsServiceFactory } from "@app/services/microsoft-teams/microsoft-teams-service";
import { TOrgRoleServiceFactory } from "@app/services/org/org-role-service";
import { TOrgServiceFactory } from "@app/services/org/org-service";
import { TOrgAdminServiceFactory } from "@app/services/org-admin/org-admin-service";
@@ -213,6 +216,7 @@ declare module "fastify" {
sshCertificateAuthority: TSshCertificateAuthorityServiceFactory;
sshCertificateTemplate: TSshCertificateTemplateServiceFactory;
sshHost: TSshHostServiceFactory;
sshHostGroup: TSshHostGroupServiceFactory;
certificateAuthority: TCertificateAuthorityServiceFactory;
certificateAuthorityCrl: TCertificateAuthorityCrlServiceFactory;
certificateEst: TCertificateEstServiceFactory;
@@ -246,8 +250,10 @@ declare module "fastify" {
kmipOperation: TKmipOperationServiceFactory;
gateway: TGatewayServiceFactory;
secretRotationV2: TSecretRotationV2ServiceFactory;
microsoftTeams: TMicrosoftTeamsServiceFactory;
assumePrivileges: TAssumePrivilegeServiceFactory;
githubOrgSync: TGithubOrgSyncServiceFactory;
folderCommit: TFolderCommitServiceFactory;
};
// this is exclusive use for middlewares in which we need to inject data
// everywhere else access using service layer

View File

@@ -74,6 +74,24 @@ import {
TExternalKms,
TExternalKmsInsert,
TExternalKmsUpdate,
TFolderCheckpointResources,
TFolderCheckpointResourcesInsert,
TFolderCheckpointResourcesUpdate,
TFolderCheckpoints,
TFolderCheckpointsInsert,
TFolderCheckpointsUpdate,
TFolderCommitChanges,
TFolderCommitChangesInsert,
TFolderCommitChangesUpdate,
TFolderCommits,
TFolderCommitsInsert,
TFolderCommitsUpdate,
TFolderTreeCheckpointResources,
TFolderTreeCheckpointResourcesInsert,
TFolderTreeCheckpointResourcesUpdate,
TFolderTreeCheckpoints,
TFolderTreeCheckpointsInsert,
TFolderTreeCheckpointsUpdate,
TGateways,
TGatewaysInsert,
TGatewaysUpdate,
@@ -386,6 +404,12 @@ import {
TSshCertificateTemplates,
TSshCertificateTemplatesInsert,
TSshCertificateTemplatesUpdate,
TSshHostGroupMemberships,
TSshHostGroupMembershipsInsert,
TSshHostGroupMembershipsUpdate,
TSshHostGroups,
TSshHostGroupsInsert,
TSshHostGroupsUpdate,
TSshHostLoginUserMappings,
TSshHostLoginUserMappingsInsert,
TSshHostLoginUserMappingsUpdate,
@@ -426,6 +450,16 @@ import {
TWorkflowIntegrationsInsert,
TWorkflowIntegrationsUpdate
} from "@app/db/schemas";
import {
TMicrosoftTeamsIntegrations,
TMicrosoftTeamsIntegrationsInsert,
TMicrosoftTeamsIntegrationsUpdate
} from "@app/db/schemas/microsoft-teams-integrations";
import {
TProjectMicrosoftTeamsConfigs,
TProjectMicrosoftTeamsConfigsInsert,
TProjectMicrosoftTeamsConfigsUpdate
} from "@app/db/schemas/project-microsoft-teams-configs";
import {
TSecretReminderRecipients,
TSecretReminderRecipientsInsert,
@@ -445,6 +479,16 @@ declare module "knex/types/tables" {
interface Tables {
[TableName.Users]: KnexOriginal.CompositeTableType<TUsers, TUsersInsert, TUsersUpdate>;
[TableName.Groups]: KnexOriginal.CompositeTableType<TGroups, TGroupsInsert, TGroupsUpdate>;
[TableName.SshHostGroup]: KnexOriginal.CompositeTableType<
TSshHostGroups,
TSshHostGroupsInsert,
TSshHostGroupsUpdate
>;
[TableName.SshHostGroupMembership]: KnexOriginal.CompositeTableType<
TSshHostGroupMemberships,
TSshHostGroupMembershipsInsert,
TSshHostGroupMembershipsUpdate
>;
[TableName.SshHost]: KnexOriginal.CompositeTableType<TSshHosts, TSshHostsInsert, TSshHostsUpdate>;
[TableName.SshCertificateAuthority]: KnexOriginal.CompositeTableType<
TSshCertificateAuthorities,
@@ -1002,6 +1046,16 @@ declare module "knex/types/tables" {
TSecretRotationV2SecretMappingsInsert,
TSecretRotationV2SecretMappingsUpdate
>;
[TableName.MicrosoftTeamsIntegrations]: KnexOriginal.CompositeTableType<
TMicrosoftTeamsIntegrations,
TMicrosoftTeamsIntegrationsInsert,
TMicrosoftTeamsIntegrationsUpdate
>;
[TableName.ProjectMicrosoftTeamsConfigs]: KnexOriginal.CompositeTableType<
TProjectMicrosoftTeamsConfigs,
TProjectMicrosoftTeamsConfigsInsert,
TProjectMicrosoftTeamsConfigsUpdate
>;
[TableName.SecretReminderRecipients]: KnexOriginal.CompositeTableType<
TSecretReminderRecipients,
TSecretReminderRecipientsInsert,
@@ -1012,5 +1066,35 @@ declare module "knex/types/tables" {
TGithubOrgSyncConfigsInsert,
TGithubOrgSyncConfigsUpdate
>;
[TableName.FolderCommit]: KnexOriginal.CompositeTableType<
TFolderCommits,
TFolderCommitsInsert,
TFolderCommitsUpdate
>;
[TableName.FolderCommitChanges]: KnexOriginal.CompositeTableType<
TFolderCommitChanges,
TFolderCommitChangesInsert,
TFolderCommitChangesUpdate
>;
[TableName.FolderCheckpoint]: KnexOriginal.CompositeTableType<
TFolderCheckpoints,
TFolderCheckpointsInsert,
TFolderCheckpointsUpdate
>;
[TableName.FolderCheckpointResources]: KnexOriginal.CompositeTableType<
TFolderCheckpointResources,
TFolderCheckpointResourcesInsert,
TFolderCheckpointResourcesUpdate
>;
[TableName.FolderTreeCheckpoint]: KnexOriginal.CompositeTableType<
TFolderTreeCheckpoints,
TFolderTreeCheckpointsInsert,
TFolderTreeCheckpointsUpdate
>;
[TableName.FolderTreeCheckpointResources]: KnexOriginal.CompositeTableType<
TFolderTreeCheckpointResources,
TFolderTreeCheckpointResourcesInsert,
TFolderTreeCheckpointResourcesUpdate
>;
}
}

View File

@@ -0,0 +1,130 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
const superAdminHasEncryptedMicrosoftTeamsClientIdColumn = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedMicrosoftTeamsAppId"
);
const superAdminHasEncryptedMicrosoftTeamsClientSecret = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedMicrosoftTeamsClientSecret"
);
const superAdminHasEncryptedMicrosoftTeamsBotId = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedMicrosoftTeamsBotId"
);
if (
!superAdminHasEncryptedMicrosoftTeamsClientIdColumn ||
!superAdminHasEncryptedMicrosoftTeamsClientSecret ||
!superAdminHasEncryptedMicrosoftTeamsBotId
) {
await knex.schema.alterTable(TableName.SuperAdmin, (table) => {
if (!superAdminHasEncryptedMicrosoftTeamsClientIdColumn) {
table.binary("encryptedMicrosoftTeamsAppId").nullable();
}
if (!superAdminHasEncryptedMicrosoftTeamsClientSecret) {
table.binary("encryptedMicrosoftTeamsClientSecret").nullable();
}
if (!superAdminHasEncryptedMicrosoftTeamsBotId) {
table.binary("encryptedMicrosoftTeamsBotId").nullable();
}
});
}
if (!(await knex.schema.hasColumn(TableName.WorkflowIntegrations, "status"))) {
await knex.schema.alterTable(TableName.WorkflowIntegrations, (table) => {
table.enu("status", ["pending", "installed", "failed"]).notNullable().defaultTo("installed"); // defaults to installed so we can have backwards compatibility with existing workflow integrations
});
}
if (!(await knex.schema.hasTable(TableName.MicrosoftTeamsIntegrations))) {
await knex.schema.createTable(TableName.MicrosoftTeamsIntegrations, (table) => {
table.uuid("id", { primaryKey: true }).notNullable();
table.foreign("id").references("id").inTable(TableName.WorkflowIntegrations).onDelete("CASCADE"); // the ID itself is the workflow integration ID
table.string("internalTeamsAppId").nullable();
table.string("tenantId").notNullable();
table.binary("encryptedAccessToken").nullable();
table.binary("encryptedBotAccessToken").nullable();
table.timestamp("accessTokenExpiresAt").nullable();
table.timestamp("botAccessTokenExpiresAt").nullable();
table.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.MicrosoftTeamsIntegrations);
}
if (!(await knex.schema.hasTable(TableName.ProjectMicrosoftTeamsConfigs))) {
await knex.schema.createTable(TableName.ProjectMicrosoftTeamsConfigs, (tb) => {
tb.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
tb.string("projectId").notNullable().unique();
tb.foreign("projectId").references("id").inTable(TableName.Project).onDelete("CASCADE");
tb.uuid("microsoftTeamsIntegrationId").notNullable();
tb.foreign("microsoftTeamsIntegrationId")
.references("id")
.inTable(TableName.MicrosoftTeamsIntegrations)
.onDelete("CASCADE");
tb.boolean("isAccessRequestNotificationEnabled").notNullable().defaultTo(false);
tb.boolean("isSecretRequestNotificationEnabled").notNullable().defaultTo(false);
tb.jsonb("accessRequestChannels").notNullable(); // {teamId: string, channelIds: string[]}
tb.jsonb("secretRequestChannels").notNullable(); // {teamId: string, channelIds: string[]}
tb.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.ProjectMicrosoftTeamsConfigs);
}
}
export async function down(knex: Knex): Promise<void> {
const hasEncryptedMicrosoftTeamsClientIdColumn = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedMicrosoftTeamsAppId"
);
const hasEncryptedMicrosoftTeamsClientSecret = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedMicrosoftTeamsClientSecret"
);
const hasEncryptedMicrosoftTeamsBotId = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedMicrosoftTeamsBotId"
);
if (
hasEncryptedMicrosoftTeamsClientIdColumn ||
hasEncryptedMicrosoftTeamsClientSecret ||
hasEncryptedMicrosoftTeamsBotId
) {
await knex.schema.alterTable(TableName.SuperAdmin, (table) => {
if (hasEncryptedMicrosoftTeamsClientIdColumn) {
table.dropColumn("encryptedMicrosoftTeamsAppId");
}
if (hasEncryptedMicrosoftTeamsClientSecret) {
table.dropColumn("encryptedMicrosoftTeamsClientSecret");
}
if (hasEncryptedMicrosoftTeamsBotId) {
table.dropColumn("encryptedMicrosoftTeamsBotId");
}
});
}
if (await knex.schema.hasColumn(TableName.WorkflowIntegrations, "status")) {
await knex.schema.alterTable(TableName.WorkflowIntegrations, (table) => {
table.dropColumn("status");
});
}
if (await knex.schema.hasTable(TableName.ProjectMicrosoftTeamsConfigs)) {
await knex.schema.dropTableIfExists(TableName.ProjectMicrosoftTeamsConfigs);
await dropOnUpdateTrigger(knex, TableName.ProjectMicrosoftTeamsConfigs);
}
if (await knex.schema.hasTable(TableName.MicrosoftTeamsIntegrations)) {
await knex.schema.dropTableIfExists(TableName.MicrosoftTeamsIntegrations);
await dropOnUpdateTrigger(knex, TableName.MicrosoftTeamsIntegrations);
}
}
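
For reference, the accessRequestChannels and secretRequestChannels jsonb columns above are annotated inline as {teamId: string, channelIds: string[]}. A minimal TypeScript sketch of that assumed shape, for illustration only (the type name and example values are made up, not taken from this changeset):

// Hypothetical type mirroring the inline comment in the migration above.
type MicrosoftTeamsChannelSelection = {
  teamId: string;
  channelIds: string[];
};

// Example value of the assumed shape; the identifiers are placeholders.
const exampleAccessRequestChannels: MicrosoftTeamsChannelSelection = {
  teamId: "19:team-id-placeholder",
  channelIds: ["19:channel-a", "19:channel-b"]
};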

View File

@@ -0,0 +1,55 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.SshHostGroup))) {
await knex.schema.createTable(TableName.SshHostGroup, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.timestamps(true, true, true);
t.string("projectId").notNullable();
t.foreign("projectId").references("id").inTable(TableName.Project).onDelete("CASCADE");
t.string("name").notNullable();
t.unique(["projectId", "name"]);
});
await createOnUpdateTrigger(knex, TableName.SshHostGroup);
}
if (!(await knex.schema.hasTable(TableName.SshHostGroupMembership))) {
await knex.schema.createTable(TableName.SshHostGroupMembership, (t) => {
t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
t.timestamps(true, true, true);
t.uuid("sshHostGroupId").notNullable();
t.foreign("sshHostGroupId").references("id").inTable(TableName.SshHostGroup).onDelete("CASCADE");
t.uuid("sshHostId").notNullable();
t.foreign("sshHostId").references("id").inTable(TableName.SshHost).onDelete("CASCADE");
t.unique(["sshHostGroupId", "sshHostId"]);
});
await createOnUpdateTrigger(knex, TableName.SshHostGroupMembership);
}
const hasGroupColumn = await knex.schema.hasColumn(TableName.SshHostLoginUser, "sshHostGroupId");
if (!hasGroupColumn) {
await knex.schema.alterTable(TableName.SshHostLoginUser, (t) => {
t.uuid("sshHostGroupId").nullable();
t.foreign("sshHostGroupId").references("id").inTable(TableName.SshHostGroup).onDelete("CASCADE");
t.uuid("sshHostId").nullable().alter();
});
}
}
export async function down(knex: Knex): Promise<void> {
const hasGroupColumn = await knex.schema.hasColumn(TableName.SshHostLoginUser, "sshHostGroupId");
if (hasGroupColumn) {
await knex.schema.alterTable(TableName.SshHostLoginUser, (t) => {
t.dropColumn("sshHostGroupId");
});
}
await knex.schema.dropTableIfExists(TableName.SshHostGroupMembership);
await dropOnUpdateTrigger(knex, TableName.SshHostGroupMembership);
await knex.schema.dropTableIfExists(TableName.SshHostGroup);
await dropOnUpdateTrigger(knex, TableName.SshHostGroup);
}

View File

@@ -0,0 +1,149 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
const hasFolderCommitTable = await knex.schema.hasTable(TableName.FolderCommit);
if (!hasFolderCommitTable) {
await knex.schema.createTable(TableName.FolderCommit, (t) => {
t.uuid("id").primary().defaultTo(knex.fn.uuid());
t.bigIncrements("commitId");
t.jsonb("actorMetadata").notNullable();
t.string("actorType").notNullable();
t.string("message");
t.uuid("folderId").notNullable();
t.uuid("envId").notNullable();
t.foreign("envId").references("id").inTable(TableName.Environment).onDelete("CASCADE");
t.timestamps(true, true, true);
t.index("folderId");
t.index("envId");
});
}
const hasFolderCommitChangesTable = await knex.schema.hasTable(TableName.FolderCommitChanges);
if (!hasFolderCommitChangesTable) {
await knex.schema.createTable(TableName.FolderCommitChanges, (t) => {
t.uuid("id").primary().defaultTo(knex.fn.uuid());
t.uuid("folderCommitId").notNullable();
t.foreign("folderCommitId").references("id").inTable(TableName.FolderCommit).onDelete("CASCADE");
t.string("changeType").notNullable();
t.boolean("isUpdate").notNullable().defaultTo(false);
t.uuid("secretVersionId");
t.foreign("secretVersionId").references("id").inTable(TableName.SecretVersionV2).onDelete("CASCADE");
t.uuid("folderVersionId");
t.foreign("folderVersionId").references("id").inTable(TableName.SecretFolderVersion).onDelete("CASCADE");
t.timestamps(true, true, true);
t.index("folderCommitId");
t.index("secretVersionId");
t.index("folderVersionId");
});
}
const hasFolderCheckpointTable = await knex.schema.hasTable(TableName.FolderCheckpoint);
if (!hasFolderCheckpointTable) {
await knex.schema.createTable(TableName.FolderCheckpoint, (t) => {
t.uuid("id").primary().defaultTo(knex.fn.uuid());
t.uuid("folderCommitId").notNullable();
t.foreign("folderCommitId").references("id").inTable(TableName.FolderCommit).onDelete("CASCADE");
t.timestamps(true, true, true);
t.index("folderCommitId");
});
}
const hasFolderCheckpointResourcesTable = await knex.schema.hasTable(TableName.FolderCheckpointResources);
if (!hasFolderCheckpointResourcesTable) {
await knex.schema.createTable(TableName.FolderCheckpointResources, (t) => {
t.uuid("id").primary().defaultTo(knex.fn.uuid());
t.uuid("folderCheckpointId").notNullable();
t.foreign("folderCheckpointId").references("id").inTable(TableName.FolderCheckpoint).onDelete("CASCADE");
t.uuid("secretVersionId");
t.foreign("secretVersionId").references("id").inTable(TableName.SecretVersionV2).onDelete("CASCADE");
t.uuid("folderVersionId");
t.foreign("folderVersionId").references("id").inTable(TableName.SecretFolderVersion).onDelete("CASCADE");
t.timestamps(true, true, true);
t.index("folderCheckpointId");
t.index("secretVersionId");
t.index("folderVersionId");
});
}
const hasFolderTreeCheckpointTable = await knex.schema.hasTable(TableName.FolderTreeCheckpoint);
if (!hasFolderTreeCheckpointTable) {
await knex.schema.createTable(TableName.FolderTreeCheckpoint, (t) => {
t.uuid("id").primary().defaultTo(knex.fn.uuid());
t.uuid("folderCommitId").notNullable();
t.foreign("folderCommitId").references("id").inTable(TableName.FolderCommit).onDelete("CASCADE");
t.timestamps(true, true, true);
t.index("folderCommitId");
});
}
const hasFolderTreeCheckpointResourcesTable = await knex.schema.hasTable(TableName.FolderTreeCheckpointResources);
if (!hasFolderTreeCheckpointResourcesTable) {
await knex.schema.createTable(TableName.FolderTreeCheckpointResources, (t) => {
t.uuid("id").primary().defaultTo(knex.fn.uuid());
t.uuid("folderTreeCheckpointId").notNullable();
t.foreign("folderTreeCheckpointId").references("id").inTable(TableName.FolderTreeCheckpoint).onDelete("CASCADE");
t.uuid("folderId").notNullable();
t.uuid("folderCommitId").notNullable();
t.foreign("folderCommitId").references("id").inTable(TableName.FolderCommit).onDelete("CASCADE");
t.timestamps(true, true, true);
t.index("folderTreeCheckpointId");
t.index("folderId");
t.index("folderCommitId");
});
}
await createOnUpdateTrigger(knex, TableName.FolderCommit);
await createOnUpdateTrigger(knex, TableName.FolderCommitChanges);
await createOnUpdateTrigger(knex, TableName.FolderCheckpoint);
await createOnUpdateTrigger(knex, TableName.FolderCheckpointResources);
await createOnUpdateTrigger(knex, TableName.FolderTreeCheckpoint);
await createOnUpdateTrigger(knex, TableName.FolderTreeCheckpointResources);
}
export async function down(knex: Knex): Promise<void> {
const hasFolderCheckpointResourcesTable = await knex.schema.hasTable(TableName.FolderCheckpointResources);
const hasFolderTreeCheckpointResourcesTable = await knex.schema.hasTable(TableName.FolderTreeCheckpointResources);
const hasFolderCommitTable = await knex.schema.hasTable(TableName.FolderCommit);
const hasFolderCommitChangesTable = await knex.schema.hasTable(TableName.FolderCommitChanges);
const hasFolderTreeCheckpointTable = await knex.schema.hasTable(TableName.FolderTreeCheckpoint);
const hasFolderCheckpointTable = await knex.schema.hasTable(TableName.FolderCheckpoint);
if (hasFolderTreeCheckpointResourcesTable) {
await dropOnUpdateTrigger(knex, TableName.FolderTreeCheckpointResources);
await knex.schema.dropTableIfExists(TableName.FolderTreeCheckpointResources);
}
if (hasFolderCheckpointResourcesTable) {
await dropOnUpdateTrigger(knex, TableName.FolderCheckpointResources);
await knex.schema.dropTableIfExists(TableName.FolderCheckpointResources);
}
if (hasFolderTreeCheckpointTable) {
await dropOnUpdateTrigger(knex, TableName.FolderTreeCheckpoint);
await knex.schema.dropTableIfExists(TableName.FolderTreeCheckpoint);
}
if (hasFolderCheckpointTable) {
await dropOnUpdateTrigger(knex, TableName.FolderCheckpoint);
await knex.schema.dropTableIfExists(TableName.FolderCheckpoint);
}
if (hasFolderCommitChangesTable) {
await dropOnUpdateTrigger(knex, TableName.FolderCommitChanges);
await knex.schema.dropTableIfExists(TableName.FolderCommitChanges);
}
if (hasFolderCommitTable) {
await dropOnUpdateTrigger(knex, TableName.FolderCommit);
await knex.schema.dropTableIfExists(TableName.FolderCommit);
}
}

View File

@@ -0,0 +1,29 @@
import { Knex } from "knex";
import { inMemoryKeyStore } from "@app/keystore/memory";
import { ProjectType, TableName } from "../schemas";
import { getMigrationPITServices } from "./utils/services";
export async function up(knex: Knex): Promise<void> {
const hasFolderCommitTable = await knex.schema.hasTable(TableName.FolderCommit);
if (hasFolderCommitTable) {
const keyStore = inMemoryKeyStore();
const { folderCommitService } = await getMigrationPITServices({ db: knex, keyStore });
const projects = await knex(TableName.Project).where({ version: 3, type: ProjectType.SecretManager }).select("id");
await knex.transaction(async (tx) => {
for (const project of projects) {
// eslint-disable-next-line no-await-in-loop
await folderCommitService.initializeProject(project.id, tx);
}
});
}
}
export async function down(knex: Knex): Promise<void> {
const hasFolderCommitTable = await knex.schema.hasTable(TableName.FolderCommit);
if (hasFolderCommitTable) {
// delete all existing entries
await knex(TableName.FolderCommit).del();
}
}

View File

@@ -3,12 +3,25 @@ import { Knex } from "knex";
import { initializeHsmModule } from "@app/ee/services/hsm/hsm-fns";
import { hsmServiceFactory } from "@app/ee/services/hsm/hsm-service";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { folderCheckpointDALFactory } from "@app/services/folder-checkpoint/folder-checkpoint-dal";
import { folderCheckpointResourcesDALFactory } from "@app/services/folder-checkpoint-resources/folder-checkpoint-resources-dal";
import { folderCommitDALFactory } from "@app/services/folder-commit/folder-commit-dal";
import { folderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { folderCommitChangesDALFactory } from "@app/services/folder-commit-changes/folder-commit-changes-dal";
import { folderTreeCheckpointDALFactory } from "@app/services/folder-tree-checkpoint/folder-tree-checkpoint-dal";
import { folderTreeCheckpointResourcesDALFactory } from "@app/services/folder-tree-checkpoint-resources/folder-tree-checkpoint-resources-dal";
import { identityDALFactory } from "@app/services/identity/identity-dal";
import { internalKmsDALFactory } from "@app/services/kms/internal-kms-dal";
import { kmskeyDALFactory } from "@app/services/kms/kms-key-dal";
import { kmsRootConfigDALFactory } from "@app/services/kms/kms-root-config-dal";
import { kmsServiceFactory } from "@app/services/kms/kms-service";
import { orgDALFactory } from "@app/services/org/org-dal";
import { projectDALFactory } from "@app/services/project/project-dal";
import { secretFolderDALFactory } from "@app/services/secret-folder/secret-folder-dal";
import { secretFolderVersionDALFactory } from "@app/services/secret-folder/secret-folder-version-dal";
import { secretV2BridgeDALFactory } from "@app/services/secret-v2-bridge/secret-v2-bridge-dal";
import { secretVersionV2BridgeDALFactory } from "@app/services/secret-v2-bridge/secret-version-dal";
import { userDALFactory } from "@app/services/user/user-dal";
import { TMigrationEnvConfig } from "./env-config";
@@ -50,3 +63,37 @@ export const getMigrationEncryptionServices = async ({ envConfig, db, keyStore }
return { kmsService };
};
export const getMigrationPITServices = async ({ db, keyStore }: { db: Knex; keyStore: TKeyStoreFactory }) => {
const projectDAL = projectDALFactory(db);
const folderCommitDAL = folderCommitDALFactory(db);
const folderCommitChangesDAL = folderCommitChangesDALFactory(db);
const folderCheckpointDAL = folderCheckpointDALFactory(db);
const folderTreeCheckpointDAL = folderTreeCheckpointDALFactory(db);
const userDAL = userDALFactory(db);
const identityDAL = identityDALFactory(db);
const folderDAL = secretFolderDALFactory(db);
const folderVersionDAL = secretFolderVersionDALFactory(db);
const secretVersionV2BridgeDAL = secretVersionV2BridgeDALFactory(db);
const folderCheckpointResourcesDAL = folderCheckpointResourcesDALFactory(db);
const secretV2BridgeDAL = secretV2BridgeDALFactory({ db, keyStore });
const folderTreeCheckpointResourcesDAL = folderTreeCheckpointResourcesDALFactory(db);
const folderCommitService = folderCommitServiceFactory({
folderCommitDAL,
folderCommitChangesDAL,
folderCheckpointDAL,
folderTreeCheckpointDAL,
userDAL,
identityDAL,
folderDAL,
folderVersionDAL,
secretVersionV2BridgeDAL,
projectDAL,
folderCheckpointResourcesDAL,
secretV2BridgeDAL,
folderTreeCheckpointResourcesDAL
});
return { folderCommitService };
};

View File

@@ -0,0 +1,23 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const FolderCheckpointResourcesSchema = z.object({
id: z.string().uuid(),
folderCheckpointId: z.string().uuid(),
secretVersionId: z.string().uuid().nullable().optional(),
folderVersionId: z.string().uuid().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TFolderCheckpointResources = z.infer<typeof FolderCheckpointResourcesSchema>;
export type TFolderCheckpointResourcesInsert = Omit<z.input<typeof FolderCheckpointResourcesSchema>, TImmutableDBKeys>;
export type TFolderCheckpointResourcesUpdate = Partial<
Omit<z.input<typeof FolderCheckpointResourcesSchema>, TImmutableDBKeys>
>;

View File

@@ -0,0 +1,19 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const FolderCheckpointsSchema = z.object({
id: z.string().uuid(),
folderCommitId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TFolderCheckpoints = z.infer<typeof FolderCheckpointsSchema>;
export type TFolderCheckpointsInsert = Omit<z.input<typeof FolderCheckpointsSchema>, TImmutableDBKeys>;
export type TFolderCheckpointsUpdate = Partial<Omit<z.input<typeof FolderCheckpointsSchema>, TImmutableDBKeys>>;

View File

@@ -0,0 +1,23 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const FolderCommitChangesSchema = z.object({
id: z.string().uuid(),
folderCommitId: z.string().uuid(),
changeType: z.string(),
isUpdate: z.boolean().default(false),
secretVersionId: z.string().uuid().nullable().optional(),
folderVersionId: z.string().uuid().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TFolderCommitChanges = z.infer<typeof FolderCommitChangesSchema>;
export type TFolderCommitChangesInsert = Omit<z.input<typeof FolderCommitChangesSchema>, TImmutableDBKeys>;
export type TFolderCommitChangesUpdate = Partial<Omit<z.input<typeof FolderCommitChangesSchema>, TImmutableDBKeys>>;

View File

@@ -0,0 +1,24 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const FolderCommitsSchema = z.object({
id: z.string().uuid(),
commitId: z.coerce.number(),
actorMetadata: z.unknown(),
actorType: z.string(),
message: z.string().nullable().optional(),
folderId: z.string().uuid(),
envId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TFolderCommits = z.infer<typeof FolderCommitsSchema>;
export type TFolderCommitsInsert = Omit<z.input<typeof FolderCommitsSchema>, TImmutableDBKeys>;
export type TFolderCommitsUpdate = Partial<Omit<z.input<typeof FolderCommitsSchema>, TImmutableDBKeys>>;
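
As a small usage sketch (not part of the diff), a helper that consumes the generated TFolderCommits type; the function name is hypothetical:

import { TFolderCommits } from "@app/db/schemas";

// Illustrative helper only; per the schema above, commitId is a number
// (z.coerce.number()) and createdAt is a Date.
export const describeFolderCommit = (commit: TFolderCommits): string =>
  `${commit.actorType} commit ${commit.commitId} on folder ${commit.folderId} (${commit.createdAt.toISOString()})`;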

View File

@@ -0,0 +1,26 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const FolderTreeCheckpointResourcesSchema = z.object({
id: z.string().uuid(),
folderTreeCheckpointId: z.string().uuid(),
folderId: z.string().uuid(),
folderCommitId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TFolderTreeCheckpointResources = z.infer<typeof FolderTreeCheckpointResourcesSchema>;
export type TFolderTreeCheckpointResourcesInsert = Omit<
z.input<typeof FolderTreeCheckpointResourcesSchema>,
TImmutableDBKeys
>;
export type TFolderTreeCheckpointResourcesUpdate = Partial<
Omit<z.input<typeof FolderTreeCheckpointResourcesSchema>, TImmutableDBKeys>
>;

View File

@@ -0,0 +1,19 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const FolderTreeCheckpointsSchema = z.object({
id: z.string().uuid(),
folderCommitId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TFolderTreeCheckpoints = z.infer<typeof FolderTreeCheckpointsSchema>;
export type TFolderTreeCheckpointsInsert = Omit<z.input<typeof FolderTreeCheckpointsSchema>, TImmutableDBKeys>;
export type TFolderTreeCheckpointsUpdate = Partial<Omit<z.input<typeof FolderTreeCheckpointsSchema>, TImmutableDBKeys>>;

View File

@@ -22,6 +22,12 @@ export * from "./dynamic-secret-leases";
export * from "./dynamic-secrets";
export * from "./external-group-org-role-mappings";
export * from "./external-kms";
export * from "./folder-checkpoint-resources";
export * from "./folder-checkpoints";
export * from "./folder-commit-changes";
export * from "./folder-commits";
export * from "./folder-tree-checkpoint-resources";
export * from "./folder-tree-checkpoints";
export * from "./gateways";
export * from "./git-app-install-sessions";
export * from "./git-app-org";
@@ -58,6 +64,7 @@ export * from "./kms-keys";
export * from "./kms-root-config";
export * from "./ldap-configs";
export * from "./ldap-group-maps";
export * from "./microsoft-teams-integrations";
export * from "./models";
export * from "./oidc-configs";
export * from "./org-bots";
@@ -127,6 +134,8 @@ export * from "./ssh-certificate-authority-secrets";
export * from "./ssh-certificate-bodies";
export * from "./ssh-certificate-templates";
export * from "./ssh-certificates";
export * from "./ssh-host-group-memberships";
export * from "./ssh-host-groups";
export * from "./ssh-host-login-user-mappings";
export * from "./ssh-host-login-users";
export * from "./ssh-hosts";

View File

@@ -0,0 +1,31 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const MicrosoftTeamsIntegrationsSchema = z.object({
id: z.string().uuid(),
internalTeamsAppId: z.string().nullable().optional(),
tenantId: z.string(),
encryptedAccessToken: zodBuffer.nullable().optional(),
encryptedBotAccessToken: zodBuffer.nullable().optional(),
accessTokenExpiresAt: z.date().nullable().optional(),
botAccessTokenExpiresAt: z.date().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TMicrosoftTeamsIntegrations = z.infer<typeof MicrosoftTeamsIntegrationsSchema>;
export type TMicrosoftTeamsIntegrationsInsert = Omit<
z.input<typeof MicrosoftTeamsIntegrationsSchema>,
TImmutableDBKeys
>;
export type TMicrosoftTeamsIntegrationsUpdate = Partial<
Omit<z.input<typeof MicrosoftTeamsIntegrationsSchema>, TImmutableDBKeys>
>;

View File

@@ -2,6 +2,8 @@ import { z } from "zod";
export enum TableName {
Users = "users",
SshHostGroup = "ssh_host_groups",
SshHostGroupMembership = "ssh_host_group_memberships",
SshHost = "ssh_hosts",
SshHostLoginUser = "ssh_host_login_users",
SshHostLoginUserMapping = "ssh_host_login_user_mappings",
@@ -147,11 +149,19 @@ export enum TableName {
KmipClientCertificates = "kmip_client_certificates",
SecretRotationV2 = "secret_rotations_v2",
SecretRotationV2SecretMapping = "secret_rotation_v2_secret_mappings",
MicrosoftTeamsIntegrations = "microsoft_teams_integrations",
ProjectMicrosoftTeamsConfigs = "project_microsoft_teams_configs",
SecretReminderRecipients = "secret_reminder_recipients",
GithubOrgSyncConfig = "github_org_sync_configs"
GithubOrgSyncConfig = "github_org_sync_configs",
FolderCommit = "folder_commits",
FolderCommitChanges = "folder_commit_changes",
FolderCheckpoint = "folder_checkpoints",
FolderCheckpointResources = "folder_checkpoint_resources",
FolderTreeCheckpoint = "folder_tree_checkpoints",
FolderTreeCheckpointResources = "folder_tree_checkpoint_resources"
}
export type TImmutableDBKeys = "id" | "createdAt" | "updatedAt";
export type TImmutableDBKeys = "id" | "createdAt" | "updatedAt" | "commitId";
export const UserDeviceSchema = z
.object({

View File

@@ -23,7 +23,6 @@ export const OrganizationsSchema = z.object({
defaultMembershipRole: z.string().default("member"),
enforceMfa: z.boolean().default(false),
selectedMfaMethod: z.string().nullable().optional(),
secretShareSendToAnyone: z.boolean().default(true).nullable().optional(),
allowSecretSharingOutsideOrganization: z.boolean().default(true).nullable().optional(),
shouldUseNewPrivilegeSystem: z.boolean().default(true),
privilegeUpgradeInitiatedByUsername: z.string().nullable().optional(),

View File

@@ -0,0 +1,29 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const ProjectMicrosoftTeamsConfigsSchema = z.object({
id: z.string().uuid(),
projectId: z.string(),
microsoftTeamsIntegrationId: z.string().uuid(),
isAccessRequestNotificationEnabled: z.boolean().default(false),
isSecretRequestNotificationEnabled: z.boolean().default(false),
accessRequestChannels: z.unknown(),
secretRequestChannels: z.unknown(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TProjectMicrosoftTeamsConfigs = z.infer<typeof ProjectMicrosoftTeamsConfigsSchema>;
export type TProjectMicrosoftTeamsConfigsInsert = Omit<
z.input<typeof ProjectMicrosoftTeamsConfigsSchema>,
TImmutableDBKeys
>;
export type TProjectMicrosoftTeamsConfigsUpdate = Partial<
Omit<z.input<typeof ProjectMicrosoftTeamsConfigsSchema>, TImmutableDBKeys>
>;

View File

@@ -0,0 +1,22 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const SshHostGroupMembershipsSchema = z.object({
id: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date(),
sshHostGroupId: z.string().uuid(),
sshHostId: z.string().uuid()
});
export type TSshHostGroupMemberships = z.infer<typeof SshHostGroupMembershipsSchema>;
export type TSshHostGroupMembershipsInsert = Omit<z.input<typeof SshHostGroupMembershipsSchema>, TImmutableDBKeys>;
export type TSshHostGroupMembershipsUpdate = Partial<
Omit<z.input<typeof SshHostGroupMembershipsSchema>, TImmutableDBKeys>
>;

View File

@@ -0,0 +1,20 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const SshHostGroupsSchema = z.object({
id: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date(),
projectId: z.string(),
name: z.string()
});
export type TSshHostGroups = z.infer<typeof SshHostGroupsSchema>;
export type TSshHostGroupsInsert = Omit<z.input<typeof SshHostGroupsSchema>, TImmutableDBKeys>;
export type TSshHostGroupsUpdate = Partial<Omit<z.input<typeof SshHostGroupsSchema>, TImmutableDBKeys>>;
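
A brief validation sketch using the generated schema above; the values are placeholders invented for illustration and the snippet is not part of this diff:

import { SshHostGroupsSchema } from "@app/db/schemas";

// Parses a plain object against the generated schema; throws a ZodError on mismatch.
const hostGroup = SshHostGroupsSchema.parse({
  id: "4f9c2d1e-7b3a-4c5d-9e8f-1a2b3c4d5e6f", // placeholder UUID
  createdAt: new Date(),
  updatedAt: new Date(),
  projectId: "example-project-id",
  name: "example-host-group"
});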

View File

@@ -11,8 +11,9 @@ export const SshHostLoginUsersSchema = z.object({
id: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date(),
sshHostId: z.string().uuid(),
loginUser: z.string()
sshHostId: z.string().uuid().nullable().optional(),
loginUser: z.string(),
sshHostGroupId: z.string().uuid().nullable().optional()
});
export type TSshHostLoginUsers = z.infer<typeof SshHostLoginUsersSchema>;

View File

@@ -26,7 +26,10 @@ export const SuperAdminSchema = z.object({
encryptedSlackClientSecret: zodBuffer.nullable().optional(),
authConsentContent: z.string().nullable().optional(),
pageFrameContent: z.string().nullable().optional(),
adminIdentityIds: z.string().array().nullable().optional()
adminIdentityIds: z.string().array().nullable().optional(),
encryptedMicrosoftTeamsAppId: zodBuffer.nullable().optional(),
encryptedMicrosoftTeamsClientSecret: zodBuffer.nullable().optional(),
encryptedMicrosoftTeamsBotId: zodBuffer.nullable().optional()
});
export type TSuperAdmin = z.infer<typeof SuperAdminSchema>;

View File

@@ -14,7 +14,8 @@ export const WorkflowIntegrationsSchema = z.object({
orgId: z.string().uuid(),
description: z.string().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
updatedAt: z.date(),
status: z.string().default("installed")
});
export type TWorkflowIntegrations = z.infer<typeof WorkflowIntegrationsSchema>;
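
The generated schema above types status as a plain string, while the migration earlier in this diff constrains the column to "pending", "installed", or "failed". An illustrative narrower schema for callers that want the stricter typing (a sketch, not part of the changeset):

import { z } from "zod";

// Narrows the workflow integration status to the values allowed by the migration's enum column.
export const WorkflowIntegrationStatusSchema = z.enum(["pending", "installed", "failed"]);
export type TWorkflowIntegrationStatus = z.infer<typeof WorkflowIntegrationStatusSchema>;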

View File

@@ -18,6 +18,7 @@ import { registerLdapRouter } from "./ldap-router";
import { registerLicenseRouter } from "./license-router";
import { registerOidcRouter } from "./oidc-router";
import { registerOrgRoleRouter } from "./org-role-router";
import { registerPITRouter } from "./pit-router";
import { registerProjectRoleRouter } from "./project-role-router";
import { registerProjectRouter } from "./project-router";
import { registerRateLimitRouter } from "./rate-limit-router";
@@ -34,6 +35,7 @@ import { registerSnapshotRouter } from "./snapshot-router";
import { registerSshCaRouter } from "./ssh-certificate-authority-router";
import { registerSshCertRouter } from "./ssh-certificate-router";
import { registerSshCertificateTemplateRouter } from "./ssh-certificate-template-router";
import { registerSshHostGroupRouter } from "./ssh-host-group-router";
import { registerSshHostRouter } from "./ssh-host-router";
import { registerTrustedIpRouter } from "./trusted-ip-router";
import { registerUserAdditionalPrivilegeRouter } from "./user-additional-privilege-router";
@@ -52,6 +54,7 @@ export const registerV1EERoutes = async (server: FastifyZodProvider) => {
{ prefix: "/workspace" }
);
await server.register(registerSnapshotRouter, { prefix: "/secret-snapshot" });
await server.register(registerPITRouter, { prefix: "/pit" });
await server.register(registerSecretApprovalPolicyRouter, { prefix: "/secret-approvals" });
await server.register(registerSecretApprovalRequestRouter, {
prefix: "/secret-approval-requests"
@@ -88,6 +91,7 @@ export const registerV1EERoutes = async (server: FastifyZodProvider) => {
await sshRouter.register(registerSshCertRouter, { prefix: "/certificates" });
await sshRouter.register(registerSshCertificateTemplateRouter, { prefix: "/certificate-templates" });
await sshRouter.register(registerSshHostRouter, { prefix: "/hosts" });
await sshRouter.register(registerSshHostGroupRouter, { prefix: "/host-groups" });
},
{ prefix: "/ssh" }
);

View File

@@ -0,0 +1,613 @@
/* eslint-disable @typescript-eslint/no-base-to-string */
import { z } from "zod";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { NotFoundError } from "@app/lib/errors";
import { removeTrailingSlash } from "@app/lib/fn";
import { readLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { booleanSchema } from "@app/server/routes/sanitizedSchemas";
import { AuthMode } from "@app/services/auth/auth-type";
import { ChangeType } from "@app/services/folder-commit/folder-commit-service";
const commitHistoryItemSchema = z.object({
id: z.string(),
folderId: z.string(),
actorType: z.string(),
actorMetadata: z.unknown().optional(),
message: z.string().optional().nullable(),
commitId: z.string(),
createdAt: z.string().or(z.date()),
envId: z.string()
});
const versionSchema = z.object({
secretKey: z.string().optional(),
secretComment: z.string().optional().nullable(),
skipMultilineEncoding: z.boolean().optional().nullable(),
secretReminderRepeatDays: z.number().optional().nullable(),
secretReminderNote: z.string().optional().nullable(),
metadata: z.unknown().optional().nullable(),
tags: z.array(z.string()).optional().nullable(),
secretReminderRecipients: z.array(z.any()).optional().nullable(),
secretValue: z.string().optional().nullable(),
name: z.string().optional().nullable()
});
const folderStateSchema = z.array(
z.object({
type: z.string(),
id: z.string(),
versionId: z.string(),
secretKey: z.string().optional(),
secretVersion: z.number().optional(),
folderName: z.string().optional(),
folderVersion: z.number().optional()
})
);
export const registerPITRouter = async (server: FastifyZodProvider) => {
// Get commits count for a folder
server.route({
method: "GET",
url: "/commits/count/:workspaceId",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim()
}),
querystring: z.object({
environment: z.string().trim(),
path: z.string().trim().default("/").transform(removeTrailingSlash)
}),
response: {
200: z.object({
count: z.number(),
folderId: z.string()
})
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const res = await server.services.folderCommit.getCommitsCount({
actor: req.permission?.type,
actorId: req.permission?.id,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
projectId: req.params.workspaceId,
environment: req.query.environment,
path: req.query.path
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.GET_PROJECT_PIT_COMMIT_COUNT,
metadata: {
environment: req.query.environment,
path: req.query.path,
commitCount: res.count.toString()
}
}
});
return res;
}
});
// Get all commits for a folder
server.route({
method: "GET",
url: "/commits/:workspaceId",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim()
}),
querystring: z.object({
environment: z.string().trim(),
path: z.string().trim().default("/").transform(removeTrailingSlash)
}),
response: {
200: commitHistoryItemSchema.array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const commits = await server.services.folderCommit.getCommitsForFolder({
actor: req.permission?.type,
actorId: req.permission?.id,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
projectId: req.params.workspaceId,
environment: req.query.environment,
path: req.query.path
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.GET_PROJECT_PIT_COMMITS,
metadata: {
environment: req.query.environment,
path: req.query.path,
commitCount: commits.length.toString()
}
}
});
return commits.map((commit) => ({
...commit,
commitId: commit.commitId.toString()
}));
}
});
// Get commit changes for a specific commit
server.route({
method: "GET",
url: "/commits/:workspaceId/:commitId/changes",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim(),
commitId: z.string().trim()
}),
response: {
200: z.object({
changes: z.object({
id: z.string(),
commitId: z.string(),
actorMetadata: z
.union([
z.object({
id: z.string().optional(),
name: z.string().optional()
}),
z.unknown()
])
.optional(),
actorType: z.string(),
message: z.string().optional().nullable(),
folderId: z.string(),
envId: z.string(),
createdAt: z.string().or(z.date()),
updatedAt: z.string().or(z.date()),
changes: z.array(
z.object({
id: z.string(),
folderCommitId: z.string(),
changeType: z.string(),
isUpdate: z.boolean().optional(),
secretVersionId: z.string().optional().nullable(),
folderVersionId: z.string().optional().nullable(),
// These two timestamp fields accept either string or Date objects
createdAt: z.union([z.string(), z.date()]),
updatedAt: z.union([z.string(), z.date()]),
folderName: z.string().optional().nullable(),
folderChangeId: z.string().optional().nullable(),
folderVersion: z.union([z.string(), z.number()]).optional().nullable(),
secretKey: z.string().optional().nullable(),
secretVersion: z.union([z.string(), z.number()]).optional().nullable(),
secretId: z.string().optional().nullable(),
actorMetadata: z
.union([
z.object({
id: z.string().optional(),
name: z.string().optional()
}),
z.unknown()
])
.optional(),
actorType: z.string().optional(),
message: z.string().optional().nullable(),
folderId: z.string().optional().nullable(),
versions: z.array(versionSchema).optional()
})
)
})
})
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const changes = await server.services.folderCommit.getCommitChanges({
actor: req.permission?.type,
actorId: req.permission?.id,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
projectId: req.params.workspaceId,
commitId: req.params.commitId
});
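// Enrich each change with the concrete secret or folder versions it references so
// the response can render a before/after diff (the previous version is only
// fetched when the change is an update).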
for (const change of changes.changes) {
if (change.secretVersionId) {
const currentVersion = change.secretVersion || "1";
const previousVersion = (Number.parseInt(currentVersion, 10) - 1).toString();
if (change.secretId) {
// eslint-disable-next-line no-await-in-loop
const versions = await server.services.secret.getSecretVersionsV2ByIds({
actorId: req.permission?.id,
actor: req.permission?.type,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
secretId: change.secretId,
secretVersions: change.isUpdate ? [currentVersion, previousVersion] : [currentVersion],
folderId: change.folderId
});
change.versions = versions?.map((v) => ({
secretKey: v.secretKey,
secretComment: v.secretComment,
skipMultilineEncoding: v.skipMultilineEncoding,
secretReminderRepeatDays: v.secretReminderRepeatDays,
secretReminderNote: v.secretReminderNote,
metadata: v.secretMetadata,
tags: v.tags?.map((t) => t.name),
secretReminderRecipients: v.secretReminderRecipients?.map((r) => r.toString()),
secretValue: v.secretValue
}));
}
} else if (change.folderVersionId && change.folderChangeId) {
const currentVersion = change.folderVersion || "1";
const previousVersion = (Number.parseInt(currentVersion, 10) - 1).toString();
// eslint-disable-next-line no-await-in-loop
const versions = await server.services.folder.getFolderVersionsByIds({
folderId: change.folderChangeId,
folderVersions: change.isUpdate ? [currentVersion, previousVersion] : [currentVersion]
});
change.versions = versions.map((v) => ({
name: v.name
}));
}
}
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.GET_PROJECT_PIT_COMMIT_CHANGES,
metadata: {
commitId: req.params.commitId,
changesCount: (changes.changes?.length || 0).toString()
}
}
});
return {
changes: {
...changes,
commitId: changes.commitId.toString()
}
};
}
});
// Compare folder states against a target commit to preview rollback changes
server.route({
method: "GET",
url: "/commits/:workspaceId/:commitId/compare",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim(),
commitId: z.string().trim()
}),
querystring: z.object({
folderId: z.string().trim(),
envId: z.string().trim(),
deepRollback: booleanSchema.default(false),
secretPath: z.string().trim().default("/").transform(removeTrailingSlash)
}),
response: {
200: z.array(
z.object({
folderId: z.string(),
folderName: z.string(),
folderPath: z.string().optional(),
changes: z.any()
})
)
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const latestCommit = await server.services.folderCommit.getLatestCommit({
folderId: req.query.folderId,
actor: req.permission?.type,
actorId: req.permission?.id,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
projectId: req.params.workspaceId
});
if (!latestCommit) {
throw new NotFoundError({ message: "Latest commit not found" });
}
let diffs;
if (req.query.deepRollback) {
diffs = await server.services.folderCommit.deepCompareFolder({
targetCommitId: req.params.commitId,
envId: req.query.envId,
actorId: req.permission?.id,
actorType: req.permission?.type,
projectId: req.params.workspaceId
});
} else {
const folder = await server.services.folder.getFolderById({
actor: req.permission?.type,
actorId: req.permission?.id,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
id: req.query.folderId
});
diffs = [
{
folderId: folder.id,
folderName: folder.name,
folderPath: req.query.secretPath,
changes: await server.services.folderCommit.compareFolderStates({
targetCommitId: req.params.commitId,
currentCommitId: latestCommit.id
})
}
];
}
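// Resolve the actual secret and folder versions referenced by each diff entry so
// the comparison output carries full version payloads, not just version numbers.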
for (const diff of diffs) {
for (const change of diff.changes) {
if (change.secretKey) {
const currentVersion = change.secretVersion || "1";
const previousVersion = change.fromVersion || "1";
// eslint-disable-next-line no-await-in-loop
const versions = await server.services.secret.getSecretVersionsV2ByIds({
actorId: req.permission?.id,
actor: req.permission?.type,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
secretId: change.id,
// if it's an update, also fetch the previous secret version
secretVersions:
change.changeType === ChangeType.UPDATE ? [currentVersion, previousVersion] : [currentVersion],
folderId: req.query.folderId
});
change.versions = versions?.map((v) => ({
secretKey: v.secretKey,
secretComment: v.secretComment,
skipMultilineEncoding: v.skipMultilineEncoding,
secretReminderRepeatDays: v.secretReminderRepeatDays,
secretReminderNote: v.secretReminderNote,
metadata: v.metadata,
tags: v.tags?.map((t) => t.name),
secretReminderRecipients: v.secretReminderRecipients?.map((r) => r.toString()),
secretValue: v.secretValue
}));
}
if (change.folderVersion) {
const currentVersion = change.folderVersion || "1";
const previousVersion = change.fromVersion || "1";
// eslint-disable-next-line no-await-in-loop
const versions = await server.services.folder.getFolderVersionsByIds({
folderId: change.id,
folderVersions:
change.changeType === ChangeType.UPDATE ? [currentVersion, previousVersion] : [currentVersion]
});
change.versions = versions.map((v) => ({
name: v.name
}));
}
}
}
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.PIT_COMPARE_FOLDER_STATES,
metadata: {
targetCommitId: req.params.commitId,
folderId: req.query.folderId,
deepRollback: req.query.deepRollback,
diffsCount: diffs.length.toString()
}
}
});
return diffs;
}
});
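// Note: when deepRollback is set, deepRollbackFolder reconstructs the environment's
// folder tree at the target commit; otherwise only the requested folder is diffed
// against the latest commit and the differences are applied to it.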
// Rollback to a previous commit
server.route({
method: "POST",
url: "/commits/:workspaceId/:commitId/rollback",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim(),
commitId: z.string().trim()
}),
body: z.object({
folderId: z.string().trim(),
deepRollback: z.boolean().default(false),
message: z.string().trim().optional(),
envId: z.string().trim()
}),
response: {
200: z.object({
success: z.boolean(),
secretChangesCount: z.number().optional(),
folderChangesCount: z.number().optional(),
totalChanges: z.number().optional()
})
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const latestCommit = await server.services.folderCommit.getLatestCommit({
folderId: req.body.folderId,
actor: req.permission?.type,
actorId: req.permission?.id,
actorOrgId: req.permission?.orgId,
actorAuthMethod: req.permission?.authMethod,
projectId: req.params.workspaceId
});
if (!latestCommit) {
throw new NotFoundError({ message: "Latest commit not found" });
}
if (req.body.deepRollback) {
await server.services.folderCommit.deepRollbackFolder(
req.params.commitId,
req.body.envId,
req.permission.id,
req.permission.type,
req.params.workspaceId
);
return { success: true };
}
const diff = await server.services.folderCommit.compareFolderStates({
currentCommitId: latestCommit.id,
targetCommitId: req.params.commitId
});
const response = await server.services.folderCommit.applyFolderStateDifferences({
differences: diff,
actorInfo: {
actorType: req.permission.type,
actorId: req.permission.id,
message: req.body.message || "Rollback to previous commit"
},
folderId: req.body.folderId,
projectId: req.params.workspaceId,
reconstructNewFolders: req.body.deepRollback
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.PIT_ROLLBACK_COMMIT,
metadata: {
targetCommitId: req.params.commitId,
folderId: req.body.folderId,
deepRollback: req.body.deepRollback,
message: req.body.message || "Rollback to previous commit",
totalChanges: response.totalChanges.toString()
}
}
});
return {
success: true,
secretChangesCount: response.secretChangesCount,
folderChangesCount: response.folderChangesCount,
totalChanges: response.totalChanges
};
}
});
// Revert commit
server.route({
method: "POST",
url: "/commits/:workspaceId/:commitId/revert",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim(),
commitId: z.string().trim()
}),
response: {
200: z.object({
success: z.boolean(),
message: z.string(),
originalCommitId: z.string(),
revertCommitId: z.string().optional(),
changesReverted: z.number().optional()
})
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const response = await server.services.folderCommit.revertCommitChanges({
commitId: req.params.commitId,
actor: req.permission?.type,
actorId: req.permission?.id,
actorAuthMethod: req.permission?.authMethod,
actorOrgId: req.permission?.orgId,
projectId: req.params.workspaceId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.PIT_REVERT_COMMIT,
metadata: {
commitId: req.params.commitId,
revertCommitId: response.revertCommitId,
changesReverted: response.changesReverted?.toString()
}
}
});
return response;
}
});
// Get folder state at a specific commit
server.route({
method: "GET",
url: "/commits/:workspaceId/:commitId",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim(),
commitId: z.string().trim()
}),
querystring: z.object({
folderId: z.string().trim()
}),
response: {
200: folderStateSchema
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const response = await server.services.folderCommit.reconstructFolderState(req.params.commitId);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.PIT_GET_FOLDER_STATE,
metadata: {
commitId: req.params.commitId,
folderId: req.query.folderId,
resourceCount: response.length.toString()
}
}
});
return response.map((item) => ({
...item,
secretVersion: item.secretVersion ? Number(item.secretVersion) : undefined,
folderVersion: item.folderVersion ? Number(item.folderVersion) : undefined
}));
}
});
};
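For reviewers trying the new PIT endpoints manually, a minimal client-side sketch of calling the commit-count route registered above; the /api/v1/pit prefix and the token variable are assumptions for illustration and are not taken from this diff:
// Hypothetical helper exercising GET /commits/count/:workspaceId (mount prefix assumed).
export const fetchCommitCount = async (
  baseUrl: string,
  token: string,
  workspaceId: string,
  environment: string,
  path = "/"
): Promise<{ count: number; folderId: string }> => {
  const query = new URLSearchParams({ environment, path });
  const res = await fetch(`${baseUrl}/api/v1/pit/commits/count/${workspaceId}?${query.toString()}`, {
    headers: { Authorization: `Bearer ${token}` }
  });
  if (!res.ok) throw new Error(`Failed to fetch commit count: ${res.status}`);
  return (await res.json()) as { count: number; folderId: string };
};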

View File

@@ -0,0 +1,360 @@
import { z } from "zod";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { loginMappingSchema, sanitizedSshHost } from "@app/ee/services/ssh-host/ssh-host-schema";
import { sanitizedSshHostGroup } from "@app/ee/services/ssh-host-group/ssh-host-group-schema";
import { EHostGroupMembershipFilter } from "@app/ee/services/ssh-host-group/ssh-host-group-types";
import { ApiDocsTags, SSH_HOST_GROUPS } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { slugSchema } from "@app/server/lib/schemas";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
export const registerSshHostGroupRouter = async (server: FastifyZodProvider) => {
server.route({
method: "GET",
url: "/:sshHostGroupId",
config: {
rateLimit: readLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Get SSH Host Group",
params: z.object({
sshHostGroupId: z.string().describe(SSH_HOST_GROUPS.GET.sshHostGroupId)
}),
response: {
200: sanitizedSshHostGroup.extend({
loginMappings: z.array(loginMappingSchema)
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const sshHostGroup = await server.services.sshHostGroup.getSshHostGroup({
sshHostGroupId: req.params.sshHostGroupId,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHostGroup.projectId,
event: {
type: EventType.GET_SSH_HOST_GROUP,
metadata: {
sshHostGroupId: sshHostGroup.id,
name: sshHostGroup.name
}
}
});
return sshHostGroup;
}
});
server.route({
method: "POST",
url: "/",
config: {
rateLimit: writeLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Create SSH Host Group",
body: z.object({
projectId: z.string().describe(SSH_HOST_GROUPS.CREATE.projectId),
name: slugSchema({ min: 1, max: 64, field: "name" }).describe(SSH_HOST_GROUPS.CREATE.name),
loginMappings: z.array(loginMappingSchema).default([]).describe(SSH_HOST_GROUPS.CREATE.loginMappings)
}),
response: {
200: sanitizedSshHostGroup.extend({
loginMappings: z.array(loginMappingSchema)
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const sshHostGroup = await server.services.sshHostGroup.createSshHostGroup({
...req.body,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHostGroup.projectId,
event: {
type: EventType.CREATE_SSH_HOST_GROUP,
metadata: {
sshHostGroupId: sshHostGroup.id,
name: sshHostGroup.name,
loginMappings: sshHostGroup.loginMappings
}
}
});
return sshHostGroup;
}
});
server.route({
method: "PATCH",
url: "/:sshHostGroupId",
config: {
rateLimit: writeLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Update SSH Host Group",
params: z.object({
sshHostGroupId: z.string().trim().describe(SSH_HOST_GROUPS.UPDATE.sshHostGroupId)
}),
body: z.object({
name: slugSchema({ min: 1, max: 64, field: "name" }).describe(SSH_HOST_GROUPS.UPDATE.name).optional(),
loginMappings: z.array(loginMappingSchema).optional().describe(SSH_HOST_GROUPS.UPDATE.loginMappings)
}),
response: {
200: sanitizedSshHostGroup.extend({
loginMappings: z.array(loginMappingSchema)
})
}
},
handler: async (req) => {
const sshHostGroup = await server.services.sshHostGroup.updateSshHostGroup({
sshHostGroupId: req.params.sshHostGroupId,
...req.body,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHostGroup.projectId,
event: {
type: EventType.UPDATE_SSH_HOST_GROUP,
metadata: {
sshHostGroupId: sshHostGroup.id,
name: sshHostGroup.name,
loginMappings: sshHostGroup.loginMappings
}
}
});
return sshHostGroup;
}
});
server.route({
method: "DELETE",
url: "/:sshHostGroupId",
config: {
rateLimit: writeLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Delete SSH Host Group",
params: z.object({
sshHostGroupId: z.string().describe(SSH_HOST_GROUPS.DELETE.sshHostGroupId)
}),
response: {
200: sanitizedSshHostGroup.extend({
loginMappings: z.array(loginMappingSchema)
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const sshHostGroup = await server.services.sshHostGroup.deleteSshHostGroup({
sshHostGroupId: req.params.sshHostGroupId,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHostGroup.projectId,
event: {
type: EventType.DELETE_SSH_HOST_GROUP,
metadata: {
sshHostGroupId: sshHostGroup.id,
name: sshHostGroup.name
}
}
});
return sshHostGroup;
}
});
server.route({
method: "GET",
url: "/:sshHostGroupId/hosts",
config: {
rateLimit: readLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Get SSH Hosts in a Host Group",
params: z.object({
sshHostGroupId: z.string().describe(SSH_HOST_GROUPS.GET.sshHostGroupId)
}),
querystring: z.object({
filter: z.nativeEnum(EHostGroupMembershipFilter).optional().describe(SSH_HOST_GROUPS.GET.filter)
}),
response: {
200: z.object({
hosts: sanitizedSshHost
.pick({
id: true,
hostname: true,
alias: true
})
.merge(
z.object({
isPartOfGroup: z.boolean(),
joinedGroupAt: z.date().nullable()
})
)
.array(),
totalCount: z.number()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { sshHostGroup, hosts, totalCount } = await server.services.sshHostGroup.listSshHostGroupHosts({
sshHostGroupId: req.params.sshHostGroupId,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.query
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHostGroup.projectId,
event: {
type: EventType.GET_SSH_HOST_GROUP_HOSTS,
metadata: {
sshHostGroupId: req.params.sshHostGroupId,
name: sshHostGroup.name
}
}
});
return { hosts, totalCount };
}
});
server.route({
method: "POST",
url: "/:sshHostGroupId/hosts/:hostId",
config: {
rateLimit: writeLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Add an SSH Host to a Host Group",
params: z.object({
sshHostGroupId: z.string().describe(SSH_HOST_GROUPS.ADD_HOST.sshHostGroupId),
hostId: z.string().describe(SSH_HOST_GROUPS.ADD_HOST.hostId)
}),
response: {
200: sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { sshHostGroup, sshHost } = await server.services.sshHostGroup.addHostToSshHostGroup({
sshHostGroupId: req.params.sshHostGroupId,
hostId: req.params.hostId,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHost.projectId,
event: {
type: EventType.ADD_HOST_TO_SSH_HOST_GROUP,
metadata: {
sshHostGroupId: sshHostGroup.id,
sshHostId: sshHost.id,
hostname: sshHost.hostname
}
}
});
return sshHost;
}
});
server.route({
method: "DELETE",
url: "/:sshHostGroupId/hosts/:hostId",
config: {
rateLimit: writeLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHostGroups],
description: "Remove an SSH Host from a Host Group",
params: z.object({
sshHostGroupId: z.string().describe(SSH_HOST_GROUPS.DELETE_HOST.sshHostGroupId),
hostId: z.string().describe(SSH_HOST_GROUPS.DELETE_HOST.hostId)
}),
response: {
200: sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { sshHostGroup, sshHost } = await server.services.sshHostGroup.removeHostFromSshHostGroup({
sshHostGroupId: req.params.sshHostGroupId,
hostId: req.params.hostId,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: sshHost.projectId,
event: {
type: EventType.REMOVE_HOST_FROM_SSH_HOST_GROUP,
metadata: {
sshHostGroupId: sshHostGroup.id,
sshHostId: sshHost.id,
hostname: sshHost.hostname
}
}
});
return sshHost;
}
});
};
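A hedged sketch of how this plugin could be registered on the Fastify server; the prefix shown is an assumption for illustration and does not appear in this diff:
// Hypothetical registration of the SSH host group router (prefix assumed).
await server.register(registerSshHostGroupRouter, { prefix: "/api/v1/ssh/host-groups" });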

View File

@@ -3,8 +3,9 @@ import { z } from "zod";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { SshCertKeyAlgorithm } from "@app/ee/services/ssh-certificate/ssh-certificate-types";
import { loginMappingSchema, sanitizedSshHost } from "@app/ee/services/ssh-host/ssh-host-schema";
import { LoginMappingSource } from "@app/ee/services/ssh-host/ssh-host-types";
import { isValidHostname } from "@app/ee/services/ssh-host/ssh-host-validators";
import { SSH_HOSTS } from "@app/lib/api-docs";
import { ApiDocsTags, SSH_HOSTS } from "@app/lib/api-docs";
import { ms } from "@app/lib/ms";
import { publicSshCaLimit, readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { slugSchema } from "@app/server/lib/schemas";
@@ -21,10 +22,16 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
rateLimit: readLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
response: {
200: z.array(
sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
loginMappings: loginMappingSchema
.extend({
source: z.nativeEnum(LoginMappingSource)
})
.array()
})
)
}
@@ -49,12 +56,18 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
rateLimit: readLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
params: z.object({
sshHostId: z.string().describe(SSH_HOSTS.GET.sshHostId)
}),
response: {
200: sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
loginMappings: loginMappingSchema
.extend({
source: z.nativeEnum(LoginMappingSource)
})
.array()
})
}
},
@@ -91,7 +104,9 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
rateLimit: writeLimit
},
schema: {
description: "Add an SSH Host",
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Register SSH Host",
body: z.object({
projectId: z.string().describe(SSH_HOSTS.CREATE.projectId),
hostname: z
@@ -119,7 +134,11 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
}),
response: {
200: sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
loginMappings: loginMappingSchema
.extend({
source: z.nativeEnum(LoginMappingSource)
})
.array()
})
}
},
@@ -163,6 +182,8 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Update SSH Host",
params: z.object({
sshHostId: z.string().trim().describe(SSH_HOSTS.UPDATE.sshHostId)
@@ -192,7 +213,11 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
}),
response: {
200: sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
loginMappings: loginMappingSchema
.extend({
source: z.nativeEnum(LoginMappingSource)
})
.array()
})
}
},
@@ -235,12 +260,19 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
rateLimit: writeLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Delete SSH Host",
params: z.object({
sshHostId: z.string().describe(SSH_HOSTS.DELETE.sshHostId)
}),
response: {
200: sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
loginMappings: loginMappingSchema
.extend({
source: z.nativeEnum(LoginMappingSource)
})
.array()
})
}
},
@@ -278,6 +310,8 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Issue SSH certificate for user",
params: z.object({
sshHostId: z.string().describe(SSH_HOSTS.ISSUE_SSH_CREDENTIALS.sshHostId)
@@ -350,6 +384,8 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Issue SSH certificate for host",
params: z.object({
sshHostId: z.string().describe(SSH_HOSTS.ISSUE_HOST_CERT.sshHostId)
@@ -414,6 +450,8 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
rateLimit: publicSshCaLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Get public key of the user SSH CA linked to the host",
params: z.object({
sshHostId: z.string().trim().describe(SSH_HOSTS.GET_USER_CA_PUBLIC_KEY.sshHostId)
@@ -435,6 +473,8 @@ export const registerSshHostRouter = async (server: FastifyZodProvider) => {
rateLimit: publicSshCaLimit
},
schema: {
hide: false,
tags: [ApiDocsTags.SshHosts],
description: "Get public key of the host SSH CA linked to the host",
params: z.object({
sshHostId: z.string().trim().describe(SSH_HOSTS.GET_HOST_CA_PUBLIC_KEY.sshHostId)
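The hunks above repeatedly extend loginMappingSchema with a source field. A rough TypeScript view of the resulting response item, with field names inferred from the inline type replaced in the audit-log-types hunk later in this diff; the enum values are assumptions, not the real LoginMappingSource members:
// Approximate shape of each loginMappings item after this change (illustrative only).
type AssumedLoginMappingSource = "host" | "hostGroup"; // assumed values

type LoginMappingWithSource = {
  loginUser: string;
  allowedPrincipals: { usernames: string[] };
  source: AssumedLoginMappingSource;
};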

View File

@@ -6,13 +6,15 @@ import { getConfig } from "@app/lib/config/env";
import { BadRequestError, ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
import { ms } from "@app/lib/ms";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { triggerWorkflowIntegrationNotification } from "@app/lib/workflow-integrations/trigger-notification";
import { TriggerFeature } from "@app/lib/workflow-integrations/types";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TMicrosoftTeamsServiceFactory } from "@app/services/microsoft-teams/microsoft-teams-service";
import { TProjectMicrosoftTeamsConfigDALFactory } from "@app/services/microsoft-teams/project-microsoft-teams-config-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectEnvDALFactory } from "@app/services/project-env/project-env-dal";
import { TProjectMembershipDALFactory } from "@app/services/project-membership/project-membership-dal";
import { TProjectSlackConfigDALFactory } from "@app/services/slack/project-slack-config-dal";
import { triggerSlackNotification } from "@app/services/slack/slack-fns";
import { SlackTriggerFeature } from "@app/services/slack/slack-types";
import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TUserDALFactory } from "@app/services/user/user-dal";
@@ -67,6 +69,8 @@ type TSecretApprovalRequestServiceFactoryDep = {
>;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "getIntegrationDetailsByProject">;
microsoftTeamsService: Pick<TMicrosoftTeamsServiceFactory, "sendNotification">;
projectMicrosoftTeamsConfigDAL: Pick<TProjectMicrosoftTeamsConfigDALFactory, "getIntegrationDetailsByProject">;
};
export type TAccessApprovalRequestServiceFactory = ReturnType<typeof accessApprovalRequestServiceFactory>;
@@ -84,6 +88,8 @@ export const accessApprovalRequestServiceFactory = ({
smtpService,
userDAL,
kmsService,
microsoftTeamsService,
projectMicrosoftTeamsConfigDAL,
projectSlackConfigDAL
}: TSecretApprovalRequestServiceFactoryDep) => {
const createAccessApprovalRequest = async ({
@@ -219,24 +225,30 @@ export const accessApprovalRequestServiceFactory = ({
const requesterFullName = `${requestedByUser.firstName} ${requestedByUser.lastName}`;
const approvalUrl = `${cfg.SITE_URL}/secret-manager/${project.id}/approval`;
await triggerSlackNotification({
projectId: project.id,
projectSlackConfigDAL,
projectDAL,
kmsService,
notification: {
type: SlackTriggerFeature.ACCESS_REQUEST,
payload: {
projectName: project.name,
requesterFullName,
isTemporary,
requesterEmail: requestedByUser.email as string,
secretPath,
environment: envSlug,
permissions: accessTypes,
approvalUrl,
note
}
await triggerWorkflowIntegrationNotification({
input: {
notification: {
type: TriggerFeature.ACCESS_REQUEST,
payload: {
projectName: project.name,
requesterFullName,
isTemporary,
requesterEmail: requestedByUser.email as string,
secretPath,
environment: envSlug,
permissions: accessTypes,
approvalUrl,
note
}
},
projectId: project.id
},
dependencies: {
projectDAL,
projectSlackConfigDAL,
kmsService,
microsoftTeamsService,
projectMicrosoftTeamsConfigDAL
}
});
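For clarity, a rough sketch of the argument shape the new triggerWorkflowIntegrationNotification call takes, inferred purely from the call sites in this diff; the type name and the loose unknown dependency types are assumptions, not the helper's real exports:
// Shape inferred from the call sites in this diff only; not the actual exported types.
type AssumedWorkflowIntegrationNotificationArgs = {
  input: {
    projectId: string;
    notification: {
      type: string; // e.g. TriggerFeature.ACCESS_REQUEST or TriggerFeature.SECRET_APPROVAL
      payload: Record<string, unknown>;
    };
  };
  dependencies: {
    projectDAL: unknown;
    projectSlackConfigDAL: unknown;
    kmsService: unknown;
    microsoftTeamsService: unknown;
    projectMicrosoftTeamsConfigDAL: unknown;
  };
};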

View File

@@ -12,6 +12,7 @@ import {
import { SshCaStatus, SshCertType } from "@app/ee/services/ssh/ssh-certificate-authority-types";
import { SshCertKeyAlgorithm } from "@app/ee/services/ssh-certificate/ssh-certificate-types";
import { SshCertTemplateStatus } from "@app/ee/services/ssh-certificate-template/ssh-certificate-template-types";
import { TLoginMapping } from "@app/ee/services/ssh-host/ssh-host-types";
import { SymmetricKeyAlgorithm } from "@app/lib/crypto/cipher";
import { AsymmetricKeyAlgorithm, SigningAlgorithm } from "@app/lib/crypto/sign/types";
import { TProjectPermission } from "@app/lib/types";
@@ -29,6 +30,7 @@ import {
TSecretSyncRaw,
TUpdateSecretSyncDTO
} from "@app/services/secret-sync/secret-sync-types";
import { WorkflowIntegration } from "@app/services/workflow-integration/workflow-integration-types";
import { KmipPermission } from "../kmip/kmip-enum";
import { ApprovalStatus } from "../secret-approval-request/secret-approval-request-types";
@@ -191,12 +193,19 @@ export enum EventType {
UPDATE_SSH_CERTIFICATE_TEMPLATE = "update-ssh-certificate-template",
DELETE_SSH_CERTIFICATE_TEMPLATE = "delete-ssh-certificate-template",
GET_SSH_CERTIFICATE_TEMPLATE = "get-ssh-certificate-template",
GET_SSH_HOST = "get-ssh-host",
CREATE_SSH_HOST = "create-ssh-host",
UPDATE_SSH_HOST = "update-ssh-host",
DELETE_SSH_HOST = "delete-ssh-host",
GET_SSH_HOST = "get-ssh-host",
ISSUE_SSH_HOST_USER_CERT = "issue-ssh-host-user-cert",
ISSUE_SSH_HOST_HOST_CERT = "issue-ssh-host-host-cert",
GET_SSH_HOST_GROUP = "get-ssh-host-group",
CREATE_SSH_HOST_GROUP = "create-ssh-host-group",
UPDATE_SSH_HOST_GROUP = "update-ssh-host-group",
DELETE_SSH_HOST_GROUP = "delete-ssh-host-group",
GET_SSH_HOST_GROUP_HOSTS = "get-ssh-host-group-hosts",
ADD_HOST_TO_SSH_HOST_GROUP = "add-host-to-ssh-host-group",
REMOVE_HOST_FROM_SSH_HOST_GROUP = "remove-host-from-ssh-host-group",
CREATE_CA = "create-certificate-authority",
GET_CA = "get-certificate-authority",
UPDATE_CA = "update-certificate-authority",
@@ -244,11 +253,14 @@ export enum EventType {
GET_CERTIFICATE_TEMPLATE_EST_CONFIG = "get-certificate-template-est-config",
ATTEMPT_CREATE_SLACK_INTEGRATION = "attempt-create-slack-integration",
ATTEMPT_REINSTALL_SLACK_INTEGRATION = "attempt-reinstall-slack-integration",
GET_PROJECT_SLACK_CONFIG = "get-project-slack-config",
UPDATE_PROJECT_SLACK_CONFIG = "update-project-slack-config",
GET_SLACK_INTEGRATION = "get-slack-integration",
UPDATE_SLACK_INTEGRATION = "update-slack-integration",
DELETE_SLACK_INTEGRATION = "delete-slack-integration",
GET_PROJECT_SLACK_CONFIG = "get-project-slack-config",
UPDATE_PROJECT_SLACK_CONFIG = "update-project-slack-config",
GET_PROJECT_WORKFLOW_INTEGRATION_CONFIG = "get-project-workflow-integration-config",
UPDATE_PROJECT_WORKFLOW_INTEGRATION_CONFIG = "update-project-workflow-integration-config",
GET_PROJECT_SSH_CONFIG = "get-project-ssh-config",
UPDATE_PROJECT_SSH_CONFIG = "update-project-ssh-config",
INTEGRATION_SYNCED = "integration-synced",
@@ -321,8 +333,25 @@ export enum EventType {
SECRET_ROTATION_ROTATE_SECRETS = "secret-rotation-rotate-secrets",
PROJECT_ACCESS_REQUEST = "project-access-request",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_CREATE = "microsoft-teams-workflow-integration-create",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_DELETE = "microsoft-teams-workflow-integration-delete",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_UPDATE = "microsoft-teams-workflow-integration-update",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_CHECK_INSTALLATION_STATUS = "microsoft-teams-workflow-integration-check-installation-status",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_GET_TEAMS = "microsoft-teams-workflow-integration-get-teams",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_GET = "microsoft-teams-workflow-integration-get",
MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_LIST = "microsoft-teams-workflow-integration-list",
PROJECT_ASSUME_PRIVILEGE_SESSION_START = "project-assume-privileges-session-start",
PROJECT_ASSUME_PRIVILEGE_SESSION_END = "project-assume-privileges-session-end"
PROJECT_ASSUME_PRIVILEGE_SESSION_END = "project-assume-privileges-session-end",
GET_PROJECT_PIT_COMMITS = "get-project-pit-commits",
GET_PROJECT_PIT_COMMIT_CHANGES = "get-project-pit-commit-changes",
GET_PROJECT_PIT_COMMIT_COUNT = "get-project-pit-commit-count",
PIT_ROLLBACK_COMMIT = "pit-rollback-commit",
PIT_REVERT_COMMIT = "pit-revert-commit",
PIT_GET_FOLDER_STATE = "pit-get-folder-state",
PIT_COMPARE_FOLDER_STATES = "pit-compare-folder-states"
}
export const filterableSecretEvents: EventType[] = [
@@ -1499,12 +1528,7 @@ interface CreateSshHost {
alias: string | null;
userCertTtl: string;
hostCertTtl: string;
loginMappings: {
loginUser: string;
allowedPrincipals: {
usernames: string[];
};
}[];
loginMappings: TLoginMapping[];
userSshCaId: string;
hostSshCaId: string;
};
@@ -1518,12 +1542,7 @@ interface UpdateSshHost {
alias?: string | null;
userCertTtl?: string;
hostCertTtl?: string;
loginMappings?: {
loginUser: string;
allowedPrincipals: {
usernames: string[];
};
}[];
loginMappings?: TLoginMapping[];
userSshCaId?: string;
hostSshCaId?: string;
};
@@ -1567,6 +1586,66 @@ interface IssueSshHostHostCert {
};
}
interface GetSshHostGroupEvent {
type: EventType.GET_SSH_HOST_GROUP;
metadata: {
sshHostGroupId: string;
name: string;
};
}
interface CreateSshHostGroupEvent {
type: EventType.CREATE_SSH_HOST_GROUP;
metadata: {
sshHostGroupId: string;
name: string;
loginMappings: TLoginMapping[];
};
}
interface UpdateSshHostGroupEvent {
type: EventType.UPDATE_SSH_HOST_GROUP;
metadata: {
sshHostGroupId: string;
name?: string;
loginMappings?: TLoginMapping[];
};
}
interface DeleteSshHostGroupEvent {
type: EventType.DELETE_SSH_HOST_GROUP;
metadata: {
sshHostGroupId: string;
name: string;
};
}
interface GetSshHostGroupHostsEvent {
type: EventType.GET_SSH_HOST_GROUP_HOSTS;
metadata: {
sshHostGroupId: string;
name: string;
};
}
interface AddHostToSshHostGroupEvent {
type: EventType.ADD_HOST_TO_SSH_HOST_GROUP;
metadata: {
sshHostGroupId: string;
sshHostId: string;
hostname: string;
};
}
interface RemoveHostFromSshHostGroupEvent {
type: EventType.REMOVE_HOST_FROM_SSH_HOST_GROUP;
metadata: {
sshHostGroupId: string;
sshHostId: string;
hostname: string;
};
}
interface CreateCa {
type: EventType.CREATE_CA;
metadata: {
@@ -1980,22 +2059,24 @@ interface GetSlackIntegration {
};
}
interface UpdateProjectSlackConfig {
type: EventType.UPDATE_PROJECT_SLACK_CONFIG;
interface UpdateProjectWorkflowIntegrationConfig {
type: EventType.UPDATE_PROJECT_WORKFLOW_INTEGRATION_CONFIG;
metadata: {
id: string;
slackIntegrationId: string;
integrationId: string;
integration: WorkflowIntegration;
isAccessRequestNotificationEnabled: boolean;
accessRequestChannels: string;
accessRequestChannels?: string | { teamId: string; channelIds: string[] };
isSecretRequestNotificationEnabled: boolean;
secretRequestChannels: string;
secretRequestChannels?: string | { teamId: string; channelIds: string[] };
};
}
interface GetProjectSlackConfig {
type: EventType.GET_PROJECT_SLACK_CONFIG;
interface GetProjectWorkflowIntegrationConfig {
type: EventType.GET_PROJECT_WORKFLOW_INTEGRATION_CONFIG;
metadata: {
id: string;
integration: WorkflowIntegration;
};
}
@@ -2561,6 +2642,131 @@ interface RotateSecretRotationEvent {
};
}
interface MicrosoftTeamsWorkflowIntegrationCreateEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_CREATE;
metadata: {
tenantId: string;
slug: string;
description?: string;
};
}
interface MicrosoftTeamsWorkflowIntegrationDeleteEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_DELETE;
metadata: {
tenantId: string;
id: string;
slug: string;
};
}
interface MicrosoftTeamsWorkflowIntegrationCheckInstallationStatusEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_CHECK_INSTALLATION_STATUS;
metadata: {
tenantId: string;
slug: string;
};
}
interface MicrosoftTeamsWorkflowIntegrationGetTeamsEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_GET_TEAMS;
metadata: {
tenantId: string;
slug: string;
id: string;
};
}
interface MicrosoftTeamsWorkflowIntegrationGetEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_GET;
metadata: {
tenantId: string;
slug: string;
id: string;
};
}
interface MicrosoftTeamsWorkflowIntegrationListEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_LIST;
metadata: Record<string, string>;
}
interface MicrosoftTeamsWorkflowIntegrationUpdateEvent {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_UPDATE;
metadata: {
tenantId: string;
slug: string;
id: string;
newSlug?: string;
newDescription?: string;
};
}
interface GetProjectPitCommitsEvent {
type: EventType.GET_PROJECT_PIT_COMMITS;
metadata: {
commitCount: string;
environment: string;
path: string;
};
}
interface GetProjectPitCommitChangesEvent {
type: EventType.GET_PROJECT_PIT_COMMIT_CHANGES;
metadata: {
changesCount: string;
commitId: string;
};
}
interface GetProjectPitCommitCountEvent {
type: EventType.GET_PROJECT_PIT_COMMIT_COUNT;
metadata: {
environment: string;
path: string;
commitCount: string;
};
}
interface PitRollbackCommitEvent {
type: EventType.PIT_ROLLBACK_COMMIT;
metadata: {
targetCommitId: string;
folderId: string;
deepRollback: boolean;
message: string;
totalChanges: string;
};
}
interface PitRevertCommitEvent {
type: EventType.PIT_REVERT_COMMIT;
metadata: {
commitId: string;
revertCommitId?: string;
changesReverted?: string;
};
}
interface PitGetFolderStateEvent {
type: EventType.PIT_GET_FOLDER_STATE;
metadata: {
commitId: string;
folderId: string;
resourceCount: string;
};
}
interface PitCompareFolderStatesEvent {
type: EventType.PIT_COMPARE_FOLDER_STATES;
metadata: {
targetCommitId: string;
folderId: string;
deepRollback: boolean;
diffsCount: string;
};
}
export type Event =
| GetSecretsEvent
| GetSecretEvent
@@ -2723,8 +2929,8 @@ export type Event =
| UpdateSlackIntegration
| DeleteSlackIntegration
| GetSlackIntegration
| UpdateProjectSlackConfig
| GetProjectSlackConfig
| UpdateProjectWorkflowIntegrationConfig
| GetProjectWorkflowIntegrationConfig
| GetProjectSshConfig
| UpdateProjectSshConfig
| IntegrationSyncedEvent
@@ -2753,6 +2959,13 @@ export type Event =
| CreateAppConnectionEvent
| UpdateAppConnectionEvent
| DeleteAppConnectionEvent
| GetSshHostGroupEvent
| CreateSshHostGroupEvent
| UpdateSshHostGroupEvent
| DeleteSshHostGroupEvent
| GetSshHostGroupHostsEvent
| AddHostToSshHostGroupEvent
| RemoveHostFromSshHostGroupEvent
| CreateSharedSecretEvent
| DeleteSharedSecretEvent
| ReadSharedSecretEvent
@@ -2794,4 +3007,18 @@ export type Event =
| CreateSecretRotationEvent
| UpdateSecretRotationEvent
| DeleteSecretRotationEvent
| RotateSecretRotationEvent;
| RotateSecretRotationEvent
| MicrosoftTeamsWorkflowIntegrationCreateEvent
| MicrosoftTeamsWorkflowIntegrationDeleteEvent
| MicrosoftTeamsWorkflowIntegrationCheckInstallationStatusEvent
| MicrosoftTeamsWorkflowIntegrationGetTeamsEvent
| MicrosoftTeamsWorkflowIntegrationGetEvent
| MicrosoftTeamsWorkflowIntegrationListEvent
| MicrosoftTeamsWorkflowIntegrationUpdateEvent
| GetProjectPitCommitsEvent
| GetProjectPitCommitChangesEvent
| PitRollbackCommitEvent
| GetProjectPitCommitCountEvent
| PitRevertCommitEvent
| PitCompareFolderStatesEvent
| PitGetFolderStateEvent;
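As a concrete reference for the new PIT event types, an example payload mirroring the metadata the PIT router builds for a rollback earlier in this diff; the values are placeholders, not real identifiers:
import { EventType } from "@app/ee/services/audit-log/audit-log-types";

// Example PIT rollback audit event payload (placeholder values).
const exampleRollbackEvent = {
  type: EventType.PIT_ROLLBACK_COMMIT,
  metadata: {
    targetCommitId: "<commit-id>",
    folderId: "<folder-id>",
    deepRollback: false,
    message: "Rollback to previous commit",
    totalChanges: "3"
  }
} as const;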

View File

@@ -153,7 +153,7 @@ export const groupDALFactory = (db: TDbClient) => {
totalCount: Number(members?.[0]?.total_count ?? 0)
};
} catch (error) {
throw new DatabaseError({ error, name: "Find all org members" });
throw new DatabaseError({ error, name: "Find all user group members" });
}
};

View File

@@ -28,7 +28,8 @@ export const getDefaultOnPremFeatures = () => {
has_used_trial: true,
secretApproval: true,
secretRotation: true,
caCrl: false
caCrl: false,
sshHostGroups: false
};
};

View File

@@ -10,6 +10,7 @@ export const BillingPlanRows = {
CustomAlerts: { name: "Custom alerts", field: "customAlerts" },
AuditLogs: { name: "Audit logs", field: "auditLogs" },
SamlSSO: { name: "SAML SSO", field: "samlSSO" },
SshHostGroups: { name: "SSH Host Groups", field: "sshHostGroups" },
Hsm: { name: "Hardware Security Module (HSM)", field: "hsm" },
OidcSSO: { name: "OIDC SSO", field: "oidcSSO" },
SecretApproval: { name: "Secret approvals", field: "secretApproval" },

View File

@@ -53,7 +53,8 @@ export const getDefaultOnPremFeatures = (): TFeatureSet => ({
enforceMfa: false,
projectTemplates: false,
kmip: false,
gateway: false
gateway: false,
sshHostGroups: false
});
export const setupLicenseRequestWithStore = (baseURL: string, refreshUrl: string, licenseKey: string) => {

View File

@@ -71,6 +71,7 @@ export type TFeatureSet = {
projectTemplates: false;
kmip: false;
gateway: false;
sshHostGroups: false;
};
export type TOrgPlansTableDTO = {

View File

@@ -17,6 +17,11 @@ export enum ProjectPermissionActions {
Delete = "delete"
}
export enum ProjectPermissionCommitsActions {
Read = "read",
PerformRollback = "perform-rollback"
}
export enum ProjectPermissionSecretActions {
DescribeAndReadValue = "read",
DescribeSecret = "describeSecret",
@@ -126,6 +131,7 @@ export enum ProjectPermissionSub {
SecretRollback = "secret-rollback",
SecretApproval = "secret-approval",
SecretRotation = "secret-rotation",
Commits = "commits",
Identity = "identity",
CertificateAuthorities = "certificate-authorities",
Certificates = "certificates",
@@ -134,6 +140,7 @@ export enum ProjectPermissionSub {
SshCertificates = "ssh-certificates",
SshCertificateTemplates = "ssh-certificate-templates",
SshHosts = "ssh-hosts",
SshHostGroups = "ssh-host-groups",
PkiAlerts = "pki-alerts",
PkiCollections = "pki-collections",
Kms = "kms",
@@ -240,6 +247,7 @@ export type ProjectPermissionSet =
ProjectPermissionSshHostActions,
ProjectPermissionSub.SshHosts | (ForcedSubject<ProjectPermissionSub.SshHosts> & SshHostSubjectFields)
]
| [ProjectPermissionActions, ProjectPermissionSub.SshHostGroups]
| [ProjectPermissionActions, ProjectPermissionSub.PkiAlerts]
| [ProjectPermissionActions, ProjectPermissionSub.PkiCollections]
| [ProjectPermissionSecretSyncActions, ProjectPermissionSub.SecretSyncs]
@@ -249,7 +257,8 @@ export type ProjectPermissionSet =
| [ProjectPermissionActions.Edit, ProjectPermissionSub.Project]
| [ProjectPermissionActions.Read, ProjectPermissionSub.SecretRollback]
| [ProjectPermissionActions.Create, ProjectPermissionSub.SecretRollback]
| [ProjectPermissionActions.Edit, ProjectPermissionSub.Kms];
| [ProjectPermissionActions.Edit, ProjectPermissionSub.Kms]
| [ProjectPermissionCommitsActions, ProjectPermissionSub.Commits];
const SECRET_PATH_MISSING_SLASH_ERR_MSG = "Invalid Secret Path; it must start with a '/'";
const SECRET_PATH_PERMISSION_OPERATOR_SCHEMA = z.union([
@@ -508,6 +517,12 @@ const GeneralPermissionSchema = [
"Describe what action an entity can take."
)
}),
z.object({
subject: z.literal(ProjectPermissionSub.SshHostGroups).describe("The entity this permission pertains to."),
action: CASL_ACTION_SCHEMA_NATIVE_ENUM(ProjectPermissionActions).describe(
"Describe what action an entity can take."
)
}),
z.object({
subject: z.literal(ProjectPermissionSub.PkiAlerts).describe("The entity this permission pertains to."),
action: CASL_ACTION_SCHEMA_NATIVE_ENUM(ProjectPermissionActions).describe(
@@ -549,6 +564,12 @@ const GeneralPermissionSchema = [
action: CASL_ACTION_SCHEMA_NATIVE_ENUM(ProjectPermissionKmipActions).describe(
"Describe what action an entity can take."
)
}),
z.object({
subject: z.literal(ProjectPermissionSub.Commits).describe("The entity this permission pertains to."),
action: CASL_ACTION_SCHEMA_NATIVE_ENUM(ProjectPermissionCommitsActions).describe(
"Describe what action an entity can take."
)
})
];
@@ -686,7 +707,8 @@ const buildAdminPermissionRules = () => {
ProjectPermissionSub.PkiCollections,
ProjectPermissionSub.SshCertificateAuthorities,
ProjectPermissionSub.SshCertificates,
ProjectPermissionSub.SshCertificateTemplates
ProjectPermissionSub.SshCertificateTemplates,
ProjectPermissionSub.SshHostGroups
].forEach((el) => {
can(
[
@@ -820,6 +842,11 @@ const buildAdminPermissionRules = () => {
ProjectPermissionSub.SecretRotation
);
can(
[ProjectPermissionCommitsActions.Read, ProjectPermissionCommitsActions.PerformRollback],
ProjectPermissionSub.Commits
);
return rules;
};
@@ -1002,6 +1029,11 @@ const buildMemberPermissionRules = () => {
ProjectPermissionSub.SecretSyncs
);
can(
[ProjectPermissionCommitsActions.Read, ProjectPermissionCommitsActions.PerformRollback],
ProjectPermissionSub.Commits
);
return rules;
};
@@ -1037,6 +1069,7 @@ const buildViewerPermissionRules = () => {
can(ProjectPermissionActions.Read, ProjectPermissionSub.SshCertificates);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SshCertificateTemplates);
can(ProjectPermissionSecretSyncActions.Read, ProjectPermissionSub.SecretSyncs);
can(ProjectPermissionCommitsActions.Read, ProjectPermissionSub.Commits);
return rules;
};
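A short usage sketch of the new commit permissions; permission is assumed to be the CASL ability instance these rule builders produce, and imports are omitted since the enums are defined in the hunk above:
// Hedged sketch: gate commit-history reads and rollbacks with the new actions.
// ("permission" is an existing CASL ability built from the rules above.)
if (!permission.can(ProjectPermissionCommitsActions.Read, ProjectPermissionSub.Commits)) {
  throw new Error("Missing permission to read commit history");
}
if (!permission.can(ProjectPermissionCommitsActions.PerformRollback, ProjectPermissionSub.Commits)) {
  throw new Error("Missing permission to roll back to a previous commit");
}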

View File

@@ -17,9 +17,14 @@ import { groupBy, pick, unique } from "@app/lib/fn";
import { setKnexStringValue } from "@app/lib/knex";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { EnforcementLevel } from "@app/lib/types";
import { triggerWorkflowIntegrationNotification } from "@app/lib/workflow-integrations/trigger-notification";
import { TriggerFeature } from "@app/lib/workflow-integrations/types";
import { ActorType } from "@app/services/auth/auth-type";
import { TFolderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { KmsDataKey } from "@app/services/kms/kms-types";
import { TMicrosoftTeamsServiceFactory } from "@app/services/microsoft-teams/microsoft-teams-service";
import { TProjectMicrosoftTeamsConfigDALFactory } from "@app/services/microsoft-teams/project-microsoft-teams-config-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotServiceFactory } from "@app/services/project-bot/project-bot-service";
import { TProjectEnvDALFactory } from "@app/services/project-env/project-env-dal";
@@ -52,8 +57,6 @@ import {
import { TSecretVersionV2DALFactory } from "@app/services/secret-v2-bridge/secret-version-dal";
import { TSecretVersionV2TagDALFactory } from "@app/services/secret-v2-bridge/secret-version-tag-dal";
import { TProjectSlackConfigDALFactory } from "@app/services/slack/project-slack-config-dal";
import { triggerSlackNotification } from "@app/services/slack/slack-fns";
import { SlackTriggerFeature } from "@app/services/slack/slack-types";
import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TUserDALFactory } from "@app/services/user/user-dal";
@@ -126,6 +129,9 @@ type TSecretApprovalRequestServiceFactoryDep = {
secretApprovalPolicyDAL: Pick<TSecretApprovalPolicyDALFactory, "findById">;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "getIntegrationDetailsByProject">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
projectMicrosoftTeamsConfigDAL: Pick<TProjectMicrosoftTeamsConfigDALFactory, "getIntegrationDetailsByProject">;
microsoftTeamsService: Pick<TMicrosoftTeamsServiceFactory, "sendNotification">;
folderCommitService: Pick<TFolderCommitServiceFactory, "createCommit">;
};
export type TSecretApprovalRequestServiceFactory = ReturnType<typeof secretApprovalRequestServiceFactory>;
@@ -155,7 +161,10 @@ export const secretApprovalRequestServiceFactory = ({
secretVersionTagV2BridgeDAL,
licenseService,
projectSlackConfigDAL,
resourceMetadataDAL
resourceMetadataDAL,
projectMicrosoftTeamsConfigDAL,
microsoftTeamsService,
folderCommitService
}: TSecretApprovalRequestServiceFactoryDep) => {
const requestCount = async ({ projectId, actor, actorId, actorOrgId, actorAuthMethod }: TApprovalRequestCountDTO) => {
if (actor === ActorType.SERVICE) throw new BadRequestError({ message: "Cannot use service token" });
@@ -590,6 +599,10 @@ export const secretApprovalRequestServiceFactory = ({
? await fnSecretV2BridgeBulkInsert({
tx,
folderId,
actor: {
actorId,
type: actor
},
orgId: actorOrgId,
inputSecrets: secretCreationCommits.map((el) => ({
tagIds: el?.tags.map(({ id }) => id),
@@ -612,13 +625,18 @@ export const secretApprovalRequestServiceFactory = ({
secretDAL: secretV2BridgeDAL,
secretVersionDAL: secretVersionV2BridgeDAL,
secretTagDAL,
secretVersionTagDAL: secretVersionTagV2BridgeDAL
secretVersionTagDAL: secretVersionTagV2BridgeDAL,
folderCommitService
})
: [];
const updatedSecrets = secretUpdationCommits.length
? await fnSecretV2BridgeBulkUpdate({
folderId,
orgId: actorOrgId,
actor: {
actorId,
type: actor
},
tx,
inputSecrets: secretUpdationCommits.map((el) => {
const encryptedValue =
@@ -652,7 +670,8 @@ export const secretApprovalRequestServiceFactory = ({
secretVersionDAL: secretVersionV2BridgeDAL,
secretTagDAL,
secretVersionTagDAL: secretVersionTagV2BridgeDAL,
resourceMetadataDAL
resourceMetadataDAL,
folderCommitService
})
: [];
const deletedSecret = secretDeletionCommits.length
@@ -660,10 +679,13 @@ export const secretApprovalRequestServiceFactory = ({
projectId,
folderId,
tx,
actorId: "",
actorId,
actorType: actor,
secretDAL: secretV2BridgeDAL,
secretQueueService,
inputSecrets: secretDeletionCommits.map(({ key }) => ({ secretKey: key, type: SecretType.Shared }))
inputSecrets: secretDeletionCommits.map(({ key }) => ({ secretKey: key, type: SecretType.Shared })),
folderCommitService,
secretVersionDAL: secretVersionV2BridgeDAL
})
: [];
const updatedSecretApproval = await secretApprovalRequestDAL.updateById(
@@ -1171,21 +1193,28 @@ export const secretApprovalRequestServiceFactory = ({
const env = await projectEnvDAL.findOne({ id: policy.envId });
const user = await userDAL.findById(secretApprovalRequest.committerUserId);
await triggerSlackNotification({
projectId,
projectDAL,
kmsService,
projectSlackConfigDAL,
notification: {
type: SlackTriggerFeature.SECRET_APPROVAL,
payload: {
userEmail: user.email as string,
environment: env.name,
secretPath,
projectId,
requestId: secretApprovalRequest.id,
secretKeys: [...new Set(Object.values(data).flatMap((arr) => arr?.map((item) => item.secretName) ?? []))]
await triggerWorkflowIntegrationNotification({
input: {
projectId,
notification: {
type: TriggerFeature.SECRET_APPROVAL,
payload: {
userEmail: user.email as string,
environment: env.name,
secretPath,
projectId,
requestId: secretApprovalRequest.id,
secretKeys: [...new Set(Object.values(data).flatMap((arr) => arr?.map((item) => item.secretName) ?? []))]
}
}
},
dependencies: {
projectDAL,
projectSlackConfigDAL,
kmsService,
projectMicrosoftTeamsConfigDAL,
microsoftTeamsService
}
});
@@ -1503,21 +1532,28 @@ export const secretApprovalRequestServiceFactory = ({
const user = await userDAL.findById(secretApprovalRequest.committerUserId);
const env = await projectEnvDAL.findOne({ id: policy.envId });
await triggerSlackNotification({
projectId,
projectDAL,
kmsService,
projectSlackConfigDAL,
notification: {
type: SlackTriggerFeature.SECRET_APPROVAL,
payload: {
userEmail: user.email as string,
environment: env.name,
secretPath,
projectId,
requestId: secretApprovalRequest.id,
secretKeys: [...new Set(Object.values(data).flatMap((arr) => arr?.map((item) => item.secretKey) ?? []))]
await triggerWorkflowIntegrationNotification({
input: {
projectId,
notification: {
type: TriggerFeature.SECRET_APPROVAL,
payload: {
userEmail: user.email as string,
environment: env.name,
secretPath,
projectId,
requestId: secretApprovalRequest.id,
secretKeys: [...new Set(Object.values(data).flatMap((arr) => arr?.map((item) => item.secretKey) ?? []))]
}
}
},
dependencies: {
projectDAL,
kmsService,
projectSlackConfigDAL,
microsoftTeamsService,
projectMicrosoftTeamsConfigDAL
}
});

View File

@@ -10,6 +10,7 @@ import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { QueueName, TQueueServiceFactory } from "@app/queue";
import { ActorType } from "@app/services/auth/auth-type";
import { TFolderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { KmsDataKey } from "@app/services/kms/kms-types";
import { TProjectBotServiceFactory } from "@app/services/project-bot/project-bot-service";
@@ -87,6 +88,7 @@ type TSecretReplicationServiceFactoryDep = {
projectBotService: Pick<TProjectBotServiceFactory, "getBotKey">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
folderCommitService: Pick<TFolderCommitServiceFactory, "createCommit">;
};
export type TSecretReplicationServiceFactory = ReturnType<typeof secretReplicationServiceFactory>;
@@ -132,6 +134,7 @@ export const secretReplicationServiceFactory = ({
secretVersionV2BridgeDAL,
secretV2BridgeDAL,
kmsService,
folderCommitService,
resourceMetadataDAL
}: TSecretReplicationServiceFactoryDep) => {
const $getReplicatedSecrets = (
@@ -446,6 +449,7 @@ export const secretReplicationServiceFactory = ({
tx,
secretTagDAL,
resourceMetadataDAL,
folderCommitService,
secretVersionTagDAL: secretVersionV2TagBridgeDAL,
inputSecrets: locallyCreatedSecrets.map((doc) => {
return {
@@ -466,6 +470,7 @@ export const secretReplicationServiceFactory = ({
orgId,
folderId: destinationReplicationFolderId,
secretVersionDAL: secretVersionV2BridgeDAL,
folderCommitService,
secretDAL: secretV2BridgeDAL,
tx,
resourceMetadataDAL,

View File

@@ -219,7 +219,7 @@ export const parseRotationErrorMessage = (err: unknown): string => {
if (err instanceof AxiosError) {
errorMessage += err?.response?.data
? JSON.stringify(err?.response?.data)
: err?.message ?? "An unknown error occurred.";
: (err?.message ?? "An unknown error occurred.");
} else {
errorMessage += (err as Error)?.message || "An unknown error occurred.";
}

View File

@@ -20,7 +20,7 @@ export const BaseSecretRotationSchema = (type: SecretRotation) =>
// unique to provider
type: true,
parameters: true,
secretMappings: true
secretsMapping: true
}).extend({
connection: z.object({
app: z.literal(SECRET_ROTATION_CONNECTION_MAP[type]),

View File

@@ -61,6 +61,7 @@ import { TAppConnectionDALFactory } from "@app/services/app-connection/app-conne
import { decryptAppConnection } from "@app/services/app-connection/app-connection-fns";
import { TAppConnectionServiceFactory } from "@app/services/app-connection/app-connection-service";
import { ActorType } from "@app/services/auth/auth-type";
import { TFolderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { KmsDataKey } from "@app/services/kms/kms-types";
import { TProjectBotServiceFactory } from "@app/services/project-bot/project-bot-service";
@@ -96,7 +97,7 @@ export type TSecretRotationV2ServiceFactoryDep = {
TSecretV2BridgeDALFactory,
"bulkUpdate" | "insertMany" | "deleteMany" | "upsertSecretReferences" | "find" | "invalidateSecretCacheByProjectId"
>;
secretVersionV2BridgeDAL: Pick<TSecretVersionV2DALFactory, "insertMany">;
secretVersionV2BridgeDAL: Pick<TSecretVersionV2DALFactory, "insertMany" | "findLatestVersionMany">;
secretVersionTagV2BridgeDAL: Pick<TSecretVersionV2TagDALFactory, "insertMany">;
resourceMetadataDAL: Pick<TResourceMetadataDALFactory, "insertMany" | "delete">;
secretTagDAL: Pick<TSecretTagDALFactory, "saveTagsToSecretV2" | "deleteTagsToSecretV2" | "find">;
@@ -104,6 +105,7 @@ export type TSecretRotationV2ServiceFactoryDep = {
snapshotService: Pick<TSecretSnapshotServiceFactory, "performSnapshot">;
queueService: Pick<TQueueServiceFactory, "queuePg">;
appConnectionDAL: Pick<TAppConnectionDALFactory, "findById" | "update" | "updateById">;
folderCommitService: Pick<TFolderCommitServiceFactory, "createCommit">;
};
export type TSecretRotationV2ServiceFactory = ReturnType<typeof secretRotationV2ServiceFactory>;
@@ -141,6 +143,7 @@ export const secretRotationV2ServiceFactory = ({
snapshotService,
keyStore,
queueService,
folderCommitService,
appConnectionDAL
}: TSecretRotationV2ServiceFactoryDep) => {
const $queueSendSecretRotationStatusNotification = async (secretRotation: TSecretRotationV2Raw) => {
@@ -533,7 +536,12 @@ export const secretRotationV2ServiceFactory = ({
secretVersionDAL: secretVersionV2BridgeDAL,
secretVersionTagDAL: secretVersionTagV2BridgeDAL,
secretTagDAL,
resourceMetadataDAL
folderCommitService,
resourceMetadataDAL,
actor: {
type: actor.type,
actorId: actor.id
}
});
await secretRotationV2DAL.insertSecretMappings(
@@ -668,7 +676,12 @@ export const secretRotationV2ServiceFactory = ({
secretVersionDAL: secretVersionV2BridgeDAL,
secretVersionTagDAL: secretVersionTagV2BridgeDAL,
secretTagDAL,
resourceMetadataDAL
folderCommitService,
resourceMetadataDAL,
actor: {
type: actor.type,
actorId: actor.id
}
});
secretsMappingUpdated = true;
@@ -786,6 +799,9 @@ export const secretRotationV2ServiceFactory = ({
projectId,
folderId,
actorId: actor.id, // not actually used since rotated secrets are shared
actorType: actor.type,
folderCommitService,
secretVersionDAL: secretVersionV2BridgeDAL,
tx
});
}
@@ -926,6 +942,10 @@ export const secretRotationV2ServiceFactory = ({
secretDAL: secretV2BridgeDAL,
secretVersionDAL: secretVersionV2BridgeDAL,
secretVersionTagDAL: secretVersionTagV2BridgeDAL,
folderCommitService,
actor: {
type: ActorType.PLATFORM
},
secretTagDAL,
resourceMetadataDAL
});

View File

@@ -14,6 +14,7 @@ import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { ActorType } from "@app/services/auth/auth-type";
import { TFolderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { KmsDataKey } from "@app/services/kms/kms-types";
import { TProjectBotServiceFactory } from "@app/services/project-bot/project-bot-service";
@@ -53,6 +54,7 @@ type TSecretRotationQueueFactoryDep = {
secretVersionV2BridgeDAL: Pick<TSecretVersionV2DALFactory, "insertMany" | "findLatestVersionMany">;
telemetryService: Pick<TTelemetryServiceFactory, "sendPostHogEvents">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
folderCommitService: Pick<TFolderCommitServiceFactory, "createCommit">;
};
// These errors should stop the repeatable job and ask the user to reconfigure rotation
@@ -77,6 +79,7 @@ export const secretRotationQueueFactory = ({
telemetryService,
secretV2BridgeDAL,
secretVersionV2BridgeDAL,
folderCommitService,
kmsService
}: TSecretRotationQueueFactoryDep) => {
const addToQueue = async (rotationId: string, interval: number) => {
@@ -330,7 +333,7 @@ export const secretRotationQueueFactory = ({
})),
tx
);
await secretVersionV2BridgeDAL.insertMany(
const secretVersions = await secretVersionV2BridgeDAL.insertMany(
updatedSecrets.map(({ id, updatedAt, createdAt, ...el }) => ({
...el,
actorType: ActorType.PLATFORM,
@@ -338,6 +341,22 @@ export const secretRotationQueueFactory = ({
})),
tx
);
await folderCommitService.createCommit(
{
actor: {
type: ActorType.PLATFORM
},
message: "Changed by Secret rotation",
folderId: secretVersions[0].folderId,
changes: secretVersions.map((sv) => ({
type: "add",
isUpdate: true,
secretVersionId: sv.id
}))
},
tx
);
});
await secretV2BridgeDAL.invalidateSecretCacheByProjectId(secretRotation.projectId);
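
A minimal defensive sketch, not part of the diff above: the new createCommit call reads secretVersions[0].folderId, which assumes insertMany returned at least one row. If the caller does not guarantee that, the same commit could be guarded as follows (same folderCommitService, ActorType, and tx as in the hunk above):

// Hedged sketch: skip the folder commit when no secret versions were inserted.
if (secretVersions.length > 0) {
  await folderCommitService.createCommit(
    {
      actor: { type: ActorType.PLATFORM },
      message: "Changed by Secret rotation",
      folderId: secretVersions[0].folderId,
      changes: secretVersions.map((sv) => ({
        type: "add",
        isUpdate: true,
        secretVersionId: sv.id
      }))
    },
    tx
  );
}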

View File

@@ -0,0 +1,225 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { BadRequestError, DatabaseError } from "@app/lib/errors";
import { groupBy, unique } from "@app/lib/fn";
import { ormify } from "@app/lib/knex";
import { EHostGroupMembershipFilter } from "./ssh-host-group-types";
export type TSshHostGroupDALFactory = ReturnType<typeof sshHostGroupDALFactory>;
export const sshHostGroupDALFactory = (db: TDbClient) => {
const sshHostGroupOrm = ormify(db, TableName.SshHostGroup);
const findSshHostGroupsWithLoginMappings = async (projectId: string, tx?: Knex) => {
try {
// First, get all the SSH host groups with their login mappings
const rows = await (tx || db.replicaNode())(TableName.SshHostGroup)
.leftJoin(
TableName.SshHostLoginUser,
`${TableName.SshHostGroup}.id`,
`${TableName.SshHostLoginUser}.sshHostGroupId`
)
.leftJoin(
TableName.SshHostLoginUserMapping,
`${TableName.SshHostLoginUser}.id`,
`${TableName.SshHostLoginUserMapping}.sshHostLoginUserId`
)
.leftJoin(TableName.Users, `${TableName.SshHostLoginUserMapping}.userId`, `${TableName.Users}.id`)
.where(`${TableName.SshHostGroup}.projectId`, projectId)
.select(
db.ref("id").withSchema(TableName.SshHostGroup).as("sshHostGroupId"),
db.ref("projectId").withSchema(TableName.SshHostGroup),
db.ref("name").withSchema(TableName.SshHostGroup),
db.ref("loginUser").withSchema(TableName.SshHostLoginUser),
db.ref("username").withSchema(TableName.Users),
db.ref("userId").withSchema(TableName.SshHostLoginUserMapping)
)
.orderBy(`${TableName.SshHostGroup}.updatedAt`, "desc");
const hostsGrouped = groupBy(rows, (r) => r.sshHostGroupId);
const hostGroupIds = Object.keys(hostsGrouped);
type HostCountRow = {
sshHostGroupId: string;
host_count: string;
};
const hostCountsQuery = (await (tx ||
db
.replicaNode()(TableName.SshHostGroupMembership)
.select(`${TableName.SshHostGroupMembership}.sshHostGroupId`, db.raw(`count(*) as host_count`))
.whereIn(`${TableName.SshHostGroupMembership}.sshHostGroupId`, hostGroupIds)
.groupBy(`${TableName.SshHostGroupMembership}.sshHostGroupId`))) as HostCountRow[];
const hostCountsMap = hostCountsQuery.reduce<Record<string, number>>((acc, { sshHostGroupId, host_count }) => {
acc[sshHostGroupId] = Number(host_count);
return acc;
}, {});
return Object.values(hostsGrouped).map((hostRows) => {
const { sshHostGroupId, name } = hostRows[0];
const loginMappingGrouped = groupBy(
hostRows.filter((r) => r.loginUser),
(r) => r.loginUser
);
const loginMappings = Object.entries(loginMappingGrouped).map(([loginUser, entries]) => ({
loginUser,
allowedPrincipals: {
usernames: unique(entries.map((e) => e.username)).filter(Boolean)
}
}));
return {
id: sshHostGroupId,
projectId,
name,
loginMappings,
hostCount: hostCountsMap[sshHostGroupId] ?? 0
};
});
} catch (error) {
throw new DatabaseError({ error, name: `${TableName.SshHostGroup}: FindSshHostGroupsWithLoginMappings` });
}
};
const findSshHostGroupByIdWithLoginMappings = async (sshHostGroupId: string, tx?: Knex) => {
try {
const rows = await (tx || db.replicaNode())(TableName.SshHostGroup)
.leftJoin(
TableName.SshHostLoginUser,
`${TableName.SshHostGroup}.id`,
`${TableName.SshHostLoginUser}.sshHostGroupId`
)
.leftJoin(
TableName.SshHostLoginUserMapping,
`${TableName.SshHostLoginUser}.id`,
`${TableName.SshHostLoginUserMapping}.sshHostLoginUserId`
)
.leftJoin(TableName.Users, `${TableName.SshHostLoginUserMapping}.userId`, `${TableName.Users}.id`)
.where(`${TableName.SshHostGroup}.id`, sshHostGroupId)
.select(
db.ref("id").withSchema(TableName.SshHostGroup).as("sshHostGroupId"),
db.ref("projectId").withSchema(TableName.SshHostGroup),
db.ref("name").withSchema(TableName.SshHostGroup),
db.ref("loginUser").withSchema(TableName.SshHostLoginUser),
db.ref("username").withSchema(TableName.Users),
db.ref("userId").withSchema(TableName.SshHostLoginUserMapping)
);
if (rows.length === 0) return null;
const { sshHostGroupId: id, projectId, name } = rows[0];
const loginMappingGrouped = groupBy(
rows.filter((r) => r.loginUser),
(r) => r.loginUser
);
const loginMappings = Object.entries(loginMappingGrouped).map(([loginUser, entries]) => ({
loginUser,
allowedPrincipals: {
usernames: unique(entries.map((e) => e.username)).filter(Boolean)
}
}));
return {
id,
projectId,
name,
loginMappings
};
} catch (error) {
throw new DatabaseError({ error, name: `${TableName.SshHostGroup}: FindSshHostGroupByIdWithLoginMappings` });
}
};
const findAllSshHostsInGroup = async ({
sshHostGroupId,
offset = 0,
limit,
filter
}: {
sshHostGroupId: string;
offset?: number;
limit?: number;
filter?: EHostGroupMembershipFilter;
}) => {
try {
const sshHostGroup = await db
.replicaNode()(TableName.SshHostGroup)
.where(`${TableName.SshHostGroup}.id`, sshHostGroupId)
.select("projectId")
.first();
if (!sshHostGroup) {
throw new BadRequestError({
message: `SSH host group with ID ${sshHostGroupId} not found`
});
}
const query = db
.replicaNode()(TableName.SshHost)
.where(`${TableName.SshHost}.projectId`, sshHostGroup.projectId)
.leftJoin(TableName.SshHostGroupMembership, (bd) => {
bd.on(`${TableName.SshHostGroupMembership}.sshHostId`, "=", `${TableName.SshHost}.id`).andOn(
`${TableName.SshHostGroupMembership}.sshHostGroupId`,
"=",
db.raw("?", [sshHostGroupId])
);
})
.select(
db.ref("id").withSchema(TableName.SshHost),
db.ref("hostname").withSchema(TableName.SshHost),
db.ref("alias").withSchema(TableName.SshHost),
db.ref("sshHostGroupId").withSchema(TableName.SshHostGroupMembership),
db.ref("createdAt").withSchema(TableName.SshHostGroupMembership).as("joinedGroupAt"),
db.raw(`count(*) OVER() as total_count`)
)
.offset(offset)
.orderBy(`${TableName.SshHost}.hostname`, "asc");
if (limit) {
void query.limit(limit);
}
if (filter) {
switch (filter) {
case EHostGroupMembershipFilter.GROUP_MEMBERS:
void query.andWhere(`${TableName.SshHostGroupMembership}.createdAt`, "is not", null);
break;
case EHostGroupMembershipFilter.NON_GROUP_MEMBERS:
void query.andWhere(`${TableName.SshHostGroupMembership}.createdAt`, "is", null);
break;
default:
break;
}
}
const hosts = await query;
return {
hosts: hosts.map(({ id, hostname, alias, sshHostGroupId: memberGroupId, joinedGroupAt }) => ({
id,
hostname,
alias,
isPartOfGroup: !!memberGroupId,
joinedGroupAt
})),
// @ts-expect-error col select is raw and not strongly typed
totalCount: Number(hosts?.[0]?.total_count ?? 0)
};
} catch (error) {
throw new DatabaseError({ error, name: `${TableName.SshHostGroupMembership}: FindAllSshHostsInGroup` });
}
};
return {
findSshHostGroupsWithLoginMappings,
findSshHostGroupByIdWithLoginMappings,
findAllSshHostsInGroup,
...sshHostGroupOrm
};
};
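
A hedged usage sketch of the new DAL, not part of the diff: `db` is an assumed TDbClient instance, EHostGroupMembershipFilter comes from ssh-host-group-types further below, and IDs and paging values are placeholders.

// Illustrative only: list hosts that are currently members of a group, with a total count.
const sshHostGroupDAL = sshHostGroupDALFactory(db);
const listGroupMembers = async (sshHostGroupId: string) => {
  const { hosts, totalCount } = await sshHostGroupDAL.findAllSshHostsInGroup({
    sshHostGroupId,
    offset: 0,
    limit: 20,
    filter: EHostGroupMembershipFilter.GROUP_MEMBERS
  });
  return { hosts, totalCount };
};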

View File

@@ -0,0 +1,13 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TSshHostGroupMembershipDALFactory = ReturnType<typeof sshHostGroupMembershipDALFactory>;
export const sshHostGroupMembershipDALFactory = (db: TDbClient) => {
const sshHostGroupMembershipOrm = ormify(db, TableName.SshHostGroupMembership);
return {
...sshHostGroupMembershipOrm
};
};

View File

@@ -0,0 +1,7 @@
import { SshHostGroupsSchema } from "@app/db/schemas";
export const sanitizedSshHostGroup = SshHostGroupsSchema.pick({
id: true,
projectId: true,
name: true
});

View File

@@ -0,0 +1,397 @@
import { ForbiddenError } from "@casl/ability";
import { ActionProjectType } from "@app/db/schemas";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { TSshHostDALFactory } from "@app/ee/services/ssh-host/ssh-host-dal";
import { TSshHostLoginUserMappingDALFactory } from "@app/ee/services/ssh-host/ssh-host-login-user-mapping-dal";
import { TSshHostLoginUserDALFactory } from "@app/ee/services/ssh-host/ssh-login-user-dal";
import { TSshHostGroupDALFactory } from "@app/ee/services/ssh-host-group/ssh-host-group-dal";
import { TSshHostGroupMembershipDALFactory } from "@app/ee/services/ssh-host-group/ssh-host-group-membership-dal";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TUserDALFactory } from "@app/services/user/user-dal";
import { TLicenseServiceFactory } from "../license/license-service";
import { createSshLoginMappings } from "../ssh-host/ssh-host-fns";
import {
TAddHostToSshHostGroupDTO,
TCreateSshHostGroupDTO,
TDeleteSshHostGroupDTO,
TGetSshHostGroupDTO,
TListSshHostGroupHostsDTO,
TRemoveHostFromSshHostGroupDTO,
TUpdateSshHostGroupDTO
} from "./ssh-host-group-types";
type TSshHostGroupServiceFactoryDep = {
projectDAL: Pick<TProjectDALFactory, "findById" | "find">;
sshHostDAL: Pick<TSshHostDALFactory, "findSshHostByIdWithLoginMappings">;
sshHostGroupDAL: Pick<
TSshHostGroupDALFactory,
| "create"
| "updateById"
| "findById"
| "deleteById"
| "transaction"
| "findSshHostGroupByIdWithLoginMappings"
| "findAllSshHostsInGroup"
| "findOne"
| "find"
>;
sshHostGroupMembershipDAL: Pick<TSshHostGroupMembershipDALFactory, "create" | "deleteById" | "findOne">;
sshHostLoginUserDAL: Pick<TSshHostLoginUserDALFactory, "create" | "transaction" | "delete">;
sshHostLoginUserMappingDAL: Pick<TSshHostLoginUserMappingDALFactory, "insertMany">;
userDAL: Pick<TUserDALFactory, "find">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission" | "getUserProjectPermission">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
};
export type TSshHostGroupServiceFactory = ReturnType<typeof sshHostGroupServiceFactory>;
export const sshHostGroupServiceFactory = ({
projectDAL,
sshHostDAL,
sshHostGroupDAL,
sshHostGroupMembershipDAL,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
licenseService
}: TSshHostGroupServiceFactoryDep) => {
const createSshHostGroup = async ({
projectId,
name,
loginMappings,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TCreateSshHostGroupDTO) => {
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Create, ProjectPermissionSub.SshHostGroups);
const plan = await licenseService.getPlan(actorOrgId);
if (!plan.sshHostGroups)
throw new BadRequestError({
message: "Failed to create SSH host group due to plan restriction. Upgrade plan to create group."
});
const newSshHostGroup = await sshHostGroupDAL.transaction(async (tx) => {
// (dangtony98): room to optimize check to ensure that
// the SSH host group name is unique across the whole org
const project = await projectDAL.findById(projectId, tx);
if (!project) throw new NotFoundError({ message: `Project with ID '${projectId}' not found` });
const projects = await projectDAL.find(
{
orgId: project.orgId
},
{ tx }
);
const existingSshHostGroup = await sshHostGroupDAL.find(
{
name,
$in: {
projectId: projects.map((p) => p.id)
}
},
{ tx }
);
if (existingSshHostGroup.length) {
throw new BadRequestError({
message: `SSH host group with name '${name}' already exists in the organization`
});
}
const sshHostGroup = await sshHostGroupDAL.create(
{
projectId,
name
},
tx
);
await createSshLoginMappings({
sshHostGroupId: sshHostGroup.id,
loginMappings,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
projectId,
actorAuthMethod,
actorOrgId,
tx
});
const newSshHostGroupWithLoginMappings = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(
sshHostGroup.id,
tx
);
if (!newSshHostGroupWithLoginMappings) {
throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroup.id}' not found` });
}
return newSshHostGroupWithLoginMappings;
});
return newSshHostGroup;
};
const updateSshHostGroup = async ({
sshHostGroupId,
name,
loginMappings,
actor,
actorId,
actorAuthMethod,
actorOrgId
}: TUpdateSshHostGroupDTO) => {
const sshHostGroup = await sshHostGroupDAL.findById(sshHostGroupId);
if (!sshHostGroup) throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroupId}' not found` });
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Edit, ProjectPermissionSub.SshHostGroups);
const plan = await licenseService.getPlan(actorOrgId);
if (!plan.sshHostGroups)
throw new BadRequestError({
message: "Failed to update SSH host group due to plan restriction. Upgrade plan to update group."
});
const updatedSshHostGroup = await sshHostGroupDAL.transaction(async (tx) => {
await sshHostGroupDAL.updateById(
sshHostGroupId,
{
name
},
tx
);
if (loginMappings) {
await sshHostLoginUserDAL.delete({ sshHostGroupId: sshHostGroup.id }, tx);
if (loginMappings.length) {
await createSshLoginMappings({
sshHostGroupId: sshHostGroup.id,
loginMappings,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
tx
});
}
}
const updatedSshHostGroupWithLoginMappings = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(
sshHostGroup.id,
tx
);
if (!updatedSshHostGroupWithLoginMappings) {
throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroup.id}' not found` });
}
return updatedSshHostGroupWithLoginMappings;
});
return updatedSshHostGroup;
};
const getSshHostGroup = async ({
sshHostGroupId,
actor,
actorId,
actorAuthMethod,
actorOrgId
}: TGetSshHostGroupDTO) => {
const sshHostGroup = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(sshHostGroupId);
if (!sshHostGroup) throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroupId}' not found` });
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.SshHostGroups);
return sshHostGroup;
};
const deleteSshHostGroup = async ({
sshHostGroupId,
actor,
actorId,
actorAuthMethod,
actorOrgId
}: TDeleteSshHostGroupDTO) => {
const sshHostGroup = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(sshHostGroupId);
if (!sshHostGroup) throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroupId}' not found` });
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Delete, ProjectPermissionSub.SshHostGroups);
await sshHostGroupDAL.deleteById(sshHostGroupId);
return sshHostGroup;
};
const listSshHostGroupHosts = async ({
sshHostGroupId,
actor,
actorId,
actorAuthMethod,
actorOrgId,
filter
}: TListSshHostGroupHostsDTO) => {
const sshHostGroup = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(sshHostGroupId);
if (!sshHostGroup) throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroupId}' not found` });
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.SshHostGroups);
const { hosts, totalCount } = await sshHostGroupDAL.findAllSshHostsInGroup({ sshHostGroupId, filter });
return { sshHostGroup, hosts, totalCount };
};
const addHostToSshHostGroup = async ({
sshHostGroupId,
hostId,
actor,
actorId,
actorAuthMethod,
actorOrgId
}: TAddHostToSshHostGroupDTO) => {
const sshHostGroup = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(sshHostGroupId);
if (!sshHostGroup) throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroupId}' not found` });
const sshHost = await sshHostDAL.findSshHostByIdWithLoginMappings(hostId);
if (!sshHost) {
throw new NotFoundError({
message: `SSH host with ID ${hostId} not found`
});
}
if (sshHostGroup.projectId !== sshHost.projectId) {
throw new BadRequestError({
message: `SSH host with ID ${hostId} not found in project ${sshHostGroup.projectId}`
});
}
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Edit, ProjectPermissionSub.SshHostGroups);
await sshHostGroupMembershipDAL.create({ sshHostGroupId, sshHostId: hostId });
return { sshHostGroup, sshHost };
};
const removeHostFromSshHostGroup = async ({
sshHostGroupId,
hostId,
actor,
actorId,
actorAuthMethod,
actorOrgId
}: TRemoveHostFromSshHostGroupDTO) => {
const sshHostGroup = await sshHostGroupDAL.findSshHostGroupByIdWithLoginMappings(sshHostGroupId);
if (!sshHostGroup) throw new NotFoundError({ message: `SSH host group with ID '${sshHostGroupId}' not found` });
const sshHost = await sshHostDAL.findSshHostByIdWithLoginMappings(hostId);
if (!sshHost) {
throw new NotFoundError({
message: `SSH host with ID ${hostId} not found`
});
}
if (sshHostGroup.projectId !== sshHost.projectId) {
throw new BadRequestError({
message: `SSH host with ID ${hostId} not found in project ${sshHostGroup.projectId}`
});
}
const { permission } = await permissionService.getProjectPermission({
actor,
actorId,
projectId: sshHostGroup.projectId,
actorAuthMethod,
actorOrgId,
actionProjectType: ActionProjectType.SSH
});
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Edit, ProjectPermissionSub.SshHostGroups);
const sshHostGroupMembership = await sshHostGroupMembershipDAL.findOne({
sshHostGroupId,
sshHostId: hostId
});
if (!sshHostGroupMembership) {
throw new NotFoundError({
message: `SSH host with ID ${hostId} not found in SSH host group with ID ${sshHostGroupId}`
});
}
await sshHostGroupMembershipDAL.deleteById(sshHostGroupMembership.id);
return { sshHostGroup, sshHost };
};
return {
createSshHostGroup,
getSshHostGroup,
deleteSshHostGroup,
updateSshHostGroup,
listSshHostGroupHosts,
addHostToSshHostGroup,
removeHostFromSshHostGroup
};
};

View File

@@ -0,0 +1,46 @@
import { TLoginMapping } from "@app/ee/services/ssh-host/ssh-host-types";
import { TProjectPermission } from "@app/lib/types";
export type TCreateSshHostGroupDTO = {
name: string;
loginMappings: TLoginMapping[];
} & TProjectPermission;
export type TUpdateSshHostGroupDTO = {
sshHostGroupId: string;
name?: string;
loginMappings?: {
loginUser: string;
allowedPrincipals: {
usernames: string[];
};
}[];
} & Omit<TProjectPermission, "projectId">;
export type TGetSshHostGroupDTO = {
sshHostGroupId: string;
} & Omit<TProjectPermission, "projectId">;
export type TDeleteSshHostGroupDTO = {
sshHostGroupId: string;
} & Omit<TProjectPermission, "projectId">;
export type TListSshHostGroupHostsDTO = {
sshHostGroupId: string;
filter?: EHostGroupMembershipFilter;
} & Omit<TProjectPermission, "projectId">;
export type TAddHostToSshHostGroupDTO = {
sshHostGroupId: string;
hostId: string;
} & Omit<TProjectPermission, "projectId">;
export type TRemoveHostFromSshHostGroupDTO = {
sshHostGroupId: string;
hostId: string;
} & Omit<TProjectPermission, "projectId">;
export enum EHostGroupMembershipFilter {
GROUP_MEMBERS = "group-members",
NON_GROUP_MEMBERS = "non-group-members"
}

View File

@@ -6,6 +6,8 @@ import { DatabaseError } from "@app/lib/errors";
import { groupBy, unique } from "@app/lib/fn";
import { ormify } from "@app/lib/knex";
import { LoginMappingSource } from "./ssh-host-types";
export type TSshHostDALFactory = ReturnType<typeof sshHostDALFactory>;
export const sshHostDALFactory = (db: TDbClient) => {
@@ -13,20 +15,22 @@ export const sshHostDALFactory = (db: TDbClient) => {
const findUserAccessibleSshHosts = async (projectIds: string[], userId: string, tx?: Knex) => {
try {
const user = await (tx || db.replicaNode())(TableName.Users).where({ id: userId }).select("username").first();
const knex = tx || db.replicaNode();
const user = await knex(TableName.Users).where({ id: userId }).select("username").first();
if (!user) {
throw new DatabaseError({ name: `${TableName.Users}: UserNotFound`, error: new Error("User not found") });
}
const rows = await (tx || db.replicaNode())(TableName.SshHost)
// get hosts where user has direct login mappings
const directHostRows = await knex(TableName.SshHost)
.leftJoin(TableName.SshHostLoginUser, `${TableName.SshHost}.id`, `${TableName.SshHostLoginUser}.sshHostId`)
.leftJoin(
TableName.SshHostLoginUserMapping,
`${TableName.SshHostLoginUser}.id`,
`${TableName.SshHostLoginUserMapping}.sshHostLoginUserId`
)
.leftJoin(TableName.Users, `${TableName.Users}.id`, `${TableName.SshHostLoginUserMapping}.userId`)
.whereIn(`${TableName.SshHost}.projectId`, projectIds)
.andWhere(`${TableName.SshHostLoginUserMapping}.userId`, userId)
.select(
@@ -37,26 +41,70 @@ export const sshHostDALFactory = (db: TDbClient) => {
db.ref("userCertTtl").withSchema(TableName.SshHost),
db.ref("hostCertTtl").withSchema(TableName.SshHost),
db.ref("loginUser").withSchema(TableName.SshHostLoginUser),
db.ref("username").withSchema(TableName.Users),
db.ref("userId").withSchema(TableName.SshHostLoginUserMapping),
db.ref("userSshCaId").withSchema(TableName.SshHost),
db.ref("hostSshCaId").withSchema(TableName.SshHost)
)
.orderBy(`${TableName.SshHost}.updatedAt`, "desc");
);
const grouped = groupBy(rows, (r) => r.sshHostId);
return Object.values(grouped).map((hostRows) => {
// get hosts where user has login mappings via host groups
const groupHostRows = await knex(TableName.SshHostGroupMembership)
.join(
TableName.SshHostLoginUser,
`${TableName.SshHostGroupMembership}.sshHostGroupId`,
`${TableName.SshHostLoginUser}.sshHostGroupId`
)
.leftJoin(
TableName.SshHostLoginUserMapping,
`${TableName.SshHostLoginUser}.id`,
`${TableName.SshHostLoginUserMapping}.sshHostLoginUserId`
)
.join(TableName.SshHost, `${TableName.SshHostGroupMembership}.sshHostId`, `${TableName.SshHost}.id`)
.whereIn(`${TableName.SshHost}.projectId`, projectIds)
.andWhere(`${TableName.SshHostLoginUserMapping}.userId`, userId)
.select(
db.ref("id").withSchema(TableName.SshHost).as("sshHostId"),
db.ref("projectId").withSchema(TableName.SshHost),
db.ref("hostname").withSchema(TableName.SshHost),
db.ref("alias").withSchema(TableName.SshHost),
db.ref("userCertTtl").withSchema(TableName.SshHost),
db.ref("hostCertTtl").withSchema(TableName.SshHost),
db.ref("loginUser").withSchema(TableName.SshHostLoginUser),
db.ref("userSshCaId").withSchema(TableName.SshHost),
db.ref("hostSshCaId").withSchema(TableName.SshHost)
);
const directHostRowsWithSource = directHostRows.map((row) => ({
...row,
source: LoginMappingSource.HOST
}));
const groupHostRowsWithSource = groupHostRows.map((row) => ({
...row,
source: LoginMappingSource.HOST_GROUP
}));
const mergedRows = [...directHostRowsWithSource, ...groupHostRowsWithSource];
const hostsGrouped = groupBy(mergedRows, (r) => r.sshHostId);
return Object.values(hostsGrouped).map((hostRows) => {
const { sshHostId, hostname, alias, userCertTtl, hostCertTtl, userSshCaId, hostSshCaId, projectId } =
hostRows[0];
const loginMappingGrouped = groupBy(hostRows, (r) => r.loginUser);
const loginMappings = Object.entries(loginMappingGrouped).map(([loginUser, mappings]) => {
// Prefer HOST source over HOST_GROUP
const preferredMapping =
mappings.find((m) => m.source === LoginMappingSource.HOST) ||
mappings.find((m) => m.source === LoginMappingSource.HOST_GROUP);
const loginMappings = Object.entries(loginMappingGrouped).map(([loginUser]) => ({
loginUser,
allowedPrincipals: {
usernames: [user.username]
}
}));
return {
loginUser,
allowedPrincipals: {
usernames: [user.username]
},
source: preferredMapping!.source
};
});
return {
id: sshHostId,
@@ -101,20 +149,57 @@ export const sshHostDALFactory = (db: TDbClient) => {
)
.orderBy(`${TableName.SshHost}.updatedAt`, "desc");
// process login mappings inherited from groups that hosts are part of
const hostIds = unique(rows.map((r) => r.sshHostId)).filter(Boolean);
const groupRows = await (tx || db.replicaNode())(TableName.SshHostGroupMembership)
.join(
TableName.SshHostLoginUser,
`${TableName.SshHostGroupMembership}.sshHostGroupId`,
`${TableName.SshHostLoginUser}.sshHostGroupId`
)
.leftJoin(
TableName.SshHostLoginUserMapping,
`${TableName.SshHostLoginUser}.id`,
`${TableName.SshHostLoginUserMapping}.sshHostLoginUserId`
)
.leftJoin(TableName.Users, `${TableName.SshHostLoginUserMapping}.userId`, `${TableName.Users}.id`)
.select(
db.ref("sshHostId").withSchema(TableName.SshHostGroupMembership),
db.ref("loginUser").withSchema(TableName.SshHostLoginUser),
db.ref("username").withSchema(TableName.Users)
)
.whereIn(`${TableName.SshHostGroupMembership}.sshHostId`, hostIds);
const groupedGroupMappings = groupBy(groupRows, (r) => r.sshHostId);
const hostsGrouped = groupBy(rows, (r) => r.sshHostId);
return Object.values(hostsGrouped).map((hostRows) => {
const { sshHostId, hostname, alias, userCertTtl, hostCertTtl, userSshCaId, hostSshCaId } = hostRows[0];
// direct login mappings
const loginMappingGrouped = groupBy(
hostRows.filter((r) => r.loginUser),
(r) => r.loginUser
);
const loginMappings = Object.entries(loginMappingGrouped).map(([loginUser, entries]) => ({
const directMappings = Object.entries(loginMappingGrouped).map(([loginUser, entries]) => ({
loginUser,
allowedPrincipals: {
usernames: unique(entries.map((e) => e.username)).filter(Boolean)
}
},
source: LoginMappingSource.HOST
}));
// group-inherited login mappings
const inheritedGroupRows = groupedGroupMappings[sshHostId] || [];
const inheritedGrouped = groupBy(inheritedGroupRows, (r) => r.loginUser);
const groupMappings = Object.entries(inheritedGrouped).map(([loginUser, entries]) => ({
loginUser,
allowedPrincipals: {
usernames: unique(entries.map((e) => e.username)).filter(Boolean)
},
source: LoginMappingSource.HOST_GROUP
}));
return {
@@ -124,7 +209,7 @@ export const sshHostDALFactory = (db: TDbClient) => {
projectId,
userCertTtl,
hostCertTtl,
loginMappings,
loginMappings: [...directMappings, ...groupMappings],
userSshCaId,
hostSshCaId
};
@@ -163,16 +248,50 @@ export const sshHostDALFactory = (db: TDbClient) => {
const { sshHostId: id, projectId, hostname, alias, userCertTtl, hostCertTtl, userSshCaId, hostSshCaId } = rows[0];
const loginMappingGrouped = groupBy(
// direct login mappings
const directGrouped = groupBy(
rows.filter((r) => r.loginUser),
(r) => r.loginUser
);
const loginMappings = Object.entries(loginMappingGrouped).map(([loginUser, entries]) => ({
const directMappings = Object.entries(directGrouped).map(([loginUser, entries]) => ({
loginUser,
allowedPrincipals: {
usernames: unique(entries.map((e) => e.username)).filter(Boolean)
}
},
source: LoginMappingSource.HOST
}));
// group login mappings
const groupRows = await (tx || db.replicaNode())(TableName.SshHostGroupMembership)
.join(
TableName.SshHostLoginUser,
`${TableName.SshHostGroupMembership}.sshHostGroupId`,
`${TableName.SshHostLoginUser}.sshHostGroupId`
)
.leftJoin(
TableName.SshHostLoginUserMapping,
`${TableName.SshHostLoginUser}.id`,
`${TableName.SshHostLoginUserMapping}.sshHostLoginUserId`
)
.leftJoin(TableName.Users, `${TableName.SshHostLoginUserMapping}.userId`, `${TableName.Users}.id`)
.where(`${TableName.SshHostGroupMembership}.sshHostId`, sshHostId)
.select(
db.ref("loginUser").withSchema(TableName.SshHostLoginUser),
db.ref("username").withSchema(TableName.Users)
);
const groupGrouped = groupBy(
groupRows.filter((r) => r.loginUser),
(r) => r.loginUser
);
const groupMappings = Object.entries(groupGrouped).map(([loginUser, entries]) => ({
loginUser,
allowedPrincipals: {
usernames: unique(entries.map((e) => e.username)).filter(Boolean)
},
source: LoginMappingSource.HOST_GROUP
}));
return {
@@ -182,7 +301,7 @@ export const sshHostDALFactory = (db: TDbClient) => {
alias,
userCertTtl,
hostCertTtl,
loginMappings,
loginMappings: [...directMappings, ...groupMappings],
userSshCaId,
hostSshCaId
};

View File

@@ -0,0 +1,85 @@
import { Knex } from "knex";
import { ActionProjectType } from "@app/db/schemas";
import { BadRequestError } from "@app/lib/errors";
import { TCreateSshLoginMappingsDTO } from "./ssh-host-types";
/**
* Create SSH login mappings for a given SSH host
* or SSH host group.
*/
export const createSshLoginMappings = async ({
sshHostId,
sshHostGroupId,
loginMappings,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
projectId,
actorAuthMethod,
actorOrgId,
tx: outerTx
}: TCreateSshLoginMappingsDTO) => {
const processCreation = async (tx: Knex) => {
// (dangtony98): room to optimize
for await (const { loginUser, allowedPrincipals } of loginMappings) {
const sshHostLoginUser = await sshHostLoginUserDAL.create(
// (dangtony98): should either pass in sshHostId or sshHostGroupId but not both
{
sshHostId,
sshHostGroupId,
loginUser
},
tx
);
if (allowedPrincipals.usernames.length > 0) {
const users = await userDAL.find(
{
$in: {
username: allowedPrincipals.usernames
}
},
{ tx }
);
const foundUsernames = new Set(users.map((u) => u.username));
for (const uname of allowedPrincipals.usernames) {
if (!foundUsernames.has(uname)) {
throw new BadRequestError({
message: `Invalid username: ${uname}`
});
}
}
for await (const user of users) {
// check that each user has access to the SSH project
await permissionService.getUserProjectPermission({
userId: user.id,
projectId,
authMethod: actorAuthMethod,
userOrgId: actorOrgId,
actionProjectType: ActionProjectType.SSH
});
}
await sshHostLoginUserMappingDAL.insertMany(
users.map((user) => ({
sshHostLoginUserId: sshHostLoginUser.id,
userId: user.id
})),
tx
);
}
}
};
if (outerTx) {
return processCreation(outerTx);
}
return sshHostLoginUserDAL.transaction(processCreation);
};
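
The closing branch above implements a common "join the caller's transaction or open a new one" pattern. A minimal generic sketch of the same idea (names are illustrative, not from the codebase):

// Illustrative only: run `work` inside an existing Knex transaction when one is
// provided, otherwise start a new transaction that commits or rolls back around it.
const withTransaction = async <T>(db: Knex, work: (tx: Knex) => Promise<T>, outerTx?: Knex): Promise<T> => {
  if (outerTx) {
    return work(outerTx);
  }
  return db.transaction(work);
};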

View File

@@ -26,6 +26,7 @@ import {
getSshPublicKey
} from "../ssh/ssh-certificate-authority-fns";
import { SshCertType } from "../ssh/ssh-certificate-authority-types";
import { createSshLoginMappings } from "./ssh-host-fns";
import {
TCreateSshHostDTO,
TDeleteSshHostDTO,
@@ -202,56 +203,18 @@ export const sshHostServiceFactory = ({
tx
);
// (dangtony98): room to optimize
for await (const { loginUser, allowedPrincipals } of loginMappings) {
const sshHostLoginUser = await sshHostLoginUserDAL.create(
{
sshHostId: host.id,
loginUser
},
tx
);
if (allowedPrincipals.usernames.length > 0) {
const users = await userDAL.find(
{
$in: {
username: allowedPrincipals.usernames
}
},
{ tx }
);
const foundUsernames = new Set(users.map((u) => u.username));
for (const uname of allowedPrincipals.usernames) {
if (!foundUsernames.has(uname)) {
throw new BadRequestError({
message: `Invalid username: ${uname}`
});
}
}
for await (const user of users) {
// check that each user has access to the SSH project
await permissionService.getUserProjectPermission({
userId: user.id,
projectId,
authMethod: actorAuthMethod,
userOrgId: actorOrgId,
actionProjectType: ActionProjectType.SSH
});
}
await sshHostLoginUserMappingDAL.insertMany(
users.map((user) => ({
sshHostLoginUserId: sshHostLoginUser.id,
userId: user.id
})),
tx
);
}
}
await createSshLoginMappings({
sshHostId: host.id,
loginMappings,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
projectId,
actorAuthMethod,
actorOrgId,
tx
});
const newSshHostWithLoginMappings = await sshHostDAL.findSshHostByIdWithLoginMappings(host.id, tx);
if (!newSshHostWithLoginMappings) {
@@ -310,54 +273,18 @@ export const sshHostServiceFactory = ({
if (loginMappings) {
await sshHostLoginUserDAL.delete({ sshHostId: host.id }, tx);
if (loginMappings.length) {
for await (const { loginUser, allowedPrincipals } of loginMappings) {
const sshHostLoginUser = await sshHostLoginUserDAL.create(
{
sshHostId: host.id,
loginUser
},
tx
);
if (allowedPrincipals.usernames.length > 0) {
const users = await userDAL.find(
{
$in: {
username: allowedPrincipals.usernames
}
},
{ tx }
);
const foundUsernames = new Set(users.map((u) => u.username));
for (const uname of allowedPrincipals.usernames) {
if (!foundUsernames.has(uname)) {
throw new BadRequestError({
message: `Invalid username: ${uname}`
});
}
}
for await (const user of users) {
await permissionService.getUserProjectPermission({
userId: user.id,
projectId: host.projectId,
authMethod: actorAuthMethod,
userOrgId: actorOrgId,
actionProjectType: ActionProjectType.SSH
});
}
await sshHostLoginUserMappingDAL.insertMany(
users.map((user) => ({
sshHostLoginUserId: sshHostLoginUser.id,
userId: user.id
})),
tx
);
}
}
await createSshLoginMappings({
sshHostId: host.id,
loginMappings,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
projectId: host.projectId,
actorAuthMethod,
actorOrgId,
tx
});
}
}

View File

@@ -1,18 +1,32 @@
import { Knex } from "knex";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { TSshHostLoginUserMappingDALFactory } from "@app/ee/services/ssh-host/ssh-host-login-user-mapping-dal";
import { TSshHostLoginUserDALFactory } from "@app/ee/services/ssh-host/ssh-login-user-dal";
import { TProjectPermission } from "@app/lib/types";
import { ActorAuthMethod } from "@app/services/auth/auth-type";
import { TUserDALFactory } from "@app/services/user/user-dal";
export type TListSshHostsDTO = Omit<TProjectPermission, "projectId">;
export type TLoginMapping = {
loginUser: string;
allowedPrincipals: {
usernames: string[];
};
};
export enum LoginMappingSource {
HOST = "host",
HOST_GROUP = "hostGroup"
}
export type TCreateSshHostDTO = {
hostname: string;
alias?: string;
userCertTtl: string;
hostCertTtl: string;
loginMappings: {
loginUser: string;
allowedPrincipals: {
usernames: string[];
};
}[];
loginMappings: TLoginMapping[];
userSshCaId?: string;
hostSshCaId?: string;
} & TProjectPermission;
@@ -23,12 +37,7 @@ export type TUpdateSshHostDTO = {
alias?: string;
userCertTtl?: string;
hostCertTtl?: string;
loginMappings?: {
loginUser: string;
allowedPrincipals: {
usernames: string[];
};
}[];
loginMappings?: TLoginMapping[];
} & Omit<TProjectPermission, "projectId">;
export type TGetSshHostDTO = {
@@ -48,3 +57,19 @@ export type TIssueSshHostHostCertDTO = {
sshHostId: string;
publicKey: string;
} & Omit<TProjectPermission, "projectId">;
type BaseCreateSshLoginMappingsDTO = {
loginMappings: TLoginMapping[];
sshHostLoginUserDAL: Pick<TSshHostLoginUserDALFactory, "create" | "transaction">;
sshHostLoginUserMappingDAL: Pick<TSshHostLoginUserMappingDALFactory, "insertMany">;
userDAL: Pick<TUserDALFactory, "find">;
permissionService: Pick<TPermissionServiceFactory, "getUserProjectPermission">;
projectId: string;
actorAuthMethod: ActorAuthMethod;
actorOrgId: string;
tx?: Knex;
};
export type TCreateSshLoginMappingsDTO =
| (BaseCreateSshLoginMappingsDTO & { sshHostId: string; sshHostGroupId?: undefined })
| (BaseCreateSshLoginMappingsDTO & { sshHostGroupId: string; sshHostId?: undefined });
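
The two-branch union above is the standard TypeScript pattern for "exactly one of two IDs". A minimal self-contained sketch of the same pattern (names are illustrative, not from the codebase):

type WithHost = { sshHostId: string; sshHostGroupId?: undefined };
type WithGroup = { sshHostGroupId: string; sshHostId?: undefined };
type Target = WithHost | WithGroup;

const forHost: Target = { sshHostId: "host-1" }; // allowed
const forGroup: Target = { sshHostGroupId: "group-1" }; // allowed
// const both: Target = { sshHostId: "h", sshHostGroupId: "g" }; // rejected by the compiler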

View File

@@ -282,7 +282,7 @@ export const sshCertificateAuthorityServiceFactory = ({
// set [keyId] depending on if [allowCustomKeyIds] is true or false
const keyId = sshCertificateTemplate.allowCustomKeyIds
? requestedKeyId ?? `${actor}-${actorId}`
? (requestedKeyId ?? `${actor}-${actorId}`)
: `${actor}-${actorId}`;
const sshCaSecret = await sshCertificateAuthoritySecretDAL.findOne({ sshCaId: sshCertificateTemplate.sshCaId });
@@ -404,7 +404,7 @@ export const sshCertificateAuthorityServiceFactory = ({
// set [keyId] depending on if [allowCustomKeyIds] is true or false
const keyId = sshCertificateTemplate.allowCustomKeyIds
? requestedKeyId ?? `${actor}-${actorId}`
? (requestedKeyId ?? `${actor}-${actorId}`)
: `${actor}-${actorId}`;
const sshCaSecret = await sshCertificateAuthoritySecretDAL.findOne({ sshCaId: sshCertificateTemplate.sshCaId });

View File

@@ -48,6 +48,8 @@ export enum ApiDocsTags {
SshCertificates = "SSH Certificates",
SshCertificateAuthorities = "SSH Certificate Authorities",
SshCertificateTemplates = "SSH Certificate Templates",
SshHosts = "SSH Hosts",
SshHostGroups = "SSH Host Groups",
KmsKeys = "KMS Keys",
KmsEncryption = "KMS Encryption",
KmsSigning = "KMS Signing"
@@ -568,6 +570,9 @@ export const PROJECTS = {
LIST_SSH_HOSTS: {
projectId: "The ID of the project to list SSH hosts for."
},
LIST_SSH_HOST_GROUPS: {
projectId: "The ID of the project to list SSH host groups for."
},
LIST_SSH_CERTIFICATES: {
projectId: "The ID of the project to list SSH certificates for.",
offset: "The offset to start from. If you enter 10, it will start from the 10th SSH certificate.",
@@ -1382,6 +1387,40 @@ export const SSH_CERTIFICATE_TEMPLATES = {
}
};
export const SSH_HOST_GROUPS = {
GET: {
sshHostGroupId: "The ID of the SSH host group to get.",
filter: "The filter to apply to the SSH hosts in the SSH host group."
},
CREATE: {
projectId: "The ID of the project to create the SSH host group in.",
name: "The name of the SSH host group.",
loginMappings:
"A list of default login mappings to include on each host in the SSH host group. Each login mapping contains a login user and a list of allowed principals, which are usernames of users in the Infisical SSH project."
},
UPDATE: {
sshHostGroupId: "The ID of the SSH host group to update.",
name: "The name of the SSH host group to update to.",
loginMappings:
"A list of default login mappings to include on each host in the SSH host group. Each login mapping contains a login user and a list of allowed principals, which are usernames of users in the Infisical SSH project."
},
DELETE: {
sshHostGroupId: "The ID of the SSH host group to delete."
},
LIST_HOSTS: {
offset: "The offset to start from. If you enter 10, it will start from the 10th host",
limit: "The number of hosts to return."
},
ADD_HOST: {
sshHostGroupId: "The ID of the SSH host group to add the host to.",
hostId: "The ID of the SSH host to add to the SSH host group."
},
DELETE_HOST: {
sshHostGroupId: "The ID of the SSH host group to delete the host from.",
hostId: "The ID of the SSH host to delete from the SSH host group."
}
};
export const SSH_HOSTS = {
GET: {
sshHostId: "The ID of the SSH host to get."
@@ -1862,6 +1901,13 @@ export const AppConnections = {
instanceUrl: "The Windmill instance URL to connect with (defaults to https://app.windmill.dev).",
accessToken: "The access token to use to connect with Windmill."
},
HC_VAULT: {
instanceUrl: "The Hashicrop Vault instance URL to connect with.",
namespace: "The Hashicrop Vault namespace to connect with.",
accessToken: "The access token used to connect with Hashicorp Vault.",
roleId: "The Role ID used to connect with Hashicorp Vault.",
secretId: "The Secret ID used to connect with Hashicorp Vault."
},
LDAP: {
provider: "The type of LDAP provider. Determines provider-specific behaviors.",
url: "The LDAP/LDAPS URL to connect to (e.g., 'ldap://domain-or-ip:389' or 'ldaps://domain-or-ip:636').",
@@ -2019,6 +2065,10 @@ export const SecretSyncs = {
workspace: "The Windmill workspace to sync secrets to.",
path: "The Windmill workspace path to sync secrets to."
},
HC_VAULT: {
mount: "The Hashicorp Vault Secrets Engine Mount to sync secrets to.",
path: "The Hashicorp Vault path to sync secrets to."
},
TEAMCITY: {
project: "The TeamCity project to sync secrets to.",
buildConfig: "The TeamCity build configuration to sync secrets to."

View File

@@ -228,6 +228,10 @@ const envSchema = z
DATADOG_SERVICE: zpStr(z.string().optional().default("infisical-core")),
DATADOG_HOSTNAME: zpStr(z.string().optional()),
// PIT
PIT_CHECKPOINT_WINDOW: zpStr(z.string().optional().default("10")),
PIT_TREE_CHECKPOINT_WINDOW: zpStr(z.string().optional().default("100")),
/* CORS ----------------------------------------------------------------------------- */
CORS_ALLOWED_ORIGINS: zpStr(
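
The two PIT variables added above are parsed as strings with defaults "10" and "100". A hedged sketch, not from the diff, of how a consumer might coerce and apply a window (names are illustrative):

// Illustrative only: decide whether enough commits have accumulated to create a checkpoint.
const checkpointWindow = Number(process.env.PIT_CHECKPOINT_WINDOW ?? "10");
const shouldCreateCheckpoint = (commitsSinceLastCheckpoint: number): boolean =>
  commitsSinceLastCheckpoint >= checkpointWindow;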

View File

@@ -0,0 +1,98 @@
import { validateMicrosoftTeamsChannelsSchema } from "@app/services/microsoft-teams/microsoft-teams-fns";
import { sendSlackNotification } from "@app/services/slack/slack-fns";
import { logger } from "../logger";
import { TriggerFeature, TTriggerWorkflowNotificationDTO } from "./types";
export const triggerWorkflowIntegrationNotification = async (dto: TTriggerWorkflowNotificationDTO) => {
try {
const { projectId, notification } = dto.input;
const { projectDAL, projectSlackConfigDAL, kmsService, projectMicrosoftTeamsConfigDAL, microsoftTeamsService } =
dto.dependencies;
const project = await projectDAL.findById(projectId);
if (!project) {
return;
}
const microsoftTeamsConfig = await projectMicrosoftTeamsConfigDAL.getIntegrationDetailsByProject(projectId);
const slackConfig = await projectSlackConfigDAL.getIntegrationDetailsByProject(projectId);
if (slackConfig) {
if (notification.type === TriggerFeature.ACCESS_REQUEST) {
const targetChannelIds = slackConfig.accessRequestChannels?.split(", ") || [];
if (targetChannelIds.length && slackConfig.isAccessRequestNotificationEnabled) {
await sendSlackNotification({
orgId: project.orgId,
notification,
kmsService,
targetChannelIds,
slackIntegration: slackConfig
}).catch((error) => {
logger.error(error, "Error sending Slack notification");
});
}
} else if (notification.type === TriggerFeature.SECRET_APPROVAL) {
const targetChannelIds = slackConfig.secretRequestChannels?.split(", ") || [];
if (targetChannelIds.length && slackConfig.isSecretRequestNotificationEnabled) {
await sendSlackNotification({
orgId: project.orgId,
notification,
kmsService,
targetChannelIds,
slackIntegration: slackConfig
}).catch((error) => {
logger.error(error, "Error sending Slack notification");
});
}
}
}
if (microsoftTeamsConfig) {
if (notification.type === TriggerFeature.ACCESS_REQUEST) {
if (microsoftTeamsConfig.isAccessRequestNotificationEnabled && microsoftTeamsConfig.accessRequestChannels) {
const { success, data } = validateMicrosoftTeamsChannelsSchema.safeParse(
microsoftTeamsConfig.accessRequestChannels
);
if (success && data) {
await microsoftTeamsService
.sendNotification({
notification,
target: data,
tenantId: microsoftTeamsConfig.tenantId,
microsoftTeamsIntegrationId: microsoftTeamsConfig.id,
orgId: project.orgId
})
.catch((error) => {
logger.error(error, "Error sending Microsoft Teams notification");
});
}
}
} else if (notification.type === TriggerFeature.SECRET_APPROVAL) {
if (microsoftTeamsConfig.isSecretRequestNotificationEnabled && microsoftTeamsConfig.secretRequestChannels) {
const { success, data } = validateMicrosoftTeamsChannelsSchema.safeParse(
microsoftTeamsConfig.secretRequestChannels
);
if (success && data) {
await microsoftTeamsService
.sendNotification({
notification,
target: data,
tenantId: microsoftTeamsConfig.tenantId,
microsoftTeamsIntegrationId: microsoftTeamsConfig.id,
orgId: project.orgId
})
.catch((error) => {
logger.error(error, "Error sending Microsoft Teams notification");
});
}
}
}
}
} catch (error) {
logger.error(error, "Error triggering workflow integration notification");
}
};

View File

@@ -0,0 +1,51 @@
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TMicrosoftTeamsServiceFactory } from "@app/services/microsoft-teams/microsoft-teams-service";
import { TProjectMicrosoftTeamsConfigDALFactory } from "@app/services/microsoft-teams/project-microsoft-teams-config-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectSlackConfigDALFactory } from "@app/services/slack/project-slack-config-dal";
export enum TriggerFeature {
SECRET_APPROVAL = "secret-approval",
ACCESS_REQUEST = "access-request"
}
export type TNotification =
| {
type: TriggerFeature.SECRET_APPROVAL;
payload: {
userEmail: string;
environment: string;
secretPath: string;
requestId: string;
projectId: string;
secretKeys: string[];
};
}
| {
type: TriggerFeature.ACCESS_REQUEST;
payload: {
requesterFullName: string;
requesterEmail: string;
isTemporary: boolean;
secretPath: string;
environment: string;
projectName: string;
permissions: string[];
approvalUrl: string;
note?: string;
};
};
export type TTriggerWorkflowNotificationDTO = {
input: {
projectId: string;
notification: TNotification;
};
dependencies: {
projectDAL: Pick<TProjectDALFactory, "findById">;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "getIntegrationDetailsByProject">;
projectMicrosoftTeamsConfigDAL: Pick<TProjectMicrosoftTeamsConfigDALFactory, "getIntegrationDetailsByProject">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
microsoftTeamsService: Pick<TMicrosoftTeamsServiceFactory, "sendNotification">;
};
};
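
A hedged usage sketch tying the helper and its DTO together, not part of the diff: the DAL and service dependencies are assumed to come from the registerRoutes wiring shown further below, the call runs in an async context, and all values are placeholders.

await triggerWorkflowIntegrationNotification({
  input: {
    projectId: "<project-id>",
    notification: {
      type: TriggerFeature.ACCESS_REQUEST,
      payload: {
        requesterFullName: "Jane Doe",
        requesterEmail: "jane@example.com",
        isTemporary: false,
        secretPath: "/",
        environment: "dev",
        projectName: "Example Project",
        permissions: ["read"],
        approvalUrl: "https://app.infisical.com/..."
      }
    }
  },
  dependencies: {
    projectDAL,
    projectSlackConfigDAL,
    projectMicrosoftTeamsConfigDAL,
    kmsService,
    microsoftTeamsService
  }
});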

View File

@@ -49,7 +49,8 @@ export enum QueueName {
AccessTokenStatusUpdate = "access-token-status-update",
ImportSecretsFromExternalSource = "import-secrets-from-external-source",
AppConnectionSecretSync = "app-connection-secret-sync",
SecretRotationV2 = "secret-rotation-v2"
SecretRotationV2 = "secret-rotation-v2",
FolderTreeCheckpoint = "folder-tree-checkpoint"
}
export enum QueueJobs {
@@ -81,7 +82,8 @@ export enum QueueJobs {
SecretSyncSendActionFailedNotifications = "secret-sync-send-action-failed-notifications",
SecretRotationV2QueueRotations = "secret-rotation-v2-queue-rotations",
SecretRotationV2RotateSecrets = "secret-rotation-v2-rotate-secrets",
SecretRotationV2SendNotification = "secret-rotation-v2-send-notification"
SecretRotationV2SendNotification = "secret-rotation-v2-send-notification",
CreateFolderTreeCheckpoint = "create-folder-tree-checkpoint"
}
export type TQueueJobTypes = {
@@ -191,6 +193,12 @@ export type TQueueJobTypes = {
name: QueueJobs.ProjectV3Migration;
payload: { projectId: string };
};
[QueueName.FolderTreeCheckpoint]: {
name: QueueJobs.CreateFolderTreeCheckpoint;
payload: {
envId: string;
};
};
[QueueName.ImportSecretsFromExternalSource]: {
name: QueueJobs.ImportSecretsFromExternalSource;
payload: {

View File

@@ -111,6 +111,11 @@ export const injectIdentity = fp(async (server: FastifyZodProvider) => {
return;
}
// Authentication is handled at the route level here.
if (req.url.includes("/api/v1/workflow-integrations/microsoft-teams/message-endpoint")) {
return;
}
const { authMode, token, actor } = await extractAuth(req, appCfg.AUTH_SECRET);
if (!authMode) return;

View File

@@ -103,6 +103,9 @@ import { sshHostDALFactory } from "@app/ee/services/ssh-host/ssh-host-dal";
import { sshHostLoginUserMappingDALFactory } from "@app/ee/services/ssh-host/ssh-host-login-user-mapping-dal";
import { sshHostServiceFactory } from "@app/ee/services/ssh-host/ssh-host-service";
import { sshHostLoginUserDALFactory } from "@app/ee/services/ssh-host/ssh-login-user-dal";
import { sshHostGroupDALFactory } from "@app/ee/services/ssh-host-group/ssh-host-group-dal";
import { sshHostGroupMembershipDALFactory } from "@app/ee/services/ssh-host-group/ssh-host-group-membership-dal";
import { sshHostGroupServiceFactory } from "@app/ee/services/ssh-host-group/ssh-host-group-service";
import { trustedIpDALFactory } from "@app/ee/services/trusted-ip/trusted-ip-dal";
import { trustedIpServiceFactory } from "@app/ee/services/trusted-ip/trusted-ip-service";
import { TKeyStoreFactory } from "@app/keystore/keystore";
@@ -137,6 +140,14 @@ import { externalGroupOrgRoleMappingDALFactory } from "@app/services/external-gr
import { externalGroupOrgRoleMappingServiceFactory } from "@app/services/external-group-org-role-mapping/external-group-org-role-mapping-service";
import { externalMigrationQueueFactory } from "@app/services/external-migration/external-migration-queue";
import { externalMigrationServiceFactory } from "@app/services/external-migration/external-migration-service";
import { folderCheckpointDALFactory } from "@app/services/folder-checkpoint/folder-checkpoint-dal";
import { folderCheckpointResourcesDALFactory } from "@app/services/folder-checkpoint-resources/folder-checkpoint-resources-dal";
import { folderCommitDALFactory } from "@app/services/folder-commit/folder-commit-dal";
import { folderCommitQueueServiceFactory } from "@app/services/folder-commit/folder-commit-queue";
import { folderCommitServiceFactory } from "@app/services/folder-commit/folder-commit-service";
import { folderCommitChangesDALFactory } from "@app/services/folder-commit-changes/folder-commit-changes-dal";
import { folderTreeCheckpointDALFactory } from "@app/services/folder-tree-checkpoint/folder-tree-checkpoint-dal";
import { folderTreeCheckpointResourcesDALFactory } from "@app/services/folder-tree-checkpoint-resources/folder-tree-checkpoint-resources-dal";
import { groupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { groupProjectMembershipRoleDALFactory } from "@app/services/group-project/group-project-membership-role-dal";
import { groupProjectServiceFactory } from "@app/services/group-project/group-project-service";
@@ -174,6 +185,9 @@ import { internalKmsDALFactory } from "@app/services/kms/internal-kms-dal";
import { kmskeyDALFactory } from "@app/services/kms/kms-key-dal";
import { kmsRootConfigDALFactory } from "@app/services/kms/kms-root-config-dal";
import { kmsServiceFactory } from "@app/services/kms/kms-service";
import { microsoftTeamsIntegrationDALFactory } from "@app/services/microsoft-teams/microsoft-teams-integration-dal";
import { microsoftTeamsServiceFactory } from "@app/services/microsoft-teams/microsoft-teams-service";
import { projectMicrosoftTeamsConfigDALFactory } from "@app/services/microsoft-teams/project-microsoft-teams-config-dal";
import { incidentContactDALFactory } from "@app/services/org/incident-contacts-dal";
import { orgBotDALFactory } from "@app/services/org/org-bot-dal";
import { orgDALFactory } from "@app/services/org/org-dal";
@@ -399,6 +413,8 @@ export const registerRoutes = async (
const sshHostDAL = sshHostDALFactory(db);
const sshHostLoginUserDAL = sshHostLoginUserDALFactory(db);
const sshHostLoginUserMappingDAL = sshHostLoginUserMappingDALFactory(db);
const sshHostGroupDAL = sshHostGroupDALFactory(db);
const sshHostGroupMembershipDAL = sshHostGroupMembershipDALFactory(db);
const kmsDAL = kmskeyDALFactory(db);
const internalKmsDAL = internalKmsDALFactory(db);
@@ -426,6 +442,8 @@ export const registerRoutes = async (
const githubOrgSyncDAL = githubOrgSyncDALFactory(db);
const secretRotationV2DAL = secretRotationV2DALFactory(db, folderDAL);
const microsoftTeamsIntegrationDAL = microsoftTeamsIntegrationDALFactory(db);
const projectMicrosoftTeamsConfigDAL = projectMicrosoftTeamsConfigDALFactory(db);
const permissionService = permissionServiceFactory({
permissionDAL,
@@ -541,6 +559,37 @@ export const registerRoutes = async (
projectRoleDAL,
permissionService
});
const folderCommitChangesDAL = folderCommitChangesDALFactory(db);
const folderCheckpointDAL = folderCheckpointDALFactory(db);
const folderCheckpointResourcesDAL = folderCheckpointResourcesDALFactory(db);
const folderTreeCheckpointDAL = folderTreeCheckpointDALFactory(db);
const folderCommitDAL = folderCommitDALFactory(db);
const folderTreeCheckpointResourcesDAL = folderTreeCheckpointResourcesDALFactory(db);
const folderCommitQueueService = folderCommitQueueServiceFactory({
queueService,
folderTreeCheckpointDAL,
folderTreeCheckpointResourcesDAL,
folderCommitDAL,
folderDAL
});
const folderCommitService = folderCommitServiceFactory({
folderCommitDAL,
folderCommitChangesDAL,
folderCheckpointDAL,
folderTreeCheckpointDAL,
userDAL,
identityDAL,
folderDAL,
folderVersionDAL,
secretVersionV2BridgeDAL,
projectDAL,
folderCheckpointResourcesDAL,
secretV2BridgeDAL,
folderTreeCheckpointResourcesDAL,
folderCommitQueueService,
permissionService
});
const scimService = scimServiceFactory({
licenseService,
scimDAL,
@@ -623,6 +672,7 @@ export const registerRoutes = async (
tokenService,
orgDAL,
totpService,
orgMembershipDAL,
auditLogService
});
const passwordService = authPaswordServiceFactory({
@@ -687,6 +737,15 @@ export const registerRoutes = async (
orgDAL,
externalGroupOrgRoleMappingDAL
});
const microsoftTeamsService = microsoftTeamsServiceFactory({
microsoftTeamsIntegrationDAL,
permissionService,
workflowIntegrationDAL,
kmsService,
serverCfgDAL: superAdminDAL
});
const superAdminService = superAdminServiceFactory({
userDAL,
identityDAL,
@@ -700,7 +759,8 @@ export const registerRoutes = async (
orgService,
keyStore,
licenseService,
kmsService
kmsService,
microsoftTeamsService
});
const orgAdminService = orgAdminServiceFactory({
@@ -849,6 +909,18 @@ export const registerRoutes = async (
kmsService
});
const sshHostGroupService = sshHostGroupServiceFactory({
projectDAL,
sshHostDAL,
sshHostGroupDAL,
sshHostGroupMembershipDAL,
sshHostLoginUserDAL,
sshHostLoginUserMappingDAL,
userDAL,
permissionService,
licenseService
});
const certificateAuthorityService = certificateAuthorityServiceFactory({
certificateAuthorityDAL,
certificateAuthorityCertDAL,
@@ -940,6 +1012,7 @@ export const registerRoutes = async (
projectMembershipDAL,
projectBotDAL,
secretDAL,
folderCommitService,
secretBlindIndexDAL,
secretVersionDAL,
secretTagDAL,
@@ -986,6 +1059,7 @@ export const registerRoutes = async (
secretReminderRecipientsDAL,
orgService,
resourceMetadataDAL,
folderCommitService,
secretSyncQueue
});
@@ -1018,6 +1092,7 @@ export const registerRoutes = async (
sshCertificateDAL,
sshCertificateTemplateDAL,
sshHostDAL,
sshHostGroupDAL,
projectUserMembershipRoleDAL,
identityProjectMembershipRoleDAL,
keyStore,
@@ -1026,6 +1101,8 @@ export const registerRoutes = async (
certificateTemplateDAL,
projectSlackConfigDAL,
slackIntegrationDAL,
projectMicrosoftTeamsConfigDAL,
microsoftTeamsIntegrationDAL,
projectTemplateService,
groupProjectDAL,
smtpService
@@ -1084,7 +1161,8 @@ export const registerRoutes = async (
folderVersionDAL,
projectEnvDAL,
snapshotService,
projectDAL
projectDAL,
folderCommitService
});
const secretImportService = secretImportServiceFactory({
@@ -1109,6 +1187,7 @@ export const registerRoutes = async (
const secretV2BridgeService = secretV2BridgeServiceFactory({
folderDAL,
secretVersionDAL: secretVersionV2BridgeDAL,
folderCommitService,
secretQueueService,
secretDAL: secretV2BridgeDAL,
permissionService,
@@ -1150,7 +1229,10 @@ export const registerRoutes = async (
userDAL,
licenseService,
projectSlackConfigDAL,
resourceMetadataDAL
resourceMetadataDAL,
projectMicrosoftTeamsConfigDAL,
microsoftTeamsService,
folderCommitService
});
const secretService = secretServiceFactory({
@@ -1212,7 +1294,9 @@ export const registerRoutes = async (
accessApprovalPolicyApproverDAL,
projectSlackConfigDAL,
kmsService,
groupDAL
groupDAL,
microsoftTeamsService,
projectMicrosoftTeamsConfigDAL
});
const secretReplicationService = secretReplicationServiceFactory({
@@ -1233,7 +1317,8 @@ export const registerRoutes = async (
secretV2BridgeDAL,
secretVersionV2TagBridgeDAL: secretVersionTagV2BridgeDAL,
secretVersionV2BridgeDAL,
resourceMetadataDAL
resourceMetadataDAL,
folderCommitService
});
const secretRotationQueue = secretRotationQueueFactory({
@@ -1245,6 +1330,7 @@ export const registerRoutes = async (
projectBotService,
secretVersionV2BridgeDAL,
secretV2BridgeDAL,
folderCommitService,
kmsService
});
@@ -1517,7 +1603,9 @@ export const registerRoutes = async (
secretDAL: secretV2BridgeDAL,
queueService,
secretV2BridgeService,
resourceMetadataDAL
resourceMetadataDAL,
folderCommitService,
folderVersionDAL
});
const migrationService = externalMigrationServiceFactory({
@@ -1579,6 +1667,7 @@ export const registerRoutes = async (
auditLogService,
secretV2BridgeDAL,
secretTagDAL,
folderCommitService,
secretVersionTagV2BridgeDAL,
secretVersionV2BridgeDAL,
keyStore,
@@ -1610,6 +1699,7 @@ export const registerRoutes = async (
await dailyResourceCleanUp.startCleanUp();
await dailyExpiringPkiItemAlert.startSendingAlerts();
await kmsService.startService();
await microsoftTeamsService.start();
// inject all services
server.decorate<FastifyZodProvider["services"]>("services", {
@@ -1669,6 +1759,7 @@ export const registerRoutes = async (
sshCertificateAuthority: sshCertificateAuthorityService,
sshCertificateTemplate: sshCertificateTemplateService,
sshHost: sshHostService,
sshHostGroup: sshHostGroupService,
certificateAuthority: certificateAuthorityService,
certificateTemplate: certificateTemplateService,
certificateAuthorityCrl: certificateAuthorityCrlService,
@@ -1702,8 +1793,10 @@ export const registerRoutes = async (
kmipOperation: kmipOperationService,
gateway: gatewayService,
secretRotationV2: secretRotationV2Service,
microsoftTeams: microsoftTeamsService,
assumePrivileges: assumePrivilegeService,
githubOrgSync: githubOrgSyncConfigService
githubOrgSync: githubOrgSyncConfigService,
folderCommit: folderCommitService
});
const cronJobs: CronJob[] = [];

View File

@@ -27,7 +27,10 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
createdAt: true,
updatedAt: true,
encryptedSlackClientId: true,
encryptedSlackClientSecret: true
encryptedSlackClientSecret: true,
encryptedMicrosoftTeamsAppId: true,
encryptedMicrosoftTeamsClientSecret: true,
encryptedMicrosoftTeamsBotId: true
}).extend({
isMigrationModeOn: z.boolean(),
defaultAuthOrgSlug: z.string().nullable(),
@@ -74,6 +77,9 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
}),
slackClientId: z.string().optional(),
slackClientSecret: z.string().optional(),
microsoftTeamsAppId: z.string().optional(),
microsoftTeamsClientSecret: z.string().optional(),
microsoftTeamsBotId: z.string().optional(),
authConsentContent: z
.string()
.trim()
@@ -197,15 +203,22 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
server.route({
method: "GET",
url: "/integrations/slack/config",
url: "/integrations",
config: {
rateLimit: readLimit
},
schema: {
response: {
200: z.object({
clientId: z.string(),
clientSecret: z.string()
slack: z.object({
clientId: z.string(),
clientSecret: z.string()
}),
microsoftTeams: z.object({
appId: z.string(),
clientSecret: z.string(),
botId: z.string()
})
})
}
},
@@ -215,9 +228,9 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
});
},
handler: async () => {
const adminSlackConfig = await server.services.superAdmin.getAdminSlackConfig();
const adminIntegrationsConfig = await server.services.superAdmin.getAdminIntegrationsConfig();
return adminSlackConfig;
return adminIntegrationsConfig;
}
});

View File

@@ -28,6 +28,10 @@ import {
} from "@app/services/app-connection/databricks";
import { GcpConnectionListItemSchema, SanitizedGcpConnectionSchema } from "@app/services/app-connection/gcp";
import { GitHubConnectionListItemSchema, SanitizedGitHubConnectionSchema } from "@app/services/app-connection/github";
import {
HCVaultConnectionListItemSchema,
SanitizedHCVaultConnectionSchema
} from "@app/services/app-connection/hc-vault";
import {
HumanitecConnectionListItemSchema,
SanitizedHumanitecConnectionSchema
@@ -68,6 +72,7 @@ const SanitizedAppConnectionSchema = z.union([
...SanitizedMsSqlConnectionSchema.options,
...SanitizedCamundaConnectionSchema.options,
...SanitizedAuth0ConnectionSchema.options,
...SanitizedHCVaultConnectionSchema.options,
...SanitizedAzureClientSecretsConnectionSchema.options,
...SanitizedWindmillConnectionSchema.options,
...SanitizedLdapConnectionSchema.options,
@@ -88,6 +93,7 @@ const AppConnectionOptionsSchema = z.discriminatedUnion("app", [
MsSqlConnectionListItemSchema,
CamundaConnectionListItemSchema,
Auth0ConnectionListItemSchema,
HCVaultConnectionListItemSchema,
AzureClientSecretsConnectionListItemSchema,
WindmillConnectionListItemSchema,
LdapConnectionListItemSchema,

View File

@@ -0,0 +1,47 @@
import z from "zod";
import { readLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
CreateHCVaultConnectionSchema,
SanitizedHCVaultConnectionSchema,
UpdateHCVaultConnectionSchema
} from "@app/services/app-connection/hc-vault";
import { AuthMode } from "@app/services/auth/auth-type";
import { registerAppConnectionEndpoints } from "./app-connection-endpoints";
export const registerHCVaultConnectionRouter = async (server: FastifyZodProvider) => {
registerAppConnectionEndpoints({
app: AppConnection.HCVault,
server,
sanitizedResponseSchema: SanitizedHCVaultConnectionSchema,
createSchema: CreateHCVaultConnectionSchema,
updateSchema: UpdateHCVaultConnectionSchema
});
// The following endpoints are for internal Infisical App use only and not part of the public API
server.route({
method: "GET",
url: `/:connectionId/mounts`,
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
connectionId: z.string().uuid()
}),
response: {
200: z.string().array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const { connectionId } = req.params;
const mounts = await server.services.appConnection.hcvault.listMounts(connectionId, req.permission);
return mounts;
}
});
};
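For illustration only, a client-side sketch of calling the internal mounts endpoint registered above. The host and the /api/v1/app-connections/hashicorp-vault path prefix are assumptions (the prefix is applied where the app-connection routers are mounted), so treat this as a sketch rather than documented API usage.

// Hypothetical helper; assumes the router above is mounted under
// /api/v1/app-connections/hashicorp-vault and that a valid JWT is available.
const listVaultMounts = async (connectionId: string, jwt: string): Promise<string[]> => {
  const res = await fetch(
    `https://infisical.example.com/api/v1/app-connections/hashicorp-vault/${connectionId}/mounts`,
    { headers: { Authorization: `Bearer ${jwt}` } }
  );
  if (!res.ok) throw new Error(`Failed to list mounts: HTTP ${res.status}`);
  // The response schema above is z.string().array(), i.e. a plain array of mount paths
  return (await res.json()) as string[];
};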

View File

@@ -9,6 +9,7 @@ import { registerCamundaConnectionRouter } from "./camunda-connection-router";
import { registerDatabricksConnectionRouter } from "./databricks-connection-router";
import { registerGcpConnectionRouter } from "./gcp-connection-router";
import { registerGitHubConnectionRouter } from "./github-connection-router";
import { registerHCVaultConnectionRouter } from "./hc-vault-connection-router";
import { registerHumanitecConnectionRouter } from "./humanitec-connection-router";
import { registerLdapConnectionRouter } from "./ldap-connection-router";
import { registerMsSqlConnectionRouter } from "./mssql-connection-router";
@@ -37,6 +38,7 @@ export const APP_CONNECTION_REGISTER_ROUTER_MAP: Record<AppConnection, (server:
[AppConnection.Camunda]: registerCamundaConnectionRouter,
[AppConnection.Windmill]: registerWindmillConnectionRouter,
[AppConnection.Auth0]: registerAuth0ConnectionRouter,
[AppConnection.HCVault]: registerHCVaultConnectionRouter,
[AppConnection.LDAP]: registerLdapConnectionRouter,
[AppConnection.TeamCity]: registerTeamCityConnectionRouter
};

View File

@@ -26,6 +26,7 @@ import { registerIdentityUaRouter } from "./identity-universal-auth-router";
import { registerIntegrationAuthRouter } from "./integration-auth-router";
import { registerIntegrationRouter } from "./integration-router";
import { registerInviteOrgRouter } from "./invite-org-router";
import { registerMicrosoftTeamsRouter } from "./microsoft-teams-router";
import { registerOrgAdminRouter } from "./org-admin-router";
import { registerOrgRouter } from "./organization-router";
import { registerPasswordRouter } from "./password-router";
@@ -79,6 +80,7 @@ export const registerV1Routes = async (server: FastifyZodProvider) => {
async (workflowIntegrationRouter) => {
await workflowIntegrationRouter.register(registerWorkflowIntegrationRouter);
await workflowIntegrationRouter.register(registerSlackRouter, { prefix: "/slack" });
await workflowIntegrationRouter.register(registerMicrosoftTeamsRouter, { prefix: "/microsoft-teams" });
},
{ prefix: "/workflow-integrations" }
);

View File

@@ -0,0 +1,381 @@
import { z } from "zod";
import { MicrosoftTeamsIntegrationsSchema, WorkflowIntegrationsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { slugSchema } from "@app/server/lib/schemas";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { WorkflowIntegrationStatus } from "@app/services/workflow-integration/workflow-integration-types";
const sanitizedMicrosoftTeamsIntegrationSchema = WorkflowIntegrationsSchema.pick({
id: true,
description: true,
slug: true,
integration: true
}).merge(
MicrosoftTeamsIntegrationsSchema.pick({
tenantId: true
}).extend({
status: z.nativeEnum(WorkflowIntegrationStatus)
})
);
export const registerMicrosoftTeamsRouter = async (server: FastifyZodProvider) => {
server.route({
method: "GET",
url: "/client-id",
config: {
rateLimit: readLimit
},
schema: {
response: {
200: z.object({
clientId: z.string()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const clientId = await server.services.microsoftTeams.getClientId({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
return {
clientId
};
}
});
server.route({
method: "POST",
url: "/",
config: {
rateLimit: readLimit
},
schema: {
body: z.object({
redirectUri: z.string(),
tenantId: z.string().uuid(),
slug: z.string(),
description: z.string().optional(),
code: z.string().trim()
})
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
await server.services.microsoftTeams.completeMicrosoftTeamsIntegration({
tenantId: req.body.tenantId,
slug: req.body.slug,
description: req.body.description,
redirectUri: req.body.redirectUri,
code: req.body.code,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_CREATE,
metadata: {
tenantId: req.body.tenantId,
slug: req.body.slug,
description: req.body.description
}
}
});
}
});
server.route({
method: "GET",
url: "/",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
response: {
200: sanitizedMicrosoftTeamsIntegrationSchema.array()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const microsoftTeamsIntegrations = await server.services.microsoftTeams.getMicrosoftTeamsIntegrationsByOrg({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_LIST,
metadata: {}
}
});
return microsoftTeamsIntegrations;
}
});
server.route({
method: "POST",
url: "/:id/installation-status",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
id: z.string()
})
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const microsoftTeamsIntegration = await server.services.microsoftTeams.checkInstallationStatus({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
workflowIntegrationId: req.params.id
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_CHECK_INSTALLATION_STATUS,
metadata: {
tenantId: microsoftTeamsIntegration.tenantId,
slug: microsoftTeamsIntegration.slug
}
}
});
}
});
server.route({
method: "DELETE",
url: "/:id",
config: {
rateLimit: writeLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
response: {
200: sanitizedMicrosoftTeamsIntegrationSchema
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const deletedMicrosoftTeamsIntegration = await server.services.microsoftTeams.deleteMicrosoftTeamsIntegration({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_DELETE,
metadata: {
tenantId: deletedMicrosoftTeamsIntegration.tenantId,
slug: deletedMicrosoftTeamsIntegration.slug,
id: deletedMicrosoftTeamsIntegration.id
}
}
});
return deletedMicrosoftTeamsIntegration;
}
});
server.route({
method: "GET",
url: "/:id",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
response: {
200: sanitizedMicrosoftTeamsIntegrationSchema
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const microsoftTeamsIntegration = await server.services.microsoftTeams.getMicrosoftTeamsIntegrationById({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_GET,
metadata: {
slug: microsoftTeamsIntegration.slug,
id: microsoftTeamsIntegration.id,
tenantId: microsoftTeamsIntegration.tenantId
}
}
});
return microsoftTeamsIntegration;
}
});
server.route({
method: "PATCH",
url: "/:id",
config: {
rateLimit: writeLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
body: z.object({
slug: slugSchema({ max: 64 }).optional(),
description: z.string().optional()
}),
response: {
200: sanitizedMicrosoftTeamsIntegrationSchema
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const microsoftTeamsIntegration = await server.services.microsoftTeams.updateMicrosoftTeamsIntegration({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id,
...req.body
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_UPDATE,
metadata: {
slug: microsoftTeamsIntegration.slug,
id: microsoftTeamsIntegration.id,
tenantId: microsoftTeamsIntegration.tenantId,
newSlug: req.body.slug,
newDescription: req.body.description
}
}
});
return microsoftTeamsIntegration;
}
});
server.route({
method: "GET",
url: "/:workflowIntegrationId/teams",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workflowIntegrationId: z.string()
}),
response: {
200: z
.object({
teamId: z.string(),
teamName: z.string(),
channels: z
.object({
channelName: z.string(),
channelId: z.string()
})
.array()
})
.array()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const microsoftTeamsIntegration = await server.services.microsoftTeams.getTeams({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
workflowIntegrationId: req.params.workflowIntegrationId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.MICROSOFT_TEAMS_WORKFLOW_INTEGRATION_GET_TEAMS,
metadata: {
tenantId: microsoftTeamsIntegration.tenantId,
slug: microsoftTeamsIntegration.slug,
id: microsoftTeamsIntegration.id
}
}
});
return microsoftTeamsIntegration.teams;
}
});
server.route({
method: "POST",
url: "/message-endpoint",
schema: {
body: z.any(),
response: {
200: z.any()
}
},
handler: async (req, res) => {
await server.services.microsoftTeams.handleMessageEndpoint(req, res);
}
});
};
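For illustration, a small sketch of consuming the /:workflowIntegrationId/teams response defined above; the TeamWithChannels type simply mirrors the zod response schema and is not exported by the router.

// Mirrors the 200 response schema of GET /:workflowIntegrationId/teams above.
type TeamWithChannels = {
  teamId: string;
  teamName: string;
  channels: { channelName: string; channelId: string }[];
};

// Example consumer: index channel IDs by team name, e.g. for a channel picker UI.
const channelIdsByTeam = (teams: TeamWithChannels[]): Record<string, string[]> =>
  Object.fromEntries(teams.map((team) => [team.teamName, team.channels.map((c) => c.channelId)]));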

View File

@@ -14,6 +14,7 @@ import {
UserEncryptionKeysSchema,
UsersSchema
} from "@app/db/schemas";
import { ProjectMicrosoftTeamsConfigsSchema } from "@app/db/schemas/project-microsoft-teams-configs";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { ApiDocsTags, PROJECTS } from "@app/lib/api-docs";
import { CharacterType, characterValidator } from "@app/lib/validator/validate-string";
@@ -21,8 +22,10 @@ import { re2Validator } from "@app/lib/zod";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { ActorType, AuthMode } from "@app/services/auth/auth-type";
import { validateMicrosoftTeamsChannelsSchema } from "@app/services/microsoft-teams/microsoft-teams-fns";
import { ProjectFilterType, SearchProjectSortBy } from "@app/services/project/project-types";
import { validateSlackChannelsField } from "@app/services/slack/slack-auth-validators";
import { WorkflowIntegration } from "@app/services/workflow-integration/workflow-integration-types";
import { integrationAuthPubSchema, SanitizedProjectSchema } from "../sanitizedSchemas";
import { sanitizedServiceTokenSchema } from "../v2/service-token-router";
@@ -740,55 +743,112 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
server.route({
method: "GET",
url: "/:workspaceId/slack-config",
url: "/:workspaceId/workflow-integration-config/:integration",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim()
workspaceId: z.string().trim(),
integration: z.nativeEnum(WorkflowIntegration)
}),
response: {
200: ProjectSlackConfigsSchema.pick({
id: true,
slackIntegrationId: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
200: z.discriminatedUnion("integration", [
ProjectSlackConfigsSchema.pick({
id: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
}).merge(
z.object({
integration: z.literal(WorkflowIntegration.SLACK),
integrationId: z.string()
})
),
ProjectMicrosoftTeamsConfigsSchema.pick({
id: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
}).merge(
z.object({
integration: z.literal(WorkflowIntegration.MICROSOFT_TEAMS),
integrationId: z.string()
})
)
])
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const config = await server.services.project.getProjectWorkflowIntegrationConfig({
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
projectId: req.params.workspaceId,
integration: req.params.integration
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.GET_PROJECT_WORKFLOW_INTEGRATION_CONFIG,
metadata: {
id: config.id,
integration: config.integration
}
}
});
return config;
}
});
server.route({
method: "DELETE",
url: "/:projectId/workflow-integration/:integration/:integrationId",
config: {
rateLimit: writeLimit
},
schema: {
params: z.object({
projectId: z.string().trim(),
integration: z.nativeEnum(WorkflowIntegration),
integrationId: z.string()
}),
response: {
200: z.object({
integrationConfig: z.object({
id: z.string()
})
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackConfig = await server.services.project.getProjectSlackConfig({
const deletedIntegration = await server.services.project.deleteProjectWorkflowIntegration({
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
projectId: req.params.workspaceId
projectId: req.params.projectId,
integration: req.params.integration,
integrationId: req.params.integrationId
});
if (slackConfig) {
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.GET_PROJECT_SLACK_CONFIG,
metadata: {
id: slackConfig.id
}
}
});
}
return slackConfig;
return {
integrationConfig: deletedIntegration
};
}
});
server.route({
method: "PUT",
url: "/:workspaceId/slack-config",
url: "/:workspaceId/workflow-integration",
config: {
rateLimit: readLimit
},
@@ -796,27 +856,57 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
params: z.object({
workspaceId: z.string().trim()
}),
body: z.object({
slackIntegrationId: z.string(),
isAccessRequestNotificationEnabled: z.boolean(),
accessRequestChannels: validateSlackChannelsField,
isSecretRequestNotificationEnabled: z.boolean(),
secretRequestChannels: validateSlackChannelsField
}),
response: {
200: ProjectSlackConfigsSchema.pick({
id: true,
slackIntegrationId: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
body: z.discriminatedUnion("integration", [
z.object({
integration: z.literal(WorkflowIntegration.SLACK),
integrationId: z.string(),
accessRequestChannels: validateSlackChannelsField,
secretRequestChannels: validateSlackChannelsField,
isAccessRequestNotificationEnabled: z.boolean(),
isSecretRequestNotificationEnabled: z.boolean()
}),
z.object({
integration: z.literal(WorkflowIntegration.MICROSOFT_TEAMS),
integrationId: z.string(),
accessRequestChannels: validateMicrosoftTeamsChannelsSchema,
secretRequestChannels: validateMicrosoftTeamsChannelsSchema,
isAccessRequestNotificationEnabled: z.boolean(),
isSecretRequestNotificationEnabled: z.boolean()
})
]),
response: {
200: z.discriminatedUnion("integration", [
ProjectSlackConfigsSchema.pick({
id: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
}).merge(
z.object({
integration: z.literal(WorkflowIntegration.SLACK),
integrationId: z.string()
})
),
ProjectMicrosoftTeamsConfigsSchema.pick({
id: true,
isAccessRequestNotificationEnabled: true,
isSecretRequestNotificationEnabled: true
}).merge(
z.object({
integration: z.literal(WorkflowIntegration.MICROSOFT_TEAMS),
integrationId: z.string(),
accessRequestChannels: validateMicrosoftTeamsChannelsSchema,
secretRequestChannels: validateMicrosoftTeamsChannelsSchema
})
)
])
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackConfig = await server.services.project.updateProjectSlackConfig({
const workflowIntegrationConfig = await server.services.project.updateProjectWorkflowIntegration({
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
@@ -829,19 +919,20 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.UPDATE_PROJECT_SLACK_CONFIG,
type: EventType.UPDATE_PROJECT_WORKFLOW_INTEGRATION_CONFIG,
metadata: {
id: slackConfig.id,
slackIntegrationId: slackConfig.slackIntegrationId,
isAccessRequestNotificationEnabled: slackConfig.isAccessRequestNotificationEnabled,
accessRequestChannels: slackConfig.accessRequestChannels,
isSecretRequestNotificationEnabled: slackConfig.isSecretRequestNotificationEnabled,
secretRequestChannels: slackConfig.secretRequestChannels
id: workflowIntegrationConfig.id,
integrationId: workflowIntegrationConfig.integrationId,
integration: workflowIntegrationConfig.integration,
isAccessRequestNotificationEnabled: workflowIntegrationConfig.isAccessRequestNotificationEnabled,
accessRequestChannels: workflowIntegrationConfig.accessRequestChannels,
isSecretRequestNotificationEnabled: workflowIntegrationConfig.isSecretRequestNotificationEnabled,
secretRequestChannels: workflowIntegrationConfig.secretRequestChannels
}
}
});
return slackConfig;
return workflowIntegrationConfig;
}
});

View File

@@ -0,0 +1,17 @@
import {
CreateHCVaultSyncSchema,
HCVaultSyncSchema,
UpdateHCVaultSyncSchema
} from "@app/services/secret-sync/hc-vault";
import { SecretSync } from "@app/services/secret-sync/secret-sync-enums";
import { registerSyncSecretsEndpoints } from "./secret-sync-endpoints";
export const registerHCVaultSyncRouter = async (server: FastifyZodProvider) =>
registerSyncSecretsEndpoints({
destination: SecretSync.HCVault,
server,
responseSchema: HCVaultSyncSchema,
createSchema: CreateHCVaultSyncSchema,
updateSchema: UpdateHCVaultSyncSchema
});

View File

@@ -8,6 +8,7 @@ import { registerCamundaSyncRouter } from "./camunda-sync-router";
import { registerDatabricksSyncRouter } from "./databricks-sync-router";
import { registerGcpSyncRouter } from "./gcp-sync-router";
import { registerGitHubSyncRouter } from "./github-sync-router";
import { registerHCVaultSyncRouter } from "./hc-vault-sync-router";
import { registerHumanitecSyncRouter } from "./humanitec-sync-router";
import { registerTeamCitySyncRouter } from "./teamcity-sync-router";
import { registerTerraformCloudSyncRouter } from "./terraform-cloud-sync-router";
@@ -29,5 +30,6 @@ export const SECRET_SYNC_REGISTER_ROUTER_MAP: Record<SecretSync, (server: Fastif
[SecretSync.Camunda]: registerCamundaSyncRouter,
[SecretSync.Vercel]: registerVercelSyncRouter,
[SecretSync.Windmill]: registerWindmillSyncRouter,
[SecretSync.HCVault]: registerHCVaultSyncRouter,
[SecretSync.TeamCity]: registerTeamCitySyncRouter
};

View File

@@ -22,6 +22,7 @@ import { CamundaSyncListItemSchema, CamundaSyncSchema } from "@app/services/secr
import { DatabricksSyncListItemSchema, DatabricksSyncSchema } from "@app/services/secret-sync/databricks";
import { GcpSyncListItemSchema, GcpSyncSchema } from "@app/services/secret-sync/gcp";
import { GitHubSyncListItemSchema, GitHubSyncSchema } from "@app/services/secret-sync/github";
import { HCVaultSyncListItemSchema, HCVaultSyncSchema } from "@app/services/secret-sync/hc-vault";
import { HumanitecSyncListItemSchema, HumanitecSyncSchema } from "@app/services/secret-sync/humanitec";
import { TeamCitySyncListItemSchema, TeamCitySyncSchema } from "@app/services/secret-sync/teamcity";
import { TerraformCloudSyncListItemSchema, TerraformCloudSyncSchema } from "@app/services/secret-sync/terraform-cloud";
@@ -41,6 +42,7 @@ const SecretSyncSchema = z.discriminatedUnion("destination", [
CamundaSyncSchema,
VercelSyncSchema,
WindmillSyncSchema,
HCVaultSyncSchema,
TeamCitySyncSchema
]);
@@ -57,6 +59,7 @@ const SecretSyncOptionsSchema = z.discriminatedUnion("destination", [
CamundaSyncListItemSchema,
VercelSyncListItemSchema,
WindmillSyncListItemSchema,
HCVaultSyncListItemSchema,
TeamCitySyncListItemSchema
]);

View File

@@ -23,6 +23,7 @@ import { fetchGithubEmails, fetchGithubUser } from "@app/lib/requests/github";
import { authRateLimit } from "@app/server/config/rateLimiter";
import { AuthMethod } from "@app/services/auth/auth-type";
import { OrgAuthMethod } from "@app/services/org/org-types";
import { getServerCfg } from "@app/services/super-admin/super-admin-service";
export const registerSsoRouter = async (server: FastifyZodProvider) => {
const appCfg = getConfig();
@@ -342,8 +343,12 @@ export const registerSsoRouter = async (server: FastifyZodProvider) => {
}`
);
}
const serverCfg = await getServerCfg();
return res.redirect(
`${appCfg.SITE_URL}/signup/sso?token=${encodeURIComponent(req.passportUser.providerAuthToken)}`
`${appCfg.SITE_URL}/signup/sso?token=${encodeURIComponent(req.passportUser.providerAuthToken)}${
serverCfg.defaultAuthOrgId && !appCfg.isCloud ? `&defaultOrgAllowed=true` : ""
}`
);
}
});

View File

@@ -7,7 +7,8 @@ const sanitizedWorkflowIntegrationSchema = WorkflowIntegrationsSchema.pick({
id: true,
description: true,
slug: true,
integration: true
integration: true,
status: true
});
export const registerWorkflowIntegrationRouter = async (server: FastifyZodProvider) => {

View File

@@ -14,6 +14,8 @@ import { sanitizedSshCa } from "@app/ee/services/ssh/ssh-certificate-authority-s
import { sanitizedSshCertificate } from "@app/ee/services/ssh-certificate/ssh-certificate-schema";
import { sanitizedSshCertificateTemplate } from "@app/ee/services/ssh-certificate-template/ssh-certificate-template-schema";
import { loginMappingSchema, sanitizedSshHost } from "@app/ee/services/ssh-host/ssh-host-schema";
import { LoginMappingSource } from "@app/ee/services/ssh-host/ssh-host-types";
import { sanitizedSshHostGroup } from "@app/ee/services/ssh-host-group/ssh-host-group-schema";
import { ApiDocsTags, PROJECTS } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { slugSchema } from "@app/server/lib/schemas";
@@ -631,7 +633,11 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
200: z.object({
hosts: z.array(
sanitizedSshHost.extend({
loginMappings: z.array(loginMappingSchema)
loginMappings: loginMappingSchema
.extend({
source: z.nativeEnum(LoginMappingSource)
})
.array()
})
)
})
@@ -650,4 +656,39 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
return { hosts };
}
});
server.route({
method: "GET",
url: "/:projectId/ssh-host-groups",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
projectId: z.string().trim().describe(PROJECTS.LIST_SSH_HOST_GROUPS.projectId)
}),
response: {
200: z.object({
groups: z.array(
sanitizedSshHostGroup.extend({
loginMappings: loginMappingSchema.array(),
hostCount: z.number()
})
)
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const groups = await server.services.project.listProjectSshHostGroups({
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
projectId: req.params.projectId
});
return { groups };
}
});
};

View File

@@ -88,24 +88,41 @@ export const registerSignupRouter = async (server: FastifyZodProvider) => {
rateLimit: authRateLimit
},
schema: {
body: z.object({
email: z.string().trim(),
firstName: z.string().trim(),
lastName: z.string().trim().optional(),
protectedKey: z.string().trim(),
protectedKeyIV: z.string().trim(),
protectedKeyTag: z.string().trim(),
publicKey: z.string().trim(),
encryptedPrivateKey: z.string().trim(),
encryptedPrivateKeyIV: z.string().trim(),
encryptedPrivateKeyTag: z.string().trim(),
salt: z.string().trim(),
verifier: z.string().trim(),
organizationName: GenericResourceNameSchema,
providerAuthToken: z.string().trim().optional().nullish(),
attributionSource: z.string().trim().optional(),
password: z.string()
}),
body: z
.object({
email: z.string().trim(),
firstName: z.string().trim(),
lastName: z.string().trim().optional(),
protectedKey: z.string().trim(),
protectedKeyIV: z.string().trim(),
protectedKeyTag: z.string().trim(),
publicKey: z.string().trim(),
encryptedPrivateKey: z.string().trim(),
encryptedPrivateKeyIV: z.string().trim(),
encryptedPrivateKeyTag: z.string().trim(),
salt: z.string().trim(),
verifier: z.string().trim(),
providerAuthToken: z.string().trim().optional().nullish(),
attributionSource: z.string().trim().optional(),
password: z.string()
})
.and(
z.preprocess(
(data) => {
if (typeof data === "object" && data && "useDefaultOrg" in data === false) {
return { ...data, useDefaultOrg: false };
}
return data;
},
z.discriminatedUnion("useDefaultOrg", [
z.object({ useDefaultOrg: z.literal(true) }),
z.object({
useDefaultOrg: z.literal(false),
organizationName: GenericResourceNameSchema
})
])
)
),
response: {
200: z.object({
message: z.string(),
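A minimal sketch of how the useDefaultOrg handling added to the signup body above behaves; the schema names here are illustrative and the organizationName validator is simplified from GenericResourceNameSchema.

import { z } from "zod";

const orgChoice = z.preprocess(
  (data) => {
    // Requests that omit useDefaultOrg are normalized to useDefaultOrg: false,
    // which routes them to the union branch that requires organizationName.
    if (typeof data === "object" && data && "useDefaultOrg" in data === false) {
      return { ...data, useDefaultOrg: false };
    }
    return data;
  },
  z.discriminatedUnion("useDefaultOrg", [
    z.object({ useDefaultOrg: z.literal(true) }),
    z.object({ useDefaultOrg: z.literal(false), organizationName: z.string().min(1) })
  ])
);

orgChoice.parse({ useDefaultOrg: true }); // ok: the server-configured default org is used
orgChoice.parse({ organizationName: "acme" }); // ok: treated as useDefaultOrg: false
// orgChoice.parse({}) throws: organizationName is required when useDefaultOrg is false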

View File

@@ -14,6 +14,7 @@ export enum AppConnection {
Camunda = "camunda",
Windmill = "windmill",
Auth0 = "auth0",
HCVault = "hashicorp-vault",
LDAP = "ldap",
TeamCity = "teamcity"
}

View File

@@ -41,6 +41,11 @@ import {
} from "./databricks";
import { GcpConnectionMethod, getGcpConnectionListItem, validateGcpConnectionCredentials } from "./gcp";
import { getGitHubConnectionListItem, GitHubConnectionMethod, validateGitHubConnectionCredentials } from "./github";
import {
getHCVaultConnectionListItem,
HCVaultConnectionMethod,
validateHCVaultConnectionCredentials
} from "./hc-vault";
import {
getHumanitecConnectionListItem,
HumanitecConnectionMethod,
@@ -84,6 +89,7 @@ export const listAppConnectionOptions = () => {
getAzureClientSecretsConnectionListItem(),
getWindmillConnectionListItem(),
getAuth0ConnectionListItem(),
getHCVaultConnectionListItem(),
getLdapConnectionListItem(),
getTeamCityConnectionListItem()
].sort((a, b) => a.name.localeCompare(b.name));
@@ -152,6 +158,7 @@ export const validateAppConnectionCredentials = async (
[AppConnection.TerraformCloud]: validateTerraformCloudConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.Auth0]: validateAuth0ConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.Windmill]: validateWindmillConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.HCVault]: validateHCVaultConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.LDAP]: validateLdapConnectionCredentials as TAppConnectionCredentialsValidator,
[AppConnection.TeamCity]: validateTeamCityConnectionCredentials as TAppConnectionCredentialsValidator
};
@@ -186,10 +193,13 @@ export const getAppConnectionMethodName = (method: TAppConnection["method"]) =>
case MsSqlConnectionMethod.UsernameAndPassword:
return "Username & Password";
case WindmillConnectionMethod.AccessToken:
case HCVaultConnectionMethod.AccessToken:
case TeamCityConnectionMethod.AccessToken:
return "Access Token";
case Auth0ConnectionMethod.ClientCredentials:
return "Client Credentials";
case HCVaultConnectionMethod.AppRole:
return "App Role";
case LdapConnectionMethod.SimpleBind:
return "Simple Bind";
default:
@@ -238,6 +248,7 @@ export const TRANSITION_CONNECTION_CREDENTIALS_TO_PLATFORM: Record<
[AppConnection.AzureClientSecrets]: platformManagedCredentialsNotSupported,
[AppConnection.Windmill]: platformManagedCredentialsNotSupported,
[AppConnection.Auth0]: platformManagedCredentialsNotSupported,
[AppConnection.HCVault]: platformManagedCredentialsNotSupported,
[AppConnection.LDAP]: platformManagedCredentialsNotSupported, // we could support this in the future
[AppConnection.TeamCity]: platformManagedCredentialsNotSupported
};

View File

@@ -16,6 +16,7 @@ export const APP_CONNECTION_NAME_MAP: Record<AppConnection, string> = {
[AppConnection.Camunda]: "Camunda",
[AppConnection.Windmill]: "Windmill",
[AppConnection.Auth0]: "Auth0",
[AppConnection.HCVault]: "Hashicorp Vault",
[AppConnection.LDAP]: "LDAP",
[AppConnection.TeamCity]: "TeamCity"
};

View File

@@ -43,6 +43,8 @@ import { ValidateGcpConnectionCredentialsSchema } from "./gcp";
import { gcpConnectionService } from "./gcp/gcp-connection-service";
import { ValidateGitHubConnectionCredentialsSchema } from "./github";
import { githubConnectionService } from "./github/github-connection-service";
import { ValidateHCVaultConnectionCredentialsSchema } from "./hc-vault";
import { hcVaultConnectionService } from "./hc-vault/hc-vault-connection-service";
import { ValidateHumanitecConnectionCredentialsSchema } from "./humanitec";
import { humanitecConnectionService } from "./humanitec/humanitec-connection-service";
import { ValidateLdapConnectionCredentialsSchema } from "./ldap";
@@ -81,6 +83,7 @@ const VALIDATE_APP_CONNECTION_CREDENTIALS_MAP: Record<AppConnection, TValidateAp
[AppConnection.AzureClientSecrets]: ValidateAzureClientSecretsConnectionCredentialsSchema,
[AppConnection.Windmill]: ValidateWindmillConnectionCredentialsSchema,
[AppConnection.Auth0]: ValidateAuth0ConnectionCredentialsSchema,
[AppConnection.HCVault]: ValidateHCVaultConnectionCredentialsSchema,
[AppConnection.LDAP]: ValidateLdapConnectionCredentialsSchema,
[AppConnection.TeamCity]: ValidateTeamCityConnectionCredentialsSchema
};
@@ -459,6 +462,7 @@ export const appConnectionServiceFactory = ({
vercel: vercelConnectionService(connectAppConnectionById),
azureClientSecrets: azureClientSecretsConnectionService(connectAppConnectionById, appConnectionDAL, kmsService),
auth0: auth0ConnectionService(connectAppConnectionById, appConnectionDAL, kmsService),
hcvault: hcVaultConnectionService(connectAppConnectionById),
windmill: windmillConnectionService(connectAppConnectionById),
teamcity: teamcityConnectionService(connectAppConnectionById)
};

View File

@@ -57,6 +57,12 @@ import {
TGitHubConnectionInput,
TValidateGitHubConnectionCredentialsSchema
} from "./github";
import {
THCVaultConnection,
THCVaultConnectionConfig,
THCVaultConnectionInput,
TValidateHCVaultConnectionCredentialsSchema
} from "./hc-vault";
import {
THumanitecConnection,
THumanitecConnectionConfig,
@@ -116,6 +122,7 @@ export type TAppConnection = { id: string } & (
| TAzureClientSecretsConnection
| TWindmillConnection
| TAuth0Connection
| THCVaultConnection
| TLdapConnection
| TTeamCityConnection
);
@@ -140,6 +147,7 @@ export type TAppConnectionInput = { id: string } & (
| TAzureClientSecretsConnectionInput
| TWindmillConnectionInput
| TAuth0ConnectionInput
| THCVaultConnectionInput
| TLdapConnectionInput
| TTeamCityConnectionInput
);
@@ -170,6 +178,7 @@ export type TAppConnectionConfig =
| TVercelConnectionConfig
| TWindmillConnectionConfig
| TAuth0ConnectionConfig
| THCVaultConnectionConfig
| TLdapConnectionConfig
| TTeamCityConnectionConfig;
@@ -189,6 +198,7 @@ export type TValidateAppConnectionCredentialsSchema =
| TValidateTerraformCloudConnectionCredentialsSchema
| TValidateWindmillConnectionCredentialsSchema
| TValidateAuth0ConnectionCredentialsSchema
| TValidateHCVaultConnectionCredentialsSchema
| TValidateLdapConnectionCredentialsSchema
| TValidateTeamCityConnectionCredentialsSchema;

View File

@@ -0,0 +1,4 @@
export enum HCVaultConnectionMethod {
AccessToken = "access-token",
AppRole = "app-role"
}

View File

@@ -0,0 +1,119 @@
import { AxiosError } from "axios";
import { request } from "@app/lib/config/request";
import { BadRequestError } from "@app/lib/errors";
import { removeTrailingSlash } from "@app/lib/fn";
import { blockLocalAndPrivateIpAddresses } from "@app/lib/validator";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import { HCVaultConnectionMethod } from "./hc-vault-connection-enums";
import {
THCVaultConnection,
THCVaultConnectionConfig,
THCVaultMountResponse,
TValidateHCVaultConnectionCredentials
} from "./hc-vault-connection-types";
export const getHCVaultInstanceUrl = async (config: THCVaultConnectionConfig) => {
const instanceUrl = removeTrailingSlash(config.credentials.instanceUrl);
await blockLocalAndPrivateIpAddresses(instanceUrl);
return instanceUrl;
};
export const getHCVaultConnectionListItem = () => ({
name: "HCVault" as const,
app: AppConnection.HCVault as const,
methods: Object.values(HCVaultConnectionMethod) as [
HCVaultConnectionMethod.AccessToken,
HCVaultConnectionMethod.AppRole
]
});
type TokenRespData = {
auth: {
client_token: string;
};
};
export const getHCVaultAccessToken = async (connection: TValidateHCVaultConnectionCredentials) => {
// Return access token directly if not using AppRole method
if (connection.method !== HCVaultConnectionMethod.AppRole) {
return connection.credentials.accessToken;
}
// Generate temporary token for AppRole method
try {
const { instanceUrl, roleId, secretId } = connection.credentials;
const tokenResp = await request.post<TokenRespData>(
`${removeTrailingSlash(instanceUrl)}/v1/auth/approle/login`,
{ role_id: roleId, secret_id: secretId },
{
headers: {
"Content-Type": "application/json",
...(connection.credentials.namespace ? { "X-Vault-Namespace": connection.credentials.namespace } : {})
}
}
);
if (tokenResp.status !== 200) {
throw new BadRequestError({
message: `Unable to validate credentials: Hashicorp Vault responded with a status code of ${tokenResp.status} (${tokenResp.statusText}). Verify credentials and try again.`
});
}
return tokenResp.data.auth.client_token;
} catch (e: unknown) {
throw new BadRequestError({
message: "Unable to validate connection: verify credentials"
});
}
};
export const validateHCVaultConnectionCredentials = async (config: THCVaultConnectionConfig) => {
const instanceUrl = await getHCVaultInstanceUrl(config);
try {
const accessToken = await getHCVaultAccessToken(config);
// Verify token
await request.get(`${instanceUrl}/v1/auth/token/lookup-self`, {
headers: { "X-Vault-Token": accessToken }
});
return config.credentials;
} catch (error: unknown) {
if (error instanceof AxiosError) {
throw new BadRequestError({
message: `Failed to validate credentials: ${error.message || "Unknown error"}`
});
}
throw new BadRequestError({
message: "Unable to validate connection: verify credentials"
});
}
};
export const listHCVaultMounts = async (appConnection: THCVaultConnection) => {
const instanceUrl = await getHCVaultInstanceUrl(appConnection);
const accessToken = await getHCVaultAccessToken(appConnection);
const { data } = await request.get<THCVaultMountResponse>(`${instanceUrl}/v1/sys/mounts`, {
headers: {
"X-Vault-Token": accessToken,
...(appConnection.credentials.namespace ? { "X-Vault-Namespace": appConnection.credentials.namespace } : {})
}
});
const mounts: string[] = [];
// Filter for "kv" version 2 type only
Object.entries(data.data).forEach(([path, mount]) => {
if (mount.type === "kv" && mount.options?.version === "2") {
mounts.push(path);
}
});
return mounts;
};
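As a standalone sketch, the same Vault API flow implemented above (AppRole login followed by a KV v2 mount listing), written directly against the Vault HTTP API; the instance URL, role ID, and secret ID are placeholders rather than values from a stored connection.

import axios from "axios";

const demoListKvV2Mounts = async (instanceUrl: string, roleId: string, secretId: string): Promise<string[]> => {
  // AppRole login, mirroring getHCVaultAccessToken above
  const login = await axios.post<{ auth: { client_token: string } }>(
    `${instanceUrl}/v1/auth/approle/login`,
    { role_id: roleId, secret_id: secretId }
  );

  // Same filtering rule as listHCVaultMounts: keep only KV engines with options.version === "2"
  const { data } = await axios.get<{
    data: Record<string, { type: string; options: { version?: string | null } | null }>;
  }>(`${instanceUrl}/v1/sys/mounts`, {
    headers: { "X-Vault-Token": login.data.auth.client_token }
  });

  return Object.entries(data.data)
    .filter(([, mount]) => mount.type === "kv" && mount.options?.version === "2")
    .map(([path]) => path);
};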

View File

@@ -0,0 +1,100 @@
import z from "zod";
import { AppConnections } from "@app/lib/api-docs";
import { AppConnection } from "@app/services/app-connection/app-connection-enums";
import {
BaseAppConnectionSchema,
GenericCreateAppConnectionFieldsSchema,
GenericUpdateAppConnectionFieldsSchema
} from "@app/services/app-connection/app-connection-schemas";
import { HCVaultConnectionMethod } from "./hc-vault-connection-enums";
const InstanceUrlSchema = z
.string()
.trim()
.min(1, "Instance URL required")
.url("Invalid Instance URL")
.describe(AppConnections.CREDENTIALS.HC_VAULT.instanceUrl);
const NamespaceSchema = z.string().trim().optional().describe(AppConnections.CREDENTIALS.HC_VAULT.namespace);
export const HCVaultConnectionAccessTokenCredentialsSchema = z.object({
instanceUrl: InstanceUrlSchema,
namespace: NamespaceSchema,
accessToken: z
.string()
.trim()
.min(1, "Access Token required")
.describe(AppConnections.CREDENTIALS.HC_VAULT.accessToken)
});
export const HCVaultConnectionAppRoleCredentialsSchema = z.object({
instanceUrl: InstanceUrlSchema,
namespace: NamespaceSchema,
roleId: z.string().trim().min(1, "Role ID required").describe(AppConnections.CREDENTIALS.HC_VAULT.roleId),
secretId: z.string().trim().min(1, "Secret ID required").describe(AppConnections.CREDENTIALS.HC_VAULT.secretId)
});
const BaseHCVaultConnectionSchema = BaseAppConnectionSchema.extend({ app: z.literal(AppConnection.HCVault) });
export const HCVaultConnectionSchema = z.intersection(
BaseHCVaultConnectionSchema,
z.discriminatedUnion("method", [
z.object({
method: z.literal(HCVaultConnectionMethod.AccessToken),
credentials: HCVaultConnectionAccessTokenCredentialsSchema
}),
z.object({
method: z.literal(HCVaultConnectionMethod.AppRole),
credentials: HCVaultConnectionAppRoleCredentialsSchema
})
])
);
export const SanitizedHCVaultConnectionSchema = z.discriminatedUnion("method", [
BaseHCVaultConnectionSchema.extend({
method: z.literal(HCVaultConnectionMethod.AccessToken),
credentials: HCVaultConnectionAccessTokenCredentialsSchema.pick({})
}),
BaseHCVaultConnectionSchema.extend({
method: z.literal(HCVaultConnectionMethod.AppRole),
credentials: HCVaultConnectionAppRoleCredentialsSchema.pick({})
})
]);
export const ValidateHCVaultConnectionCredentialsSchema = z.discriminatedUnion("method", [
z.object({
method: z
.literal(HCVaultConnectionMethod.AccessToken)
.describe(AppConnections.CREATE(AppConnection.HCVault).method),
credentials: HCVaultConnectionAccessTokenCredentialsSchema.describe(
AppConnections.CREATE(AppConnection.HCVault).credentials
)
}),
z.object({
method: z.literal(HCVaultConnectionMethod.AppRole).describe(AppConnections.CREATE(AppConnection.HCVault).method),
credentials: HCVaultConnectionAppRoleCredentialsSchema.describe(
AppConnections.CREATE(AppConnection.HCVault).credentials
)
})
]);
export const CreateHCVaultConnectionSchema = ValidateHCVaultConnectionCredentialsSchema.and(
GenericCreateAppConnectionFieldsSchema(AppConnection.HCVault)
);
export const UpdateHCVaultConnectionSchema = z
.object({
credentials: z
.union([HCVaultConnectionAccessTokenCredentialsSchema, HCVaultConnectionAppRoleCredentialsSchema])
.optional()
.describe(AppConnections.UPDATE(AppConnection.HCVault).credentials)
})
.and(GenericUpdateAppConnectionFieldsSchema(AppConnection.HCVault));
export const HCVaultConnectionListItemSchema = z.object({
name: z.literal("HCVault"),
app: z.literal(AppConnection.HCVault),
methods: z.nativeEnum(HCVaultConnectionMethod).array()
});
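One behaviour worth noting in SanitizedHCVaultConnectionSchema above: credentials are sanitized with .pick({}), which keeps the credentials key but drops every field inside it on parse. A minimal sketch, assuming the schemas are imported from this module:

import { HCVaultConnectionAccessTokenCredentialsSchema } from "./hc-vault-connection-schemas";

// .pick({}) yields an object schema with no keys; unknown keys are stripped on parse.
const StrippedCredentials = HCVaultConnectionAccessTokenCredentialsSchema.pick({});

const result = StrippedCredentials.parse({
  instanceUrl: "https://vault.example.com", // placeholder
  accessToken: "hvs.placeholder"
});
// result is {}: instanceUrl and accessToken never reach API responses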

View File

@@ -0,0 +1,30 @@
import { logger } from "@app/lib/logger";
import { OrgServiceActor } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import { listHCVaultMounts } from "./hc-vault-connection-fns";
import { THCVaultConnection } from "./hc-vault-connection-types";
type TGetAppConnectionFunc = (
app: AppConnection,
connectionId: string,
actor: OrgServiceActor
) => Promise<THCVaultConnection>;
export const hcVaultConnectionService = (getAppConnection: TGetAppConnectionFunc) => {
const listMounts = async (connectionId: string, actor: OrgServiceActor) => {
const appConnection = await getAppConnection(AppConnection.HCVault, connectionId, actor);
try {
const mounts = await listHCVaultMounts(appConnection);
return mounts;
} catch (error) {
logger.error(error, "Failed to establish connection with Hashicorp Vault");
return [];
}
};
return {
listMounts
};
};

View File

@@ -0,0 +1,35 @@
import z from "zod";
import { DiscriminativePick } from "@app/lib/types";
import { AppConnection } from "../app-connection-enums";
import {
CreateHCVaultConnectionSchema,
HCVaultConnectionSchema,
ValidateHCVaultConnectionCredentialsSchema
} from "./hc-vault-connection-schemas";
export type THCVaultConnection = z.infer<typeof HCVaultConnectionSchema>;
export type THCVaultConnectionInput = z.infer<typeof CreateHCVaultConnectionSchema> & {
app: AppConnection.HCVault;
};
export type TValidateHCVaultConnectionCredentialsSchema = typeof ValidateHCVaultConnectionCredentialsSchema;
export type TValidateHCVaultConnectionCredentials = z.infer<typeof ValidateHCVaultConnectionCredentialsSchema>;
export type THCVaultConnectionConfig = DiscriminativePick<THCVaultConnectionInput, "method" | "app" | "credentials"> & {
orgId: string;
};
export type THCVaultMountResponse = {
data: {
[key: string]: {
options: {
version?: string | null;
} | null;
type: string; // We're only interested in "kv" types
};
};
};

View File

@@ -0,0 +1,4 @@
export * from "./hc-vault-connection-enums";
export * from "./hc-vault-connection-fns";
export * from "./hc-vault-connection-schemas";
export * from "./hc-vault-connection-types";

View File

@@ -2,7 +2,7 @@ import bcrypt from "bcrypt";
import jwt from "jsonwebtoken";
import { Knex } from "knex";
import { OrgMembershipRole, TUsers, UserDeviceSchema } from "@app/db/schemas";
import { OrgMembershipRole, OrgMembershipStatus, TableName, TUsers, UserDeviceSchema } from "@app/db/schemas";
import { TAuditLogServiceFactory } from "@app/ee/services/audit-log/audit-log-service";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { isAuthMethodSaml } from "@app/ee/services/permission/permission-fns";
@@ -20,6 +20,8 @@ import { getServerCfg } from "@app/services/super-admin/super-admin-service";
import { TAuthTokenServiceFactory } from "../auth-token/auth-token-service";
import { TokenType } from "../auth-token/auth-token-types";
import { TOrgDALFactory } from "../org/org-dal";
import { getDefaultOrgMembershipRole } from "../org/org-role-fns";
import { TOrgMembershipDALFactory } from "../org-membership/org-membership-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { LoginMethod } from "../super-admin/super-admin-types";
import { TTotpServiceFactory } from "../totp/totp-service";
@@ -48,6 +50,7 @@ type TAuthLoginServiceFactoryDep = {
smtpService: TSmtpService;
totpService: Pick<TTotpServiceFactory, "verifyUserTotp" | "verifyWithUserRecoveryCode">;
auditLogService: Pick<TAuditLogServiceFactory, "createAuditLog">;
orgMembershipDAL: TOrgMembershipDALFactory;
};
export type TAuthLoginFactory = ReturnType<typeof authLoginServiceFactory>;
@@ -56,6 +59,7 @@ export const authLoginServiceFactory = ({
tokenService,
smtpService,
orgDAL,
orgMembershipDAL,
totpService,
auditLogService
}: TAuthLoginServiceFactoryDep) => {
@@ -397,8 +401,8 @@ export const authLoginServiceFactory = ({
}
const shouldCheckMfa = selectedOrg.enforceMfa || user.isMfaEnabled;
const orgMfaMethod = selectedOrg.enforceMfa ? selectedOrg.selectedMfaMethod ?? MfaMethod.EMAIL : undefined;
const userMfaMethod = user.isMfaEnabled ? user.selectedMfaMethod ?? MfaMethod.EMAIL : undefined;
const orgMfaMethod = selectedOrg.enforceMfa ? (selectedOrg.selectedMfaMethod ?? MfaMethod.EMAIL) : undefined;
const userMfaMethod = user.isMfaEnabled ? (user.selectedMfaMethod ?? MfaMethod.EMAIL) : undefined;
const mfaMethod = orgMfaMethod ?? userMfaMethod;
if (shouldCheckMfa && (!decodedToken.isMfaVerified || decodedToken.mfaMethod !== mfaMethod)) {
@@ -569,9 +573,9 @@ export const authLoginServiceFactory = ({
}: TVerifyMfaTokenDTO) => {
const appCfg = getConfig();
const user = await userDAL.findById(userId);
enforceUserLockStatus(Boolean(user.isLocked), user.temporaryLockDateEnd);
try {
enforceUserLockStatus(Boolean(user.isLocked), user.temporaryLockDateEnd);
if (mfaMethod === MfaMethod.EMAIL) {
await tokenService.validateTokenForUser({
type: TokenType.TOKEN_EMAIL_MFA,
@@ -719,6 +723,35 @@ export const authLoginServiceFactory = ({
authMethods: [authMethod],
isGhost: false
});
if (authMethod === AuthMethod.GITHUB && serverCfg.defaultAuthOrgId && !appCfg.isCloud) {
let orgId = "";
const defaultOrg = await orgDAL.findOrgById(serverCfg.defaultAuthOrgId);
if (!defaultOrg) {
throw new BadRequestError({
message: `Failed to find default organization with ID ${serverCfg.defaultAuthOrgId}`
});
}
orgId = defaultOrg.id;
const [orgMembership] = await orgDAL.findMembership({
[`${TableName.OrgMembership}.userId` as "userId"]: user.id,
[`${TableName.OrgMembership}.orgId` as "id"]: orgId
});
if (!orgMembership) {
const { role, roleId } = await getDefaultOrgMembershipRole(defaultOrg.defaultMembershipRole);
await orgMembershipDAL.create({
userId: user.id,
inviteEmail: email,
orgId,
role,
roleId,
status: OrgMembershipStatus.Accepted,
isActive: true
});
}
}
} else {
const isLinkingRequired = !user?.authMethods?.includes(authMethod);
if (isLinkingRequired) {

View File

@@ -9,7 +9,7 @@ import { isAuthMethodSaml } from "@app/ee/services/permission/permission-fns";
import { getConfig } from "@app/lib/config/env";
import { infisicalSymmetricDecrypt, infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { generateUserSrpKeys, getUserPrivateKey } from "@app/lib/crypto/srp";
import { ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
import { BadRequestError, ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
import { getMinExpiresIn } from "@app/lib/fn";
import { isDisposableEmail } from "@app/lib/validator";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
@@ -150,7 +150,8 @@ export const authSignupServiceFactory = ({
encryptedPrivateKeyTag,
ip,
userAgent,
authorization
authorization,
useDefaultOrg
}: TCompleteAccountSignupDTO) => {
const appCfg = getConfig();
const serverCfg = await getServerCfg();
@@ -293,15 +294,24 @@ export const authSignupServiceFactory = ({
});
if (!organizationId) {
const newOrganization = await orgService.createOrganization({
userId: user.id,
userEmail: user.email ?? user.username,
orgName: organizationName
});
let orgId = "";
if (useDefaultOrg && serverCfg.defaultAuthOrgId && !appCfg.isCloud) {
const defaultOrg = await orgDAL.findOrgById(serverCfg.defaultAuthOrgId);
if (!defaultOrg) throw new BadRequestError({ message: "Failed to find default organization" });
orgId = defaultOrg.id;
} else {
if (!organizationName) throw new BadRequestError({ message: "Organization name is required" });
const newOrganization = await orgService.createOrganization({
userId: user.id,
userEmail: user.email ?? user.username,
orgName: organizationName
});
if (!newOrganization) throw new Error("Failed to create organization");
orgId = newOrganization.id;
}
organizationId = newOrganization.id;
organizationId = orgId;
}
const updatedMembersips = await orgDAL.updateMembership(

View File

@@ -12,12 +12,13 @@ export type TCompleteAccountSignupDTO = {
encryptedPrivateKeyTag: string;
salt: string;
verifier: string;
organizationName: string;
organizationName?: string;
providerAuthToken?: string | null;
attributionSource?: string | undefined;
ip: string;
userAgent: string;
authorization: string;
useDefaultOrg?: boolean;
};
export type TCompleteAccountInviteDTO = {

View File

@@ -10,6 +10,7 @@ import { chunkArray } from "@app/lib/fn";
import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TFolderCommitServiceFactory } from "../folder-commit/folder-commit-service";
import { TKmsServiceFactory } from "../kms/kms-service";
import { KmsDataKey } from "../kms/kms-types";
import { TProjectDALFactory } from "../project/project-dal";
@@ -18,6 +19,7 @@ import { TProjectEnvDALFactory } from "../project-env/project-env-dal";
import { TProjectEnvServiceFactory } from "../project-env/project-env-service";
import { TResourceMetadataDALFactory } from "../resource-metadata/resource-metadata-dal";
import { TSecretFolderDALFactory } from "../secret-folder/secret-folder-dal";
import { TSecretFolderVersionDALFactory } from "../secret-folder/secret-folder-version-dal";
import { TSecretTagDALFactory } from "../secret-tag/secret-tag-dal";
import { TSecretV2BridgeDALFactory } from "../secret-v2-bridge/secret-v2-bridge-dal";
import { fnSecretBulkInsert, getAllSecretReferences } from "../secret-v2-bridge/secret-v2-bridge-fns";
@@ -42,6 +44,8 @@ export type TImportDataIntoInfisicalDTO = {
projectService: Pick<TProjectServiceFactory, "createProject">;
projectEnvService: Pick<TProjectEnvServiceFactory, "createEnvironment">;
secretV2BridgeService: Pick<TSecretV2BridgeServiceFactory, "createManySecret">;
folderCommitService: Pick<TFolderCommitServiceFactory, "createCommit">;
folderVersionDAL: Pick<TSecretFolderVersionDALFactory, "create">;
input: TImportInfisicalDataCreate;
};
@@ -507,6 +511,8 @@ export const importDataIntoInfisicalFn = async ({
secretVersionTagDAL,
folderDAL,
resourceMetadataDAL,
folderVersionDAL,
folderCommitService,
input: { data, actor, actorId, actorOrgId, actorAuthMethod }
}: TImportDataIntoInfisicalDTO) => {
// Import data to infisical
@@ -599,6 +605,36 @@ export const importDataIntoInfisicalFn = async ({
tx
);
const newFolderVersion = await folderVersionDAL.create(
{
name: newFolder.name,
envId: newFolder.envId,
version: newFolder.version,
folderId: newFolder.id
},
tx
);
await folderCommitService.createCommit(
{
actor: {
type: actor,
metadata: {
id: actorId
}
},
message: "Changed by external migration",
folderId: parentEnv.rootFolderId,
changes: [
{
type: "add",
folderVersionId: newFolderVersion.id
}
]
},
tx
);
originalToNewFolderId.set(folder.id, {
folderId: newFolder.id,
projectId: parentEnv.projectId
@@ -772,6 +808,7 @@ export const importDataIntoInfisicalFn = async ({
secretVersionDAL,
secretTagDAL,
secretVersionTagDAL,
folderCommitService,
actor: {
type: actor,
actorId

View File

@@ -3,6 +3,7 @@ import { infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { logger } from "@app/lib/logger";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { TFolderCommitServiceFactory } from "../folder-commit/folder-commit-service";
import { TKmsServiceFactory } from "../kms/kms-service";
import { TProjectDALFactory } from "../project/project-dal";
import { TProjectServiceFactory } from "../project/project-service";
@@ -10,6 +11,7 @@ import { TProjectEnvDALFactory } from "../project-env/project-env-dal";
import { TProjectEnvServiceFactory } from "../project-env/project-env-service";
import { TResourceMetadataDALFactory } from "../resource-metadata/resource-metadata-dal";
import { TSecretFolderDALFactory } from "../secret-folder/secret-folder-dal";
import { TSecretFolderVersionDALFactory } from "../secret-folder/secret-folder-version-dal";
import { TSecretTagDALFactory } from "../secret-tag/secret-tag-dal";
import { TSecretV2BridgeDALFactory } from "../secret-v2-bridge/secret-v2-bridge-dal";
import { TSecretV2BridgeServiceFactory } from "../secret-v2-bridge/secret-v2-bridge-service";
@@ -36,6 +38,8 @@ export type TExternalMigrationQueueFactoryDep = {
projectService: Pick<TProjectServiceFactory, "createProject">;
projectEnvService: Pick<TProjectEnvServiceFactory, "createEnvironment">;
secretV2BridgeService: Pick<TSecretV2BridgeServiceFactory, "createManySecret">;
folderCommitService: Pick<TFolderCommitServiceFactory, "createCommit">;
folderVersionDAL: Pick<TSecretFolderVersionDALFactory, "create">;
resourceMetadataDAL: Pick<TResourceMetadataDALFactory, "insertMany" | "delete">;
};
@@ -56,6 +60,8 @@ export const externalMigrationQueueFactory = ({
secretTagDAL,
secretVersionTagDAL,
folderDAL,
folderCommitService,
folderVersionDAL,
resourceMetadataDAL
}: TExternalMigrationQueueFactoryDep) => {
const startImport = async (dto: {
@@ -114,6 +120,8 @@ export const externalMigrationQueueFactory = ({
projectService,
projectEnvService,
secretV2BridgeService,
folderCommitService,
folderVersionDAL,
resourceMetadataDAL
});

View File

@@ -0,0 +1,118 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import {
TableName,
TFolderCheckpointResources,
TFolderCheckpoints,
TSecretFolderVersions,
TSecretVersionsV2
} from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TFolderCheckpointResourcesDALFactory = ReturnType<typeof folderCheckpointResourcesDALFactory>;
export type ResourceWithCheckpointInfo = TFolderCheckpointResources & {
folderCommitId: string;
};
export const folderCheckpointResourcesDALFactory = (db: TDbClient) => {
const folderCheckpointResourcesOrm = ormify(db, TableName.FolderCheckpointResources);
const findByCheckpointId = async (
folderCheckpointId: string,
tx?: Knex
): Promise<
(TFolderCheckpointResources & {
referencedSecretId?: string;
referencedFolderId?: string;
folderName?: string;
folderVersion?: string;
secretKey?: string;
secretVersion?: string;
})[]
> => {
try {
const docs = await (tx || db.replicaNode())<TFolderCheckpointResources>(TableName.FolderCheckpointResources)
.where({ folderCheckpointId })
.leftJoin<TSecretVersionsV2>(
TableName.SecretVersionV2,
`${TableName.FolderCheckpointResources}.secretVersionId`,
`${TableName.SecretVersionV2}.id`
)
.leftJoin<TSecretFolderVersions>(
TableName.SecretFolderVersion,
`${TableName.FolderCheckpointResources}.folderVersionId`,
`${TableName.SecretFolderVersion}.id`
)
.select(selectAllTableCols(TableName.FolderCheckpointResources))
.select(
db.ref("secretId").withSchema(TableName.SecretVersionV2).as("referencedSecretId"),
db.ref("folderId").withSchema(TableName.SecretFolderVersion).as("referencedFolderId"),
db.ref("name").withSchema(TableName.SecretFolderVersion).as("folderName"),
db.ref("version").withSchema(TableName.SecretFolderVersion).as("folderVersion"),
db.ref("key").withSchema(TableName.SecretVersionV2).as("secretKey"),
db.ref("version").withSchema(TableName.SecretVersionV2).as("secretVersion")
);
return docs.map((doc) => ({
...doc,
folderVersion: doc.folderVersion?.toString(),
secretVersion: doc.secretVersion?.toString()
}));
} catch (error) {
throw new DatabaseError({ error, name: "FindByCheckpointId" });
}
};
const findBySecretVersionId = async (secretVersionId: string, tx?: Knex): Promise<ResourceWithCheckpointInfo[]> => {
try {
const docs = await (tx || db.replicaNode())<
TFolderCheckpointResources & Pick<TFolderCheckpoints, "folderCommitId" | "createdAt">
>(TableName.FolderCheckpointResources)
.where({ secretVersionId })
.select(selectAllTableCols(TableName.FolderCheckpointResources))
.join(
TableName.FolderCheckpoint,
`${TableName.FolderCheckpointResources}.folderCheckpointId`,
`${TableName.FolderCheckpoint}.id`
)
.select(
db.ref("folderCommitId").withSchema(TableName.FolderCheckpoint),
db.ref("createdAt").withSchema(TableName.FolderCheckpoint)
);
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindBySecretVersionId" });
}
};
const findByFolderVersionId = async (folderVersionId: string, tx?: Knex): Promise<ResourceWithCheckpointInfo[]> => {
try {
const docs = await (tx || db.replicaNode())<
TFolderCheckpointResources & Pick<TFolderCheckpoints, "folderCommitId" | "createdAt">
>(TableName.FolderCheckpointResources)
.where({ folderVersionId })
.select(selectAllTableCols(TableName.FolderCheckpointResources))
.join(
TableName.FolderCheckpoint,
`${TableName.FolderCheckpointResources}.folderCheckpointId`,
`${TableName.FolderCheckpoint}.id`
)
.select(
db.ref("folderCommitId").withSchema(TableName.FolderCheckpoint),
db.ref("createdAt").withSchema(TableName.FolderCheckpoint)
);
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindByFolderVersionId" });
}
};
return {
...folderCheckpointResourcesOrm,
findByCheckpointId,
findBySecretVersionId,
findByFolderVersionId
};
};
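A minimal usage sketch of the resources DAL above (the helper, its name, and the import path are assumptions for illustration; only TDbClient and the factory itself come from this diff):

import { TDbClient } from "@app/db";

import { folderCheckpointResourcesDALFactory } from "./folder-checkpoint-resources-dal"; // assumed path

// Hypothetical helper: list every secret/folder version pinned by a checkpoint,
// with the joined secret key and folder name resolved by findByCheckpointId.
export const listCheckpointResources = async (db: TDbClient, folderCheckpointId: string) => {
  const folderCheckpointResourcesDAL = folderCheckpointResourcesDALFactory(db);
  return folderCheckpointResourcesDAL.findByCheckpointId(folderCheckpointId);
};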

View File

@@ -0,0 +1,138 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName, TFolderCheckpoints, TFolderCommits } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { buildFindFilter, ormify, selectAllTableCols } from "@app/lib/knex";
export type TFolderCheckpointDALFactory = ReturnType<typeof folderCheckpointDALFactory>;
type CheckpointWithCommitInfo = TFolderCheckpoints & {
actorMetadata: unknown;
actorType: string;
message?: string | null;
commitDate: Date;
folderId: string;
};
export const folderCheckpointDALFactory = (db: TDbClient) => {
const folderCheckpointOrm = ormify(db, TableName.FolderCheckpoint);
const findByCommitId = async (folderCommitId: string, tx?: Knex): Promise<TFolderCheckpoints | undefined> => {
try {
const doc = await (tx || db.replicaNode())<TFolderCheckpoints>(TableName.FolderCheckpoint)
.where({ folderCommitId })
.select(selectAllTableCols(TableName.FolderCheckpoint))
.first();
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "FindByCommitId" });
}
};
const findByFolderId = async (folderId: string, limit?: number, tx?: Knex): Promise<CheckpointWithCommitInfo[]> => {
try {
let query = (tx || db.replicaNode())(TableName.FolderCheckpoint)
.join<TFolderCommits>(
TableName.FolderCommit,
`${TableName.FolderCheckpoint}.folderCommitId`,
`${TableName.FolderCommit}.id`
)
// eslint-disable-next-line @typescript-eslint/no-misused-promises
.where(buildFindFilter({ folderId }, TableName.FolderCommit))
.select(selectAllTableCols(TableName.FolderCheckpoint))
.select(
db.ref("actorMetadata").withSchema(TableName.FolderCommit),
db.ref("actorType").withSchema(TableName.FolderCommit),
db.ref("message").withSchema(TableName.FolderCommit),
db.ref("createdAt").withSchema(TableName.FolderCommit).as("commitDate"),
db.ref("folderId").withSchema(TableName.FolderCommit)
)
.orderBy(`${TableName.FolderCheckpoint}.createdAt`, "desc");
if (limit !== undefined) {
query = query.limit(limit);
}
return await query;
} catch (error) {
throw new DatabaseError({ error, name: "FindByFolderId" });
}
};
const findLatestByFolderId = async (folderId: string, tx?: Knex): Promise<CheckpointWithCommitInfo | undefined> => {
try {
const doc = await (tx || db.replicaNode())(TableName.FolderCheckpoint)
.join<TFolderCommits>(
TableName.FolderCommit,
`${TableName.FolderCheckpoint}.folderCommitId`,
`${TableName.FolderCommit}.id`
)
// eslint-disable-next-line @typescript-eslint/no-misused-promises
.where(buildFindFilter({ folderId }, TableName.FolderCommit))
.select(selectAllTableCols(TableName.FolderCheckpoint))
.select(
db.ref("actorMetadata").withSchema(TableName.FolderCommit),
db.ref("actorType").withSchema(TableName.FolderCommit),
db.ref("message").withSchema(TableName.FolderCommit),
db.ref("createdAt").withSchema(TableName.FolderCommit).as("commitDate"),
db.ref("folderId").withSchema(TableName.FolderCommit)
)
.orderBy(`${TableName.FolderCheckpoint}.createdAt`, "desc")
.first();
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "FindLatestByFolderId" });
}
};
const findNearestCheckpoint = async (
folderCommitId: string,
tx?: Knex
): Promise<(CheckpointWithCommitInfo & { commitId: number }) | undefined> => {
try {
// First, get the commit info to find the folder ID and commit sequence number
const commit = await (tx || db.replicaNode())(TableName.FolderCommit)
.where({ id: folderCommitId })
.select("commitId", "folderId")
.first();
if (!commit) {
return undefined;
}
// Get the checkpoint with the highest commitId that's still less than or equal to our commit
const nearestCheckpoint = await (tx || db.replicaNode())(TableName.FolderCheckpoint)
.join<TFolderCommits>(
TableName.FolderCommit,
`${TableName.FolderCheckpoint}.folderCommitId`,
`${TableName.FolderCommit}.id`
)
.where(`${TableName.FolderCommit}.folderId`, commit.folderId)
.where(`${TableName.FolderCommit}.commitId`, "<=", commit.commitId.toString())
.select(selectAllTableCols(TableName.FolderCheckpoint))
.select(
db.ref("actorMetadata").withSchema(TableName.FolderCommit),
db.ref("actorType").withSchema(TableName.FolderCommit),
db.ref("message").withSchema(TableName.FolderCommit),
db.ref("commitId").withSchema(TableName.FolderCommit),
db.ref("createdAt").withSchema(TableName.FolderCommit).as("commitDate"),
db.ref("folderId").withSchema(TableName.FolderCommit)
)
.orderBy(`${TableName.FolderCommit}.commitId`, "desc")
.first();
return nearestCheckpoint;
} catch (error) {
throw new DatabaseError({ error, name: "FindNearestCheckpoint" });
}
};
return {
...folderCheckpointOrm,
findByCommitId,
findByFolderId,
findLatestByFolderId,
findNearestCheckpoint
};
};
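Point-in-time reads lean on findNearestCheckpoint: it returns the checkpoint whose commit number is closest to, but not past, the target commit. A minimal sketch of that lookup (the wiring and names below are assumptions; the factory and its method come from this file):

import { TDbClient } from "@app/db";

import { folderCheckpointDALFactory } from "./folder-checkpoint-dal"; // assumed path

// Hypothetical helper: resolve the checkpoint to start replaying from for a target commit.
// Returns undefined when no checkpoint exists yet, i.e. replay must start from the first commit.
export const getReplayBase = async (db: TDbClient, targetFolderCommitId: string) => {
  const folderCheckpointDAL = folderCheckpointDALFactory(db);
  return folderCheckpointDAL.findNearestCheckpoint(targetFolderCommitId);
};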

View File

@@ -0,0 +1,137 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import {
TableName,
TFolderCommitChanges,
TFolderCommits,
TSecretFolderVersions,
TSecretVersionsV2
} from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TFolderCommitChangesDALFactory = ReturnType<typeof folderCommitChangesDALFactory>;
type CommitChangeWithCommitInfo = TFolderCommitChanges & {
actorMetadata: unknown;
actorType: string;
message?: string | null;
folderId: string;
folderName?: string;
folderVersion?: string;
secretKey?: string;
secretVersion?: string;
secretId?: string;
folderChangeId?: string;
versions?: {
secretKey?: string;
secretComment?: string;
skipMultilineEncoding?: boolean | null;
secretReminderRepeatDays?: number | null;
secretReminderNote?: string | null;
metadata?: unknown;
tags?: string[] | null;
secretReminderRecipients?: string[] | null;
secretValue?: string;
name?: string;
}[];
};
export const folderCommitChangesDALFactory = (db: TDbClient) => {
const folderCommitChangesOrm = ormify(db, TableName.FolderCommitChanges);
const findByCommitId = async (folderCommitId: string, tx?: Knex): Promise<CommitChangeWithCommitInfo[]> => {
try {
const docs = await (tx || db.replicaNode())<TFolderCommitChanges>(TableName.FolderCommitChanges)
.where({ folderCommitId })
.leftJoin<TFolderCommits>(
TableName.FolderCommit,
`${TableName.FolderCommitChanges}.folderCommitId`,
`${TableName.FolderCommit}.id`
)
.leftJoin<TSecretVersionsV2>(
TableName.SecretVersionV2,
`${TableName.FolderCommitChanges}.secretVersionId`,
`${TableName.SecretVersionV2}.id`
)
.leftJoin<TSecretFolderVersions>(
TableName.SecretFolderVersion,
`${TableName.FolderCommitChanges}.folderVersionId`,
`${TableName.SecretFolderVersion}.id`
)
.select(selectAllTableCols(TableName.FolderCommitChanges))
.select(
db.ref("name").withSchema(TableName.SecretFolderVersion).as("folderName"),
db.ref("folderId").withSchema(TableName.SecretFolderVersion).as("folderChangeId"),
db.ref("version").withSchema(TableName.SecretFolderVersion).as("folderVersion"),
db.ref("key").withSchema(TableName.SecretVersionV2).as("secretKey"),
db.ref("version").withSchema(TableName.SecretVersionV2).as("secretVersion"),
db.ref("secretId").withSchema(TableName.SecretVersionV2),
db.ref("actorMetadata").withSchema(TableName.FolderCommit),
db.ref("actorType").withSchema(TableName.FolderCommit),
db.ref("message").withSchema(TableName.FolderCommit),
db.ref("createdAt").withSchema(TableName.FolderCommit),
db.ref("folderId").withSchema(TableName.FolderCommit)
);
return docs.map((doc) => ({
...doc,
folderVersion: doc.folderVersion?.toString(),
secretVersion: doc.secretVersion?.toString()
}));
} catch (error) {
throw new DatabaseError({ error, name: "FindByCommitId" });
}
};
const findBySecretVersionId = async (secretVersionId: string, tx?: Knex): Promise<CommitChangeWithCommitInfo[]> => {
try {
const docs = await (tx || db.replicaNode())<
TFolderCommitChanges &
Pick<TFolderCommits, "actorMetadata" | "actorType" | "message" | "createdAt" | "folderId">
>(TableName.FolderCommitChanges)
.where({ secretVersionId })
.select(selectAllTableCols(TableName.FolderCommitChanges))
.join(TableName.FolderCommit, `${TableName.FolderCommitChanges}.folderCommitId`, `${TableName.FolderCommit}.id`)
.select(
db.ref("actorMetadata").withSchema(TableName.FolderCommit),
db.ref("actorType").withSchema(TableName.FolderCommit),
db.ref("message").withSchema(TableName.FolderCommit),
db.ref("createdAt").withSchema(TableName.FolderCommit),
db.ref("folderId").withSchema(TableName.FolderCommit)
);
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindBySecretVersionId" });
}
};
const findByFolderVersionId = async (folderVersionId: string, tx?: Knex): Promise<CommitChangeWithCommitInfo[]> => {
try {
const docs = await (tx || db.replicaNode())<
TFolderCommitChanges &
Pick<TFolderCommits, "actorMetadata" | "actorType" | "message" | "createdAt" | "folderId">
>(TableName.FolderCommitChanges)
.where({ folderVersionId })
.select(selectAllTableCols(TableName.FolderCommitChanges))
.join(TableName.FolderCommit, `${TableName.FolderCommitChanges}.folderCommitId`, `${TableName.FolderCommit}.id`)
.select(
db.ref("actorMetadata").withSchema(TableName.FolderCommit),
db.ref("actorType").withSchema(TableName.FolderCommit),
db.ref("message").withSchema(TableName.FolderCommit),
db.ref("createdAt").withSchema(TableName.FolderCommit),
db.ref("folderId").withSchema(TableName.FolderCommit)
);
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindByFolderVersionId" });
}
};
return {
...folderCommitChangesOrm,
findByCommitId,
findBySecretVersionId,
findByFolderVersionId
};
};
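The changes DAL answers the reverse lookup: which commits referenced a given secret or folder version. A small sketch under the same assumptions as above:

import { TDbClient } from "@app/db";

import { folderCommitChangesDALFactory } from "./folder-commit-changes-dal"; // assumed path

// Hypothetical helper: audit every commit change that points at a given secret version.
export const findChangesForSecretVersion = async (db: TDbClient, secretVersionId: string) => {
  const folderCommitChangesDAL = folderCommitChangesDALFactory(db);
  return folderCommitChangesDAL.findBySecretVersionId(secretVersionId);
};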

View File

@@ -0,0 +1,361 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import {
TableName,
TFolderCommitChanges,
TFolderCommits,
TSecretFolderVersions,
TSecretVersionsV2
} from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { buildFindFilter, ormify, selectAllTableCols } from "@app/lib/knex";
export type TFolderCommitDALFactory = ReturnType<typeof folderCommitDALFactory>;
export const folderCommitDALFactory = (db: TDbClient) => {
const folderCommitOrm = ormify(db, TableName.FolderCommit);
const { delete: deleteOp, deleteById, ...restOfOrm } = folderCommitOrm;
const findByFolderId = async (folderId: string, tx?: Knex): Promise<TFolderCommits[]> => {
try {
const trx = tx || db.replicaNode();
// First, get all folder commits
const folderCommits = await trx(TableName.FolderCommit)
.where({ folderId })
.select("*")
.orderBy("createdAt", "desc");
if (folderCommits.length === 0) return [];
// Get all commit IDs
const commitIds = folderCommits.map((commit) => commit.id);
// Then get all related changes
const changes = await trx(TableName.FolderCommitChanges).whereIn("folderCommitId", commitIds).select("*");
const changesMap = changes.reduce(
(acc, change) => {
const { folderCommitId } = change;
if (!acc[folderCommitId]) acc[folderCommitId] = [];
acc[folderCommitId].push(change);
return acc;
},
{} as Record<string, TFolderCommitChanges[]>
);
return folderCommits.map((commit) => ({
...commit,
changes: changesMap[commit.id] || []
}));
} catch (error) {
throw new DatabaseError({ error, name: "FindByFolderId" });
}
};
const findLatestCommit = async (folderId: string, tx?: Knex): Promise<TFolderCommits | undefined> => {
try {
const doc = await (tx || db.replicaNode())(TableName.FolderCommit)
.where({ folderId })
.select(selectAllTableCols(TableName.FolderCommit))
.orderBy("commitId", "desc")
.first();
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "FindLatestCommit" });
}
};
const findLatestCommitByFolderIds = async (folderIds: string[], tx?: Knex): Promise<TFolderCommits[] | undefined> => {
try {
// First get max commitId for each folderId
const maxCommitIdSubquery = (tx || db.replicaNode())(TableName.FolderCommit)
.select("folderId")
.max("commitId as maxCommitId")
.whereIn("folderId", folderIds)
.groupBy("folderId");
// Join with main table to get complete records for each max commitId
const docs = await (tx || db.replicaNode())(TableName.FolderCommit)
.select(selectAllTableCols(TableName.FolderCommit))
// eslint-disable-next-line func-names
.join<TFolderCommits>(maxCommitIdSubquery.as("latest"), function () {
this.on(`${TableName.FolderCommit}.folderId`, "=", "latest.folderId").andOn(
`${TableName.FolderCommit}.commitId`,
"=",
"latest.maxCommitId"
);
});
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindLatestCommitByFolderIds" });
}
};
const findLatestEnvCommit = async (envId: string, tx?: Knex): Promise<TFolderCommits | undefined> => {
try {
const doc = await (tx || db.replicaNode())(TableName.FolderCommit)
.where(`${TableName.FolderCommit}.envId`, "=", envId)
.select(selectAllTableCols(TableName.FolderCommit))
.orderBy("commitId", "desc")
.first();
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "FindLatestEnvCommit" });
}
};
const findMultipleLatestCommits = async (folderIds: string[], tx?: Knex): Promise<TFolderCommits[]> => {
try {
const knexInstance = tx || db.replicaNode();
// Get the latest commitId for each folderId
const subquery = knexInstance(TableName.FolderCommit)
.whereIn("folderId", folderIds)
.groupBy("folderId")
.select("folderId")
.max("commitId as maxCommitId");
// Then fetch the complete rows matching those latest commits
const docs = await knexInstance(TableName.FolderCommit)
// eslint-disable-next-line func-names
.innerJoin<TFolderCommits>(subquery.as("latest"), function () {
this.on(`${TableName.FolderCommit}.folderId`, "=", "latest.folderId").andOn(
`${TableName.FolderCommit}.commitId`,
"=",
"latest.maxCommitId"
);
})
.select(selectAllTableCols(TableName.FolderCommit));
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindMultipleLatestCommits" });
}
};
const getNumberOfCommitsSince = async (folderId: string, folderCommitId: string, tx?: Knex): Promise<number> => {
try {
const referencedCommit = await (tx || db.replicaNode())(TableName.FolderCommit)
.where({ id: folderCommitId })
.select("commitId")
.first();
if (referencedCommit?.commitId) {
const doc = await (tx || db.replicaNode())(TableName.FolderCommit)
.where({ folderId })
.where("commitId", ">", referencedCommit.commitId)
.count();
return Number(doc?.[0].count);
}
return 0;
} catch (error) {
throw new DatabaseError({ error, name: "GetNumberOfCommitsSince" });
}
};
const getEnvNumberOfCommitsSince = async (envId: string, folderCommitId: string, tx?: Knex): Promise<number> => {
try {
const referencedCommit = await (tx || db.replicaNode())(TableName.FolderCommit)
.where({ id: folderCommitId })
.select("commitId")
.first();
if (referencedCommit?.commitId) {
const doc = await (tx || db.replicaNode())(TableName.FolderCommit)
.where(`${TableName.FolderCommit}.envId`, "=", envId)
.where("commitId", ">", referencedCommit.commitId)
.count();
return Number(doc?.[0].count);
}
return 0;
} catch (error) {
throw new DatabaseError({ error, name: "GetEnvNumberOfCommitsSince" });
}
};
const findCommitsToRecreate = async (
folderId: string,
targetCommitNumber: number,
checkpointCommitNumber: number,
tx?: Knex
): Promise<
(TFolderCommits & {
changes: (TFolderCommitChanges & {
referencedSecretId?: string;
referencedFolderId?: string;
folderName?: string;
folderVersion?: string;
secretKey?: string;
secretVersion?: string;
})[];
})[]
> => {
try {
// First get all the commits in the range
const commits = await (tx || db.replicaNode())(TableName.FolderCommit)
// eslint-disable-next-line @typescript-eslint/no-misused-promises
.where(buildFindFilter({ folderId }, TableName.FolderCommit))
.andWhere(`${TableName.FolderCommit}.commitId`, ">", checkpointCommitNumber)
.andWhere(`${TableName.FolderCommit}.commitId`, "<=", targetCommitNumber)
.select(selectAllTableCols(TableName.FolderCommit))
.orderBy(`${TableName.FolderCommit}.commitId`, "asc");
// If no commits found, return empty array
if (!commits.length) {
return [];
}
// Get all the commit IDs
const commitIds = commits.map((commit) => commit.id);
// Get all changes for these commits in a single query
const allChanges = await (tx || db.replicaNode())(TableName.FolderCommitChanges)
.whereIn("folderCommitId", commitIds)
.leftJoin<TSecretVersionsV2>(
TableName.SecretVersionV2,
`${TableName.FolderCommitChanges}.secretVersionId`,
`${TableName.SecretVersionV2}.id`
)
.leftJoin<TSecretFolderVersions>(
TableName.SecretFolderVersion,
`${TableName.FolderCommitChanges}.folderVersionId`,
`${TableName.SecretFolderVersion}.id`
)
.select(selectAllTableCols(TableName.FolderCommitChanges))
.select(
db.ref("secretId").withSchema(TableName.SecretVersionV2).as("referencedSecretId"),
db.ref("folderId").withSchema(TableName.SecretFolderVersion).as("referencedFolderId"),
db.ref("name").withSchema(TableName.SecretFolderVersion).as("folderName"),
db.ref("version").withSchema(TableName.SecretFolderVersion).as("folderVersion"),
db.ref("key").withSchema(TableName.SecretVersionV2).as("secretKey"),
db.ref("version").withSchema(TableName.SecretVersionV2).as("secretVersion")
);
// Organize changes by commit ID
const changesByCommitId = allChanges.reduce(
(acc, change) => {
if (!acc[change.folderCommitId]) {
acc[change.folderCommitId] = [];
}
acc[change.folderCommitId].push(change);
return acc;
},
{} as Record<string, TFolderCommitChanges[]>
);
// Attach changes to each commit
return commits.map((commit) => ({
...commit,
changes: changesByCommitId[commit.id] || []
}));
} catch (error) {
throw new DatabaseError({ error, name: "FindCommitsToRecreate" });
}
};
const findLatestCommitBetween = async ({
folderId,
startCommitId,
endCommitId,
tx
}: {
folderId: string;
startCommitId?: string;
endCommitId: string;
tx?: Knex;
}): Promise<TFolderCommits | undefined> => {
try {
const doc = await (tx || db.replicaNode())(TableName.FolderCommit)
.where("commitId", "<=", endCommitId)
.where({ folderId })
.where((qb) => {
if (startCommitId) {
void qb.where("commitId", ">=", startCommitId);
}
})
.select(selectAllTableCols(TableName.FolderCommit))
.orderBy("commitId", "desc")
.first();
return doc;
} catch (error) {
throw new DatabaseError({ error, name: "FindLatestCommitBetween" });
}
};
const findAllCommitsBetween = async ({
envId,
startCommitId,
endCommitId,
tx
}: {
folderId?: string;
envId?: string;
startCommitId?: string;
endCommitId: string;
tx?: Knex;
}): Promise<TFolderCommits[]> => {
try {
const docs = await (tx || db.replicaNode())(TableName.FolderCommit)
.where("commitId", "<=", endCommitId)
.where((qb) => {
if (envId) {
void qb.where(`${TableName.FolderCommit}.envId`, "=", envId);
}
if (startCommitId) {
void qb.where("commitId", ">=", startCommitId);
}
})
.select(selectAllTableCols(TableName.FolderCommit))
.orderBy("commitId", "desc");
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindAllCommitsBetween" });
}
};
const findAllFolderCommitsAfter = async ({
envId,
startCommitId,
tx
}: {
folderId?: string;
envId?: string;
startCommitId?: string;
tx?: Knex;
}): Promise<TFolderCommits[]> => {
try {
const docs = await (tx || db.replicaNode())(TableName.FolderCommit)
.where((qb) => {
if (envId) {
void qb.where(`${TableName.FolderCommit}.envId`, "=", envId);
}
if (startCommitId) {
void qb.where("commitId", ">=", startCommitId);
}
})
.select(selectAllTableCols(TableName.FolderCommit))
.orderBy("commitId", "desc");
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindAllFolderCommitsAfter" });
}
};
return {
...restOfOrm,
findByFolderId,
findLatestCommit,
getNumberOfCommitsSince,
findCommitsToRecreate,
findMultipleLatestCommits,
findAllCommitsBetween,
findLatestCommitBetween,
findLatestEnvCommit,
getEnvNumberOfCommitsSince,
findLatestCommitByFolderIds,
findAllFolderCommitsAfter
};
};
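Put together with the checkpoint DAL, the commit DAL provides the raw ingredients for folder reconstruction: find the nearest checkpoint at or before the target commit, then replay the commits between them. The sketch below only illustrates that flow; the helper, its name, and the numeric conversions are assumptions, while the DAL methods come from the files in this diff:

import { TDbClient } from "@app/db";

import { folderCheckpointDALFactory } from "../folder-checkpoint/folder-checkpoint-dal"; // assumed paths
import { folderCommitDALFactory } from "./folder-commit-dal";

// Hypothetical helper: compute a replay plan for a target commit.
export const getReplayPlan = async (db: TDbClient, targetCommitId: string) => {
  const folderCommitDAL = folderCommitDALFactory(db);
  const folderCheckpointDAL = folderCheckpointDALFactory(db);

  const targetCommit = await folderCommitDAL.findById(targetCommitId);
  if (!targetCommit) throw new Error("Folder commit not found");

  // Nearest checkpoint at or before the target commit; may be undefined if none exists yet.
  const checkpoint = await folderCheckpointDAL.findNearestCheckpoint(targetCommitId);
  const checkpointCommitNumber = checkpoint ? Number(checkpoint.commitId) : 0;

  // Commits (with their joined changes) that still need to be re-applied on top of the checkpoint.
  const commitsToReplay = await folderCommitDAL.findCommitsToRecreate(
    targetCommit.folderId,
    Number(targetCommit.commitId),
    checkpointCommitNumber
  );

  return { checkpoint, commitsToReplay };
};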

View File

@@ -0,0 +1,180 @@
import { TSecretFolders } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env";
import { logger } from "@app/lib/logger";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { TFolderTreeCheckpointDALFactory } from "../folder-tree-checkpoint/folder-tree-checkpoint-dal";
import { TFolderTreeCheckpointResourcesDALFactory } from "../folder-tree-checkpoint-resources/folder-tree-checkpoint-resources-dal";
import { TSecretFolderDALFactory } from "../secret-folder/secret-folder-dal";
import { TFolderCommitDALFactory } from "./folder-commit-dal";
type TFolderCommitQueueServiceFactoryDep = {
queueService: TQueueServiceFactory;
folderTreeCheckpointDAL: Pick<
TFolderTreeCheckpointDALFactory,
"create" | "findLatestByEnvId" | "findNearestCheckpoint"
>;
folderTreeCheckpointResourcesDAL: Pick<
TFolderTreeCheckpointResourcesDALFactory,
"insertMany" | "findByTreeCheckpointId"
>;
folderCommitDAL: Pick<
TFolderCommitDALFactory,
"findLatestEnvCommit" | "getEnvNumberOfCommitsSince" | "findMultipleLatestCommits"
>;
folderDAL: Pick<TSecretFolderDALFactory, "findByEnvId">;
};
export type TFolderCommitQueueServiceFactory = ReturnType<typeof folderCommitQueueServiceFactory>;
export const folderCommitQueueServiceFactory = ({
queueService,
folderTreeCheckpointDAL,
folderTreeCheckpointResourcesDAL,
folderCommitDAL,
folderDAL
}: TFolderCommitQueueServiceFactoryDep) => {
const appCfg = getConfig();
const scheduleTreeCheckpoint = async (envId: string) => {
await queueService.queue(
QueueName.FolderTreeCheckpoint,
QueueJobs.CreateFolderTreeCheckpoint,
{ envId },
{
jobId: envId,
backoff: {
type: "exponential",
delay: 3000
},
removeOnFail: {
count: 3
},
removeOnComplete: true
}
);
};
const schedulePeriodicTreeCheckpoint = async (envId: string, intervalMs: number) => {
await queueService.queue(
QueueName.FolderTreeCheckpoint,
QueueJobs.CreateFolderTreeCheckpoint,
{ envId },
{
jobId: `periodic-${envId}`,
repeat: {
every: intervalMs
},
backoff: {
type: "exponential",
delay: 3000
},
removeOnFail: false,
removeOnComplete: false
}
);
};
const cancelScheduledTreeCheckpoint = async (envId: string) => {
await queueService.stopJobById(QueueName.FolderTreeCheckpoint, envId);
await queueService.stopRepeatableJobByJobId(QueueName.FolderTreeCheckpoint, `periodic-${envId}`);
};
// Sort folders by hierarchy (copied from the source code)
const sortFoldersByHierarchy = (folders: TSecretFolders[]) => {
const childrenMap = new Map<string, TSecretFolders[]>();
const allFolderIds = new Set<string>();
folders.forEach((folder) => {
if (folder.id) allFolderIds.add(folder.id);
});
folders.forEach((folder) => {
if (folder.parentId) {
const children = childrenMap.get(folder.parentId) || [];
children.push(folder);
childrenMap.set(folder.parentId, children);
}
});
const rootFolders = folders.filter((folder) => !folder.parentId || !allFolderIds.has(folder.parentId));
const result = [];
let currentLevel = rootFolders;
while (currentLevel.length > 0) {
result.push(...currentLevel);
const nextLevel = [];
for (const folder of currentLevel) {
if (folder.id) {
const children = childrenMap.get(folder.id) || [];
nextLevel.push(...children);
}
}
currentLevel = nextLevel;
}
return result;
};
queueService.start(QueueName.FolderTreeCheckpoint, async (job) => {
try {
if (job.name === QueueJobs.CreateFolderTreeCheckpoint) {
const { envId } = job.data as { envId: string };
logger.info("Folder tree checkpoint creation started:", envId, job.id);
const latestTreeCheckpoint = await folderTreeCheckpointDAL.findLatestByEnvId(envId);
const latestCommit = await folderCommitDAL.findLatestEnvCommit(envId);
if (!latestCommit) {
logger.info(`Latest commit ID not found for envId ${envId}`);
return;
}
const latestCommitId = latestCommit.id;
if (latestTreeCheckpoint) {
const commitsSinceLastCheckpoint = await folderCommitDAL.getEnvNumberOfCommitsSince(
envId,
latestTreeCheckpoint.folderCommitId
);
if (commitsSinceLastCheckpoint < Number(appCfg.PIT_TREE_CHECKPOINT_WINDOW)) {
logger.info(
`Commits since last checkpoint ${commitsSinceLastCheckpoint} is less than ${appCfg.PIT_TREE_CHECKPOINT_WINDOW}`
);
return;
}
}
const folders = await folderDAL.findByEnvId(envId);
const sortedFolders = sortFoldersByHierarchy(folders);
const filteredFoldersIds = sortedFolders.filter((folder) => !folder.isReserved).map((folder) => folder.id);
const folderCommits = await folderCommitDAL.findMultipleLatestCommits(filteredFoldersIds);
const folderTreeCheckpoint = await folderTreeCheckpointDAL.create({
folderCommitId: latestCommitId
});
await folderTreeCheckpointResourcesDAL.insertMany(
folderCommits.map((folderCommit) => ({
folderTreeCheckpointId: folderTreeCheckpoint.id,
folderId: folderCommit.folderId,
folderCommitId: folderCommit.id
}))
);
logger.info("Folder tree checkpoint created successfully:", folderTreeCheckpoint.id);
}
} catch (error) {
logger.error(error, "Error creating folder tree checkpoint:");
throw error;
}
});
return {
scheduleTreeCheckpoint,
schedulePeriodicTreeCheckpoint,
cancelScheduledTreeCheckpoint
};
};
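The queue service above debounces tree checkpoints per environment (the jobId is the envId) and only materializes one once the commit count exceeds PIT_TREE_CHECKPOINT_WINDOW. A hedged call-site sketch, assuming an already constructed service instance and an import path that may differ in the actual codebase:

import { TFolderCommitQueueServiceFactory } from "./folder-commit-queue"; // assumed path

// Hypothetical call site: after a batch of folder/secret writes in an environment,
// enqueue a deduplicated checkpoint job and a periodic re-evaluation every hour.
export const scheduleEnvCheckpoints = async (
  folderCommitQueueService: TFolderCommitQueueServiceFactory,
  envId: string
) => {
  await folderCommitQueueService.scheduleTreeCheckpoint(envId);
  await folderCommitQueueService.schedulePeriodicTreeCheckpoint(envId, 60 * 60 * 1000);
};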

View File

@@ -0,0 +1,608 @@
import { Knex } from "knex";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { TSecretFolderVersions, TSecretVersionsV2 } from "@app/db/schemas";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { ActorType } from "../auth/auth-type";
import { ChangeType, folderCommitServiceFactory, TFolderCommitServiceFactory } from "./folder-commit-service";
// Mock config
vi.mock("@app/lib/config/env", () => ({
getConfig: () => ({
PIT_CHECKPOINT_WINDOW: 5,
PIT_TREE_CHECKPOINT_WINDOW: 10
})
}));
// Mock logger
vi.mock("@app/lib/logger", () => ({
logger: {
info: vi.fn(),
error: vi.fn()
}
}));
describe("folderCommitServiceFactory", () => {
// Properly type the mock functions
type TransactionCallback<T> = (trx: Knex) => Promise<T>;
// Mock dependencies
const mockFolderCommitDAL = {
create: vi.fn().mockResolvedValue({}),
findById: vi.fn().mockResolvedValue({}),
findByFolderId: vi.fn().mockResolvedValue([]),
findLatestCommit: vi.fn().mockResolvedValue({}),
transaction: vi.fn().mockImplementation(<T>(callback: TransactionCallback<T>) => callback({} as Knex)),
getNumberOfCommitsSince: vi.fn().mockResolvedValue(0),
getEnvNumberOfCommitsSince: vi.fn().mockResolvedValue(0),
findCommitsToRecreate: vi.fn().mockResolvedValue([]),
findMultipleLatestCommits: vi.fn().mockResolvedValue([]),
findLatestCommitBetween: vi.fn().mockResolvedValue({}),
findAllCommitsBetween: vi.fn().mockResolvedValue([]),
findLatestEnvCommit: vi.fn().mockResolvedValue({}),
findLatestCommitByFolderIds: vi.fn().mockResolvedValue({})
};
const mockFolderCommitChangesDAL = {
create: vi.fn().mockResolvedValue({}),
findByCommitId: vi.fn().mockResolvedValue([]),
insertMany: vi.fn().mockResolvedValue([])
};
const mockFolderCheckpointDAL = {
create: vi.fn().mockResolvedValue({}),
findByFolderId: vi.fn().mockResolvedValue([]),
findLatestByFolderId: vi.fn().mockResolvedValue(null),
findNearestCheckpoint: vi.fn().mockResolvedValue({})
};
const mockFolderCheckpointResourcesDAL = {
insertMany: vi.fn().mockResolvedValue([]),
findByCheckpointId: vi.fn().mockResolvedValue([])
};
const mockFolderTreeCheckpointDAL = {
create: vi.fn().mockResolvedValue({}),
findByProjectId: vi.fn().mockResolvedValue([]),
findLatestByProjectId: vi.fn().mockResolvedValue({}),
findNearestCheckpoint: vi.fn().mockResolvedValue({}),
findLatestByEnvId: vi.fn().mockResolvedValue({})
};
const mockFolderTreeCheckpointResourcesDAL = {
insertMany: vi.fn().mockResolvedValue([]),
findByTreeCheckpointId: vi.fn().mockResolvedValue([])
};
const mockUserDAL = {
findById: vi.fn().mockResolvedValue({})
};
const mockIdentityDAL = {
findById: vi.fn().mockResolvedValue({})
};
const mockFolderDAL = {
findByParentId: vi.fn().mockResolvedValue([]),
findByProjectId: vi.fn().mockResolvedValue([]),
deleteById: vi.fn().mockResolvedValue({}),
create: vi.fn().mockResolvedValue({}),
updateById: vi.fn().mockResolvedValue({}),
update: vi.fn().mockResolvedValue({}),
find: vi.fn().mockResolvedValue([]),
findById: vi.fn().mockResolvedValue({}),
findByEnvId: vi.fn().mockResolvedValue([]),
findFoldersByRootAndIds: vi.fn().mockResolvedValue([])
};
const mockFolderVersionDAL = {
findLatestFolderVersions: vi.fn().mockResolvedValue({}),
findById: vi.fn().mockResolvedValue({}),
deleteById: vi.fn().mockResolvedValue({}),
create: vi.fn().mockResolvedValue({}),
updateById: vi.fn().mockResolvedValue({}),
find: vi.fn().mockResolvedValue([]),
findByIdsWithLatestVersion: vi.fn().mockResolvedValue({})
};
const mockSecretVersionV2BridgeDAL = {
findLatestVersionByFolderId: vi.fn().mockResolvedValue([]),
findById: vi.fn().mockResolvedValue({}),
deleteById: vi.fn().mockResolvedValue({}),
create: vi.fn().mockResolvedValue({}),
updateById: vi.fn().mockResolvedValue({}),
find: vi.fn().mockResolvedValue([]),
findByIdsWithLatestVersion: vi.fn().mockResolvedValue({}),
findLatestVersionMany: vi.fn().mockResolvedValue({})
};
const mockSecretV2BridgeDAL = {
deleteById: vi.fn().mockResolvedValue({}),
create: vi.fn().mockResolvedValue({}),
updateById: vi.fn().mockResolvedValue({}),
update: vi.fn().mockResolvedValue({}),
insertMany: vi.fn().mockResolvedValue([]),
invalidateSecretCacheByProjectId: vi.fn().mockResolvedValue({})
};
const mockProjectDAL = {
findById: vi.fn().mockResolvedValue({})
};
const mockFolderCommitQueueService = {
scheduleTreeCheckpoint: vi.fn().mockResolvedValue({})
};
const mockPermissionService = {
getProjectPermission: vi.fn().mockResolvedValue({})
};
let folderCommitService: TFolderCommitServiceFactory;
beforeEach(() => {
vi.clearAllMocks();
folderCommitService = folderCommitServiceFactory({
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderCommitDAL: mockFolderCommitDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderCommitChangesDAL: mockFolderCommitChangesDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderCheckpointDAL: mockFolderCheckpointDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderCheckpointResourcesDAL: mockFolderCheckpointResourcesDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderTreeCheckpointDAL: mockFolderTreeCheckpointDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderTreeCheckpointResourcesDAL: mockFolderTreeCheckpointResourcesDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
userDAL: mockUserDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
identityDAL: mockIdentityDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderDAL: mockFolderDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
folderVersionDAL: mockFolderVersionDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
secretVersionV2BridgeDAL: mockSecretVersionV2BridgeDAL,
projectDAL: mockProjectDAL,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
secretV2BridgeDAL: mockSecretV2BridgeDAL,
folderCommitQueueService: mockFolderCommitQueueService,
// @ts-expect-error - Mock implementation doesn't need all interface methods for testing
permissionService: mockPermissionService
});
});
afterEach(() => {
vi.resetAllMocks();
});
describe("createCommit", () => {
it("should successfully create a commit with user actor", async () => {
// Arrange
const userData = { id: "user-id", username: "testuser" };
const folderData = { id: "folder-id", envId: "env-id" };
const commitData = { id: "commit-id", folderId: "folder-id" };
mockUserDAL.findById.mockResolvedValue(userData);
mockFolderDAL.findById.mockResolvedValue(folderData);
mockFolderCommitDAL.create.mockResolvedValue(commitData);
mockFolderCheckpointDAL.findLatestByFolderId.mockResolvedValue(null);
mockFolderCommitDAL.findLatestCommit.mockResolvedValue({ id: "latest-commit-id" });
mockFolderDAL.findByParentId.mockResolvedValue([]);
mockSecretVersionV2BridgeDAL.findLatestVersionByFolderId.mockResolvedValue([]);
const data = {
actor: {
type: ActorType.USER,
metadata: { id: userData.id }
},
message: "Test commit",
folderId: folderData.id,
changes: [
{
type: "add",
secretVersionId: "secret-version-1"
}
]
};
// Act
const result = await folderCommitService.createCommit(data);
// Assert
expect(mockUserDAL.findById).toHaveBeenCalledWith(userData.id, undefined);
expect(mockFolderDAL.findById).toHaveBeenCalledWith(folderData.id, undefined);
expect(mockFolderCommitDAL.create).toHaveBeenCalledWith(
expect.objectContaining({
actorType: ActorType.USER,
// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
actorMetadata: expect.objectContaining({ name: userData.username }),
message: data.message,
folderId: data.folderId,
envId: folderData.envId
}),
undefined
);
expect(mockFolderCommitChangesDAL.insertMany).toHaveBeenCalledWith(
expect.arrayContaining([
expect.objectContaining({
folderCommitId: commitData.id,
changeType: data.changes[0].type,
secretVersionId: data.changes[0].secretVersionId
})
]),
undefined
);
expect(mockFolderCommitQueueService.scheduleTreeCheckpoint).toHaveBeenCalledWith(folderData.envId);
expect(result).toEqual(commitData);
});
it("should successfully create a commit with identity actor", async () => {
// Arrange
const identityData = { id: "identity-id", name: "testidentity" };
const folderData = { id: "folder-id", envId: "env-id" };
const commitData = { id: "commit-id", folderId: "folder-id" };
mockIdentityDAL.findById.mockResolvedValue(identityData);
mockFolderDAL.findById.mockResolvedValue(folderData);
mockFolderCommitDAL.create.mockResolvedValue(commitData);
mockFolderCheckpointDAL.findLatestByFolderId.mockResolvedValue(null);
mockFolderCommitDAL.findLatestCommit.mockResolvedValue({ id: "latest-commit-id" });
mockFolderDAL.findByParentId.mockResolvedValue([]);
mockSecretVersionV2BridgeDAL.findLatestVersionByFolderId.mockResolvedValue([]);
const data = {
actor: {
type: ActorType.IDENTITY,
metadata: { id: identityData.id }
},
message: "Test commit",
folderId: folderData.id,
changes: [
{
type: "add",
folderVersionId: "folder-version-1"
}
]
};
// Act
const result = await folderCommitService.createCommit(data);
// Assert
expect(mockIdentityDAL.findById).toHaveBeenCalledWith(identityData.id, undefined);
expect(mockFolderDAL.findById).toHaveBeenCalledWith(folderData.id, undefined);
expect(mockFolderCommitDAL.create).toHaveBeenCalledWith(
expect.objectContaining({
actorType: ActorType.IDENTITY,
// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
actorMetadata: expect.objectContaining({ name: identityData.name }),
message: data.message,
folderId: data.folderId,
envId: folderData.envId
}),
undefined
);
expect(mockFolderCommitChangesDAL.insertMany).toHaveBeenCalledWith(
expect.arrayContaining([
expect.objectContaining({
folderCommitId: commitData.id,
changeType: data.changes[0].type,
folderVersionId: data.changes[0].folderVersionId
})
]),
undefined
);
expect(mockFolderCommitQueueService.scheduleTreeCheckpoint).toHaveBeenCalledWith(folderData.envId);
expect(result).toEqual(commitData);
});
it("should throw NotFoundError when folder does not exist", async () => {
// Arrange
mockFolderDAL.findById.mockResolvedValue(null);
const data = {
actor: {
type: ActorType.PLATFORM
},
message: "Test commit",
folderId: "non-existent-folder",
changes: []
};
// Act & Assert
await expect(folderCommitService.createCommit(data)).rejects.toThrow(NotFoundError);
expect(mockFolderDAL.findById).toHaveBeenCalledWith("non-existent-folder", undefined);
});
});
describe("addCommitChange", () => {
it("should successfully add a change to an existing commit", async () => {
// Arrange
const commitData = { id: "commit-id", folderId: "folder-id" };
const changeData = { id: "change-id", folderCommitId: "commit-id" };
mockFolderCommitDAL.findById.mockResolvedValue(commitData);
mockFolderCommitChangesDAL.create.mockResolvedValue(changeData);
const data = {
folderCommitId: commitData.id,
changeType: "add",
secretVersionId: "secret-version-1"
};
// Act
const result = await folderCommitService.addCommitChange(data);
// Assert
expect(mockFolderCommitDAL.findById).toHaveBeenCalledWith(commitData.id, undefined);
expect(mockFolderCommitChangesDAL.create).toHaveBeenCalledWith(data, undefined);
expect(result).toEqual(changeData);
});
it("should throw BadRequestError when neither secretVersionId nor folderVersionId is provided", async () => {
// Arrange
const data = {
folderCommitId: "commit-id",
changeType: "add"
};
// Act & Assert
await expect(folderCommitService.addCommitChange(data)).rejects.toThrow(BadRequestError);
});
it("should throw NotFoundError when commit does not exist", async () => {
// Arrange
mockFolderCommitDAL.findById.mockResolvedValue(null);
const data = {
folderCommitId: "non-existent-commit",
changeType: "add",
secretVersionId: "secret-version-1"
};
// Act & Assert
await expect(folderCommitService.addCommitChange(data)).rejects.toThrow(NotFoundError);
expect(mockFolderCommitDAL.findById).toHaveBeenCalledWith("non-existent-commit", undefined);
});
});
// Note: reconstructFolderState is an internal function not exposed in the public API
// We'll test it indirectly through compareFolderStates
describe("compareFolderStates", () => {
it("should mark all resources as creates when currentCommitId is not provided", async () => {
// Arrange
const targetCommitId = "target-commit-id";
const targetCommit = { id: targetCommitId, commitId: 1, folderId: "folder-id" };
mockFolderCommitDAL.findById.mockResolvedValue(targetCommit);
// Mock how compareFolderStates would process the results internally
mockFolderCheckpointDAL.findNearestCheckpoint.mockResolvedValue({ id: "checkpoint-id", commitId: "hash-0" });
mockFolderCheckpointResourcesDAL.findByCheckpointId.mockResolvedValue([
{ secretVersionId: "secret-version-1", referencedSecretId: "secret-1" },
{ folderVersionId: "folder-version-1", referencedFolderId: "folder-1" }
]);
mockFolderCommitDAL.findCommitsToRecreate.mockResolvedValue([]);
// Act
const result = await folderCommitService.compareFolderStates({
targetCommitId
});
// Assert
expect(mockFolderCommitDAL.findById).toHaveBeenCalledWith(targetCommitId, undefined);
// Verify we get resources marked as create
expect(result).toEqual(
expect.arrayContaining([
expect.objectContaining({
changeType: "create",
commitId: targetCommit.commitId
})
])
);
});
});
describe("createFolderCheckpoint", () => {
it("should successfully create a checkpoint when force is true", async () => {
// Arrange
const folderCommitId = "commit-id";
const folderId = "folder-id";
const checkpointData = { id: "checkpoint-id", folderCommitId };
mockFolderDAL.findByParentId.mockResolvedValue([{ id: "subfolder-id" }]);
mockFolderVersionDAL.findLatestFolderVersions.mockResolvedValue({ "subfolder-id": { id: "folder-version-1" } });
mockSecretVersionV2BridgeDAL.findLatestVersionByFolderId.mockResolvedValue([{ id: "secret-version-1" }]);
mockFolderCheckpointDAL.create.mockResolvedValue(checkpointData);
// Act
const result = await folderCommitService.createFolderCheckpoint({
folderId,
folderCommitId,
force: true
});
// Assert
expect(mockFolderCheckpointDAL.create).toHaveBeenCalledWith({ folderCommitId }, undefined);
expect(mockFolderCheckpointResourcesDAL.insertMany).toHaveBeenCalled();
expect(result).toBe(folderCommitId);
});
});
describe("deepRollbackFolder", () => {
it("should throw NotFoundError when commit doesn't exist", async () => {
// Arrange
const targetCommitId = "non-existent-commit";
const envId = "env-id";
const actorId = "user-id";
const actorType = ActorType.USER;
const projectId = "project-id";
mockFolderCommitDAL.findById.mockResolvedValue(null);
// Act & Assert
await expect(
folderCommitService.deepRollbackFolder(targetCommitId, envId, actorId, actorType, projectId)
).rejects.toThrow(NotFoundError);
});
});
describe("createFolderTreeCheckpoint", () => {
it("should create a tree checkpoint when checkpoint window is exceeded", async () => {
// Arrange
const envId = "env-id";
const folderCommitId = "commit-id";
const latestCommit = { id: folderCommitId };
const latestTreeCheckpoint = { id: "tree-checkpoint-id", folderCommitId: "old-commit-id" };
const folders = [
{ id: "folder-1", isReserved: false },
{ id: "folder-2", isReserved: false },
{ id: "folder-3", isReserved: true } // Reserved folders should be filtered out
];
const folderCommits = [
{ folderId: "folder-1", id: "commit-1" },
{ folderId: "folder-2", id: "commit-2" }
];
const treeCheckpoint = { id: "new-tree-checkpoint-id" };
mockFolderCommitDAL.findLatestEnvCommit.mockResolvedValue(latestCommit);
mockFolderTreeCheckpointDAL.findLatestByEnvId.mockResolvedValue(latestTreeCheckpoint);
mockFolderCommitDAL.getEnvNumberOfCommitsSince.mockResolvedValue(15); // More than PIT_TREE_CHECKPOINT_WINDOW (10)
mockFolderDAL.findByEnvId.mockResolvedValue(folders);
mockFolderCommitDAL.findMultipleLatestCommits.mockResolvedValue(folderCommits);
mockFolderTreeCheckpointDAL.create.mockResolvedValue(treeCheckpoint);
// Act
await folderCommitService.createFolderTreeCheckpoint(envId);
// Assert
expect(mockFolderCommitDAL.findLatestEnvCommit).toHaveBeenCalledWith(envId, undefined);
expect(mockFolderTreeCheckpointDAL.create).toHaveBeenCalledWith({ folderCommitId }, undefined);
});
});
describe("applyFolderStateDifferences", () => {
it("should process changes correctly", async () => {
// Arrange
const folderId = "folder-id";
const projectId = "project-id";
const actorId = "user-id";
const actorType = ActorType.USER;
const differences = [
{ type: "secret", id: "secret-1", versionId: "v1", changeType: ChangeType.CREATE, commitId: 1 },
{ type: "folder", id: "folder-1", versionId: "v2", changeType: ChangeType.UPDATE, commitId: 1 }
];
const secretVersions = {
"secret-1": {
id: "secret-version-1",
createdAt: new Date(),
updatedAt: new Date(),
type: "shared",
folderId: "folder-1",
secretId: "secret-1",
version: 1,
key: "SECRET_KEY",
encryptedValue: Buffer.from("encrypted"),
encryptedComment: Buffer.from("comment"),
skipMultilineEncoding: false,
userId: "user-1",
envId: "env-1",
metadata: {}
} as TSecretVersionsV2
};
const folderVersions = {
"folder-1": {
folderId: "folder-1",
version: 1,
name: "Test Folder",
envId: "env-1"
} as TSecretFolderVersions
};
// Mock folder lookup for the folder being processed
mockFolderDAL.findById.mockImplementation((id) => {
if (id === folderId) {
return Promise.resolve({ id: folderId, envId: "env-1" });
}
return Promise.resolve(null);
});
// Mock latest commit lookup
mockFolderCommitDAL.findLatestCommit.mockImplementation((id) => {
if (id === folderId) {
return Promise.resolve({ id: "latest-commit-id", folderId });
}
return Promise.resolve(null);
});
// Make sure findByParentId returns an array, not undefined
mockFolderDAL.findByParentId.mockResolvedValue([]);
// Make sure other required functions return appropriate values
mockFolderCheckpointDAL.findLatestByFolderId.mockResolvedValue(null);
mockSecretVersionV2BridgeDAL.findLatestVersionByFolderId.mockResolvedValue([]);
// These mocks need to return objects with an id field
mockSecretVersionV2BridgeDAL.findByIdsWithLatestVersion.mockResolvedValue(Object.values(secretVersions));
mockFolderVersionDAL.findByIdsWithLatestVersion.mockResolvedValue(Object.values(folderVersions));
mockSecretV2BridgeDAL.insertMany.mockResolvedValue([{ id: "new-secret-1" }]);
mockSecretVersionV2BridgeDAL.create.mockResolvedValue({ id: "new-secret-version-1" });
mockFolderDAL.updateById.mockResolvedValue({ id: "updated-folder-1" });
mockFolderVersionDAL.create.mockResolvedValue({ id: "new-folder-version-1" });
mockFolderCommitDAL.create.mockResolvedValue({ id: "new-commit-id" });
mockSecretVersionV2BridgeDAL.findLatestVersionMany.mockResolvedValue([
{
id: "secret-version-1",
createdAt: new Date(),
updatedAt: new Date(),
type: "shared",
folderId: "folder-1",
secretId: "secret-1",
version: 1,
key: "SECRET_KEY",
encryptedValue: Buffer.from("encrypted"),
encryptedComment: Buffer.from("comment"),
skipMultilineEncoding: false,
userId: "user-1",
envId: "env-1",
metadata: {}
}
]);
// Mock transaction
mockFolderCommitDAL.transaction.mockImplementation(<T>(callback: TransactionCallback<T>) => callback({} as Knex));
// Act
const result = await folderCommitService.applyFolderStateDifferences({
differences,
actorInfo: {
actorType,
actorId,
message: "Applying changes"
},
folderId,
projectId,
reconstructNewFolders: false
});
// Assert
expect(mockFolderCommitDAL.create).toHaveBeenCalled();
expect(mockSecretV2BridgeDAL.invalidateSecretCacheByProjectId).toHaveBeenCalledWith(projectId);
// Check that we got the right counts
expect(result).toEqual({
secretChangesCount: 1,
folderChangesCount: 1,
totalChanges: 2
});
});
});
});

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff