Compare commits


133 Commits

Author SHA1 Message Date
Daniel Hougaard
09887a7405 Update ConfiguredIntegrationItem.tsx 2024-09-16 23:05:38 +04:00
Daniel Hougaard
38ee3a005e Requested changes 2024-09-16 22:26:36 +04:00
Daniel Hougaard
74653e7ed1 Minor ui improvements 2024-09-16 13:56:23 +04:00
Daniel Hougaard
8a0b1bb427 Update IntegrationAuditLogsSection.tsx 2024-09-15 20:34:08 +04:00
Daniel Hougaard
1f6faadf81 Cleanup 2024-09-15 20:24:23 +04:00
Daniel Hougaard
8f3b7e1698 feat: audit logs event metadata & remapping support 2024-09-15 20:01:43 +04:00
Daniel Hougaard
24c460c695 feat: integration details page 2024-09-15 20:00:43 +04:00
Daniel Hougaard
8acceab1e7 fix: updated last used to be considered last success sync 2024-09-15 19:57:56 +04:00
Daniel Hougaard
d60aba9339 fix: added missing integration metadata attributes 2024-09-15 19:57:36 +04:00
Daniel Hougaard
3a228f7521 feat: improved audit logs 2024-09-15 19:57:02 +04:00
Daniel Hougaard
3f7ac0f142 feat: integration synced log event 2024-09-15 19:52:43 +04:00
Daniel Hougaard
63cf535ebb feat: platform-level actor for logs 2024-09-15 19:52:13 +04:00
Daniel Hougaard
69a2a46c47 Update organization-router.ts 2024-09-15 19:51:54 +04:00
Daniel Hougaard
d081077273 feat: integration sync logs 2024-09-15 19:51:38 +04:00
Daniel Hougaard
75034f9350 feat: more expendable audit logs 2024-09-15 19:50:03 +04:00
Daniel Hougaard
eacd7b0c6a feat: made audit logs more searchable with better filters 2024-09-15 19:49:35 +04:00
Daniel Hougaard
5bad77083c feat: more expendable audit logs 2024-09-15 19:49:07 +04:00
Daniel Hougaard
1025759efb Feat: Integration Audit Logs 2024-09-13 21:00:47 +04:00
Daniel Hougaard
5e5ab29ab9 Feat: Integration UI improvements 2024-09-12 13:09:00 +04:00
Daniel Hougaard
5150c102e6 Merge pull request #2380 from Infisical/daniel/invite-multiple-members-to-project
feat: invite multiple members to projects with role assignment
2024-09-12 11:16:41 +04:00
Daniel Hougaard
41c29d41e1 Update AddMemberModal.tsx 2024-09-12 11:13:39 +04:00
Daniel Hougaard
4de33190a9 Rebase fixes 2024-09-12 11:12:45 +04:00
Daniel Hougaard
7cfecb39e4 Update AddMemberModal.tsx 2024-09-12 11:08:25 +04:00
Daniel Hougaard
7524b83c29 Delete project-membership-fns.ts 2024-09-12 11:08:25 +04:00
Daniel Hougaard
7a41cdf51b Fix: type errors 2024-09-12 11:08:25 +04:00
Daniel Hougaard
17d99cb2cf fix: circular dependencies and query invalidation 2024-09-12 11:07:41 +04:00
Daniel Hougaard
bd0da0ff74 Update AddMemberModal.tsx 2024-09-12 11:03:20 +04:00
Daniel Hougaard
d2a54234f4 Rebase with Akhi 2024-09-12 11:03:20 +04:00
Daniel Hougaard
626262461a feat: assign roles when inviting members to project 2024-09-12 11:03:20 +04:00
Daniel Hougaard
93ba29e57f Feat: Invite multiple users to project with multiple roles 2024-09-12 11:03:20 +04:00
Daniel Hougaard
1581aa088d Update org-admin-service.ts 2024-09-12 11:03:20 +04:00
Daniel Hougaard
ceab951bca feat: remove project role from workspace user encryption computation 2024-09-12 11:03:20 +04:00
Daniel Hougaard
2e3dcc50ae API doc 2024-09-12 11:03:20 +04:00
Meet Shah
7b04c08fc7 Merge pull request #2412 from meetcshah19/meet/fix-org-selection
fix: redirect to selected org if already present
2024-09-12 10:14:56 +05:30
Maidul Islam
70842b8e5e Merge pull request #2411 from akhilmhdh/debug/entra-saml-logpoint
feat: debug added log points for entra failing saml
2024-09-11 19:00:36 -04:00
Meet
36e3e4c1b5 fix: redirect to selected org if already present 2024-09-12 03:37:55 +05:30
=
1384c8e855 feat: debug added log points for entra failing saml 2024-09-12 00:19:16 +05:30
Maidul Islam
f213c75ede Merge pull request #2410 from Infisical/misc/slack-integration-doc-and-ui-updates
misc: added cloud users guide for slack and channel dropdown fix
2024-09-11 14:36:15 -04:00
Sheen Capadngan
6ade708e19 misc: added cloud users guide for slack and other ui updates 2024-09-12 02:23:57 +08:00
Daniel Hougaard
ce3af41ebc Merge pull request #2388 from Infisical/daniel/permission-visualization
feat: user details page audit logs & groups visualization
2024-09-11 21:45:15 +04:00
Tuan Dang
e442f10fa5 Fix merge conflicts 2024-09-11 10:38:47 -07:00
Tuan Dang
2e8ad18285 Merge remote-tracking branch 'origin' into daniel/permission-visualization 2024-09-11 10:32:17 -07:00
Tuan Dang
f03ca7f916 Minor adjustments 2024-09-11 10:30:16 -07:00
Meet Shah
af1905a39e Merge pull request #2406 from meetcshah19/meet/fix-email-capitalization
Send lower case emails to backend
2024-09-11 20:07:14 +05:30
Maidul Islam
1321aa712f Merge pull request #2358 from Infisical/feat/native-slack-integration
feat: native slack integration
2024-09-11 09:36:25 -04:00
Daniel Hougaard
5ad00130ea Merge pull request #2384 from akhilmhdh/feat/org-project-invite
Manager users without waiting for confirmation of mail
2024-09-11 13:06:28 +04:00
Daniel Hougaard
ea5e8e29e6 Requested changes 2024-09-11 12:45:14 +04:00
Sheen Capadngan
e7f89bdfef doc: add note for private channels 2024-09-11 13:50:40 +08:00
Sheen Capadngan
d23a7e41f3 misc: addressed comments 2024-09-11 13:29:43 +08:00
=
52a885716d feat: changes on review comments 2024-09-11 10:46:49 +05:30
Meet Shah
3fc907f076 fix: send lower case emails to backend 2024-09-11 04:38:00 +05:30
Maidul Islam
eaf10483c0 Merge pull request #2405 from Infisical/fix-azure-saml-map-docs
Fix Stated Map for Azure SAML Attributes
2024-09-10 16:46:40 -04:00
Tuan Dang
dcd0234fb5 Fix stated map for azure saml attributes 2024-09-10 13:16:36 -07:00
Daniel Hougaard
4dda270e8e Requested changes 2024-09-10 23:29:23 +04:00
Maidul Islam
c1cb85b49f Merge pull request #2404 from akhilmhdh/fix/secret-reference-pass
Secret reference skip if not found
2024-09-10 13:17:56 -04:00
=
ed71e651f6 fix: secret reference skip if not found 2024-09-10 22:23:40 +05:30
Sheen
1a11dd954b Merge pull request #2395 from Infisical/misc/allow-wildcard-san-value
misc: allow wildcard SAN domain value for certificates
2024-09-11 00:19:43 +08:00
BlackMagiq
5d3574d3f6 Merge pull request #2397 from Infisical/cert-template-enforcement
Certificate Template Enforcement Option + PKI UX Improvements
2024-09-10 09:19:37 -07:00
Sheen Capadngan
aa42aa05aa misc: updated docs 2024-09-11 00:13:44 +08:00
Sheen Capadngan
7a36badb23 misc: addressed review comments 2024-09-11 00:11:19 +08:00
Tuan Dang
9ce6fd3f8e Made required adjustments based on review 2024-09-10 08:18:31 -07:00
Maidul Islam
a549c8b9e3 Merge pull request #2353 from Infisical/daniel/cli-run-watch-mode
feat(cli): `run` command watch mode
2024-09-10 10:39:06 -04:00
Maidul Islam
1bc1feb843 Merge pull request #2399 from sanyarajan/patch-1
Remove reference to Okta in Azure SAML setup
2024-09-10 08:46:36 -04:00
Maidul Islam
80ca115ccd Merge pull request #2396 from Infisical/daniel/cli-stale-session
fix: stale session after logging into CLI
2024-09-10 08:27:16 -04:00
Sanya Rajan
5a6bb90870 Remove reference to Okta in Azure SAML setup 2024-09-10 12:25:11 +02:00
Akhil Mohan
de7a693a6a Merge pull request #2391 from Infisical/daniel/rabbitmq-dynamic-secrets
feat(dynamic-secrets): Rabbit MQ
2024-09-10 12:54:56 +05:30
Daniel Hougaard
096417281e Update rabbit-mq.ts 2024-09-10 11:21:52 +04:00
Daniel Hougaard
763a96faf8 Update rabbit-mq.ts 2024-09-10 11:21:52 +04:00
Daniel Hougaard
870eaf9301 docs(dynamic-secrets): rabbit mq 2024-09-10 11:21:52 +04:00
Daniel Hougaard
10abf192a1 chore(docs): cleanup incorrectly formatted images 2024-09-10 11:21:52 +04:00
Daniel Hougaard
508f697bdd feat(dynamic-secrets): RabbitMQ 2024-09-10 11:21:52 +04:00
Daniel Hougaard
8ea8a6f72e Fix: ElasticSearch provider typo 2024-09-10 11:17:35 +04:00
Sheen
ea3b3c5cec Merge pull request #2394 from Infisical/misc/update-kms-of-existing-params-for-integration
misc: ensure that selected kms key in aws param integration is followed
2024-09-10 12:51:06 +08:00
Tuan Dang
a8fd83652d Update docs for PKI issuer secret target output 2024-09-09 19:55:02 -07:00
Maidul Islam
45f3675337 Merge pull request #2389 from Infisical/misc/support-glob-patterns-oidc
misc: support glob patterns for OIDC
2024-09-09 18:22:51 -04:00
Tuan Dang
87a9a87dcd Show cert template ID on manage policies modal 2024-09-09 14:35:46 -07:00
Tuan Dang
0b882ece8c Update certificate / template docs 2024-09-09 14:22:26 -07:00
Tuan Dang
e005e94165 Merge remote-tracking branch 'origin' into cert-template-enforcement 2024-09-09 12:47:06 -07:00
Tuan Dang
0e07eaaa01 Fix cert template enforcement migration check 2024-09-09 12:45:33 -07:00
Tuan Dang
e10e313af3 Finish cert template enforcement 2024-09-09 12:42:56 -07:00
Sheen Capadngan
cf42279e5b misc: allow wildcard san domain value for certificates 2024-09-10 01:20:31 +08:00
Sheen Capadngan
fbc4b47198 misc: ensure that selected kms key in aws param integration is applied 2024-09-09 22:23:22 +08:00
=
e7191c2f71 feat: made project role multi support for org invite 2024-09-09 16:17:59 +05:30
Sheen Capadngan
8e68d21115 misc: support glob patterns for oidc 2024-09-09 17:17:12 +08:00
Daniel Hougaard
372b6cbaea fix: audit log fixes 2024-09-09 10:42:39 +04:00
Daniel Hougaard
26add7bfd1 fix: remove delete project membership option 2024-09-09 10:42:10 +04:00
Daniel Hougaard
f3d207ab5c feat: better user visualization 2024-09-08 20:20:34 +04:00
Daniel Hougaard
e1cd632546 improvements to user group ui 2024-09-08 20:20:10 +04:00
Daniel Hougaard
655ee4f118 Update mutations.tsx 2024-09-08 20:19:50 +04:00
Daniel Hougaard
34a2452bf5 feat: fetch all user group memberships 2024-09-08 20:19:10 +04:00
Daniel Hougaard
7846a81636 chore: new group with project memberships type 2024-09-08 19:28:17 +04:00
Daniel Hougaard
6bdf3455f5 Update mutations.tsx 2024-09-08 19:27:31 +04:00
Daniel Hougaard
556ae168dd feat: fetch specific user group memberships 2024-09-08 19:25:48 +04:00
Daniel Hougaard
7b19d2aa6a feat: audit logs on organization-level support 2024-09-08 19:24:04 +04:00
Daniel Hougaard
bda9bb3d61 fix: rename list audit logs and include project 2024-09-08 19:21:17 +04:00
Daniel Hougaard
4b66a9343c feat: audit logs section 2024-09-08 19:20:32 +04:00
Daniel Hougaard
4930d7fc02 feat: user groups section 2024-09-08 19:20:18 +04:00
Daniel Hougaard
ad644db512 feat: audit logs on organization-level 2024-09-08 19:19:55 +04:00
Sheen Capadngan
ffaf145317 misc: removed unused table usage 2024-09-08 17:04:41 +08:00
Sheen Capadngan
17b0d0081d misc: moved away from dedicated slack admin config 2024-09-08 17:00:50 +08:00
Sheen Capadngan
ecf177fecc misc: added root workflow integration structure 2024-09-08 13:49:32 +08:00
=
eb7c804bb9 feat(ui): made corresponding changes in api call made from frontend 2024-09-06 23:33:57 +05:30
=
9d7bfae519 feat: made default role on project invite as no access to org level 2024-09-06 23:33:12 +05:30
=
1292b5bf56 feat(api): manage users in org and project level without waiting for confirmation 2024-09-06 23:31:55 +05:30
Sheen Capadngan
dbc5b5a3d1 doc: native slack integration 2024-09-05 18:28:38 +08:00
Sheen Capadngan
1bd66a614b misc: added channels count validator 2024-09-05 02:36:27 +08:00
Sheen Capadngan
802a9cf83c misc: formatting changes 2024-09-05 01:42:33 +08:00
Sheen Capadngan
9e95fdbb58 misc: added proper error message hints 2024-09-05 01:20:12 +08:00
Sheen Capadngan
803f56cfe5 misc: added placeholder 2024-09-05 00:46:00 +08:00
Sheen Capadngan
b163a6c5ad feat: integration to access request approval 2024-09-05 00:42:21 +08:00
Sheen Capadngan
ddc119ceb6 Merge remote-tracking branch 'origin/main' into feat/native-slack-integration 2024-09-05 00:36:44 +08:00
Sheen Capadngan
09e621539e misc: finalized labels 2024-09-04 23:54:19 +08:00
Daniel Hougaard
5e0b78b104 Requested changes 2024-09-04 19:34:51 +04:00
Sheen Capadngan
27852607d1 Merge remote-tracking branch 'origin/main' into feat/native-slack-integration 2024-09-04 23:10:15 +08:00
Sheen Capadngan
956719f797 feat: admin slack configuration 2024-09-04 23:06:30 +08:00
Sheen Capadngan
71b8c59050 feat: slack channel suggestions 2024-09-04 18:03:07 +08:00
Sheen Capadngan
15c5fe4095 misc: slack integration reinstall 2024-09-04 15:44:58 +08:00
Daniel Hougaard
91ebcca0fd Update run.go 2024-09-04 10:44:39 +04:00
Daniel Hougaard
0826b40e2a Fixes and requested changes 2024-09-04 10:18:17 +04:00
Daniel Hougaard
911b62c63a Update run.go 2024-09-04 10:05:57 +04:00
Sheen Capadngan
5343c7af00 misc: added auto redirect to workflow settings tab 2024-09-04 02:22:53 +08:00
Sheen Capadngan
8c03c160a9 misc: implemented secret approval request and project audit logs 2024-09-04 01:48:08 +08:00
Sheen Capadngan
604b0467f9 feat: finalized integration selection in project settings 2024-09-04 00:34:03 +08:00
Sheen Capadngan
a2b555dd81 feat: finished org-level integration management flow 2024-09-03 22:08:31 +08:00
Sheen Capadngan
9120367562 misc: audit logs for slack integration management 2024-09-02 23:15:00 +08:00
Sheen Capadngan
f509464947 slack integration reinstall 2024-09-02 21:05:30 +08:00
Sheen Capadngan
07fd489982 feat: slack integration deletion 2024-09-02 20:34:13 +08:00
Sheen Capadngan
f6d3831d6d feat: finished slack integration update 2024-09-02 20:13:01 +08:00
Sheen Capadngan
d604ef2480 feat: integrated secret approval request 2024-09-02 15:38:05 +08:00
Sheen Capadngan
fe096772e0 feat: initial installation flow 2024-08-31 02:56:02 +08:00
Daniel Hougaard
35a63b8cc6 Fix: Fixed merge related changes 2024-08-29 22:54:49 +04:00
Daniel Hougaard
2a4596d415 Merge branch 'main' into daniel/cli-run-watch-mode 2024-08-29 22:37:35 +04:00
Daniel Hougaard
35e476d916 Fix: Runtime bugs 2024-08-29 22:35:21 +04:00
235 changed files with 8797 additions and 1488 deletions

View File

@@ -72,3 +72,6 @@ PLAIN_API_KEY=
PLAIN_WISH_LABEL_IDS=
SSL_CLIENT_CERTIFICATE_HEADER_KEY=
WORKFLOW_SLACK_CLIENT_ID=
WORKFLOW_SLACK_CLIENT_SECRET=

View File

@@ -34,6 +34,8 @@
"@peculiar/x509": "^1.12.1",
"@serdnam/pino-cloudwatch-transport": "^1.0.4",
"@sindresorhus/slugify": "1.1.0",
"@slack/oauth": "^3.0.1",
"@slack/web-api": "^7.3.4",
"@team-plain/typescript-sdk": "^4.6.1",
"@ucast/mongo2js": "^1.3.4",
"ajv": "^8.12.0",
@@ -5981,6 +5983,78 @@
"node": ">=8"
}
},
"node_modules/@slack/logger": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/@slack/logger/-/logger-4.0.0.tgz",
"integrity": "sha512-Wz7QYfPAlG/DR+DfABddUZeNgoeY7d1J39OCR2jR+v7VBsB8ezulDK5szTnDDPDwLH5IWhLvXIHlCFZV7MSKgA==",
"dependencies": {
"@types/node": ">=18.0.0"
},
"engines": {
"node": ">= 18",
"npm": ">= 8.6.0"
}
},
"node_modules/@slack/oauth": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/@slack/oauth/-/oauth-3.0.1.tgz",
"integrity": "sha512-TuR9PI6bYKX6qHC7FQI4keMnhj45TNfSNQtTU3mtnHUX4XLM2dYLvRkUNADyiLTle2qu2rsOQtCIsZJw6H0sDA==",
"dependencies": {
"@slack/logger": "^4",
"@slack/web-api": "^7.3.4",
"@types/jsonwebtoken": "^9",
"@types/node": ">=18",
"jsonwebtoken": "^9",
"lodash.isstring": "^4"
},
"engines": {
"node": ">=18",
"npm": ">=8.6.0"
}
},
"node_modules/@slack/types": {
"version": "2.12.0",
"resolved": "https://registry.npmjs.org/@slack/types/-/types-2.12.0.tgz",
"integrity": "sha512-yFewzUomYZ2BYaGJidPuIgjoYj5wqPDmi7DLSaGIkf+rCi4YZ2Z3DaiYIbz7qb/PL2NmamWjCvB7e9ArI5HkKg==",
"engines": {
"node": ">= 12.13.0",
"npm": ">= 6.12.0"
}
},
"node_modules/@slack/web-api": {
"version": "7.3.4",
"resolved": "https://registry.npmjs.org/@slack/web-api/-/web-api-7.3.4.tgz",
"integrity": "sha512-KwLK8dlz2lhr3NO7kbYQ7zgPTXPKrhq1JfQc0etJ0K8LSJhYYnf8GbVznvgDT/Uz1/pBXfFQnoXjrQIOKAdSuw==",
"dependencies": {
"@slack/logger": "^4.0.0",
"@slack/types": "^2.9.0",
"@types/node": ">=18.0.0",
"@types/retry": "0.12.0",
"axios": "^1.7.4",
"eventemitter3": "^5.0.1",
"form-data": "^4.0.0",
"is-electron": "2.2.2",
"is-stream": "^2",
"p-queue": "^6",
"p-retry": "^4",
"retry": "^0.13.1"
},
"engines": {
"node": ">= 18",
"npm": ">= 8.6.0"
}
},
"node_modules/@slack/web-api/node_modules/is-stream": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-2.0.1.tgz",
"integrity": "sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==",
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/@smithy/abort-controller": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/@smithy/abort-controller/-/abort-controller-3.1.1.tgz",
@@ -7186,6 +7260,11 @@
"integrity": "sha512-A4STmOXPhMUtHH+S6ymgE2GiBSMqf4oTvcQZMcHzokuTLVYzXTB8ttjcgxOVaAp2lGwEdzZ0J+cRbbeevQj1UQ==",
"dev": true
},
"node_modules/@types/retry": {
"version": "0.12.0",
"resolved": "https://registry.npmjs.org/@types/retry/-/retry-0.12.0.tgz",
"integrity": "sha512-wWKOClTTiizcZhXnPY4wikVAwmdYHp8q6DmC+EJUzAMsycb7HB32Kh9RN4+0gExjmPmZSAQjgURXIGATPegAvA=="
},
"node_modules/@types/safe-regex": {
"version": "1.1.6",
"resolved": "https://registry.npmjs.org/@types/safe-regex/-/safe-regex-1.1.6.tgz",
@@ -10385,6 +10464,11 @@
"node": ">=6"
}
},
"node_modules/eventemitter3": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-5.0.1.tgz",
"integrity": "sha512-GWkBvjiSZK87ELrYOSESUYeVIc9mvLLf/nXalMOS5dYrgZq9o5OVkbZAVM06CVxYsCwH9BDZFPlQTlPA1j4ahA=="
},
"node_modules/events": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/events/-/events-3.3.0.tgz",
@@ -12178,6 +12262,11 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-electron": {
"version": "2.2.2",
"resolved": "https://registry.npmjs.org/is-electron/-/is-electron-2.2.2.tgz",
"integrity": "sha512-FO/Rhvz5tuw4MCWkpMzHFKWD2LsfHzIb7i6MdPYZ/KW7AlxawyLkqdy+jPZP1WubqEADE3O4FUENlJHDfQASRg=="
},
"node_modules/is-extglob": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
@@ -14131,6 +14220,14 @@
"node": ">=14.6"
}
},
"node_modules/p-finally": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/p-finally/-/p-finally-1.0.0.tgz",
"integrity": "sha512-LICb2p9CB7FS+0eR1oqWnHhp0FljGLZCWBE9aix0Uye9W8LTQPwMTYVGWQWIw9RdQiDg4+epXQODwIYJtSJaow==",
"engines": {
"node": ">=4"
}
},
"node_modules/p-is-promise": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/p-is-promise/-/p-is-promise-3.0.0.tgz",
@@ -14169,6 +14266,38 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/p-queue": {
"version": "6.6.2",
"resolved": "https://registry.npmjs.org/p-queue/-/p-queue-6.6.2.tgz",
"integrity": "sha512-RwFpb72c/BhQLEXIZ5K2e+AhgNVmIejGlTgiB9MzZ0e93GRvqZ7uSi0dvRF7/XIXDeNkra2fNHBxTyPDGySpjQ==",
"dependencies": {
"eventemitter3": "^4.0.4",
"p-timeout": "^3.2.0"
},
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/p-queue/node_modules/eventemitter3": {
"version": "4.0.7",
"resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-4.0.7.tgz",
"integrity": "sha512-8guHBZCwKnFhYdHr2ysuRWErTwhoN2X8XELRlrRwpmfeY2jjuUN4taQMsULKUVo1K4DvZl+0pgfyoysHxvmvEw=="
},
"node_modules/p-retry": {
"version": "4.6.2",
"resolved": "https://registry.npmjs.org/p-retry/-/p-retry-4.6.2.tgz",
"integrity": "sha512-312Id396EbJdvRONlngUx0NydfrIQ5lsYu0znKVUzVvArzEIt08V1qhtyESbGVd1FGX7UKtiFp5uwKZdM8wIuQ==",
"dependencies": {
"@types/retry": "0.12.0",
"retry": "^0.13.1"
},
"engines": {
"node": ">=8"
}
},
"node_modules/p-throttle": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/p-throttle/-/p-throttle-5.1.0.tgz",
@@ -14180,6 +14309,17 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/p-timeout": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/p-timeout/-/p-timeout-3.2.0.tgz",
"integrity": "sha512-rhIwUycgwwKcP9yTOOFK/AKsAopjjCakVqLHePO3CC6Mir1Z99xT+R63jZxAT5lFZLa2inS5h+ZS2GvR99/FBg==",
"dependencies": {
"p-finally": "^1.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/p-try": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz",
@@ -15530,6 +15670,14 @@
"node": ">=4"
}
},
"node_modules/retry": {
"version": "0.13.1",
"resolved": "https://registry.npmjs.org/retry/-/retry-0.13.1.tgz",
"integrity": "sha512-XQBQ3I8W1Cge0Seh+6gjj03LbmRFWuoszgK9ooCpwYIrhhoO80pfq4cUkU5DkknwfOfFteRwlZ56PYOGYyFWdg==",
"engines": {
"node": ">= 4"
}
},
"node_modules/reusify": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/reusify/-/reusify-1.0.4.tgz",

View File

@@ -131,6 +131,8 @@
"@peculiar/x509": "^1.12.1",
"@serdnam/pino-cloudwatch-transport": "^1.0.4",
"@sindresorhus/slugify": "1.1.0",
"@slack/oauth": "^3.0.1",
"@slack/web-api": "^7.3.4",
"@team-plain/typescript-sdk": "^4.6.1",
"@ucast/mongo2js": "^1.3.4",
"ajv": "^8.12.0",

View File

@@ -70,12 +70,14 @@ import { TSecretReplicationServiceFactory } from "@app/services/secret-replicati
import { TSecretSharingServiceFactory } from "@app/services/secret-sharing/secret-sharing-service";
import { TSecretTagServiceFactory } from "@app/services/secret-tag/secret-tag-service";
import { TServiceTokenServiceFactory } from "@app/services/service-token/service-token-service";
import { TSlackServiceFactory } from "@app/services/slack/slack-service";
import { TSuperAdminServiceFactory } from "@app/services/super-admin/super-admin-service";
import { TTelemetryServiceFactory } from "@app/services/telemetry/telemetry-service";
import { TUserDALFactory } from "@app/services/user/user-dal";
import { TUserServiceFactory } from "@app/services/user/user-service";
import { TUserEngagementServiceFactory } from "@app/services/user-engagement/user-engagement-service";
import { TWebhookServiceFactory } from "@app/services/webhook/webhook-service";
import { TWorkflowIntegrationServiceFactory } from "@app/services/workflow-integration/workflow-integration-service";
declare module "fastify" {
interface FastifyRequest {
@@ -177,6 +179,8 @@ declare module "fastify" {
userEngagement: TUserEngagementServiceFactory;
externalKms: TExternalKmsServiceFactory;
orgAdmin: TOrgAdminServiceFactory;
slack: TSlackServiceFactory;
workflowIntegration: TWorkflowIntegrationServiceFactory;
};
// this is exclusive use for middlewares in which we need to inject data
// everywhere else access using service layer

View File

@@ -193,6 +193,9 @@ import {
TProjectRolesUpdate,
TProjects,
TProjectsInsert,
TProjectSlackConfigs,
TProjectSlackConfigsInsert,
TProjectSlackConfigsUpdate,
TProjectsUpdate,
TProjectUserAdditionalPrivilege,
TProjectUserAdditionalPrivilegeInsert,
@@ -299,6 +302,9 @@ import {
TServiceTokens,
TServiceTokensInsert,
TServiceTokensUpdate,
TSlackIntegrations,
TSlackIntegrationsInsert,
TSlackIntegrationsUpdate,
TSuperAdmin,
TSuperAdminInsert,
TSuperAdminUpdate,
@@ -322,7 +328,10 @@ import {
TUsersUpdate,
TWebhooks,
TWebhooksInsert,
TWebhooksUpdate
TWebhooksUpdate,
TWorkflowIntegrations,
TWorkflowIntegrationsInsert,
TWorkflowIntegrationsUpdate
} from "@app/db/schemas";
import {
TSecretV2TagJunction,
@@ -776,5 +785,20 @@ declare module "knex/types/tables" {
TKmsKeyVersionsInsert,
TKmsKeyVersionsUpdate
>;
[TableName.SlackIntegrations]: KnexOriginal.CompositeTableType<
TSlackIntegrations,
TSlackIntegrationsInsert,
TSlackIntegrationsUpdate
>;
[TableName.ProjectSlackConfigs]: KnexOriginal.CompositeTableType<
TProjectSlackConfigs,
TProjectSlackConfigsInsert,
TProjectSlackConfigsUpdate
>;
[TableName.WorkflowIntegrations]: KnexOriginal.CompositeTableType<
TWorkflowIntegrations,
TWorkflowIntegrationsInsert,
TWorkflowIntegrationsUpdate
>;
}
}
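Note: with the CompositeTableType registrations above, queries against the new tables come back fully typed. A minimal usage sketch (illustrative only, not code from this changeset):

import { Knex } from "knex";
import { TableName } from "@app/db/schemas";

// Illustrative lookup: the result is typed as TProjectSlackConfigs | undefined
// thanks to the table registration above.
const getProjectSlackConfig = async (db: Knex, projectId: string) =>
  db(TableName.ProjectSlackConfigs).where({ projectId }).first();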

View File

@@ -0,0 +1,96 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";
export async function up(knex: Knex): Promise<void> {
if (!(await knex.schema.hasTable(TableName.WorkflowIntegrations))) {
await knex.schema.createTable(TableName.WorkflowIntegrations, (tb) => {
tb.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
tb.string("integration").notNullable();
tb.string("slug").notNullable();
tb.uuid("orgId").notNullable();
tb.foreign("orgId").references("id").inTable(TableName.Organization).onDelete("CASCADE");
tb.string("description");
tb.unique(["orgId", "slug"]);
tb.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.WorkflowIntegrations);
}
if (!(await knex.schema.hasTable(TableName.SlackIntegrations))) {
await knex.schema.createTable(TableName.SlackIntegrations, (tb) => {
tb.uuid("id", { primaryKey: true }).notNullable();
tb.foreign("id").references("id").inTable(TableName.WorkflowIntegrations).onDelete("CASCADE");
tb.string("teamId").notNullable();
tb.string("teamName").notNullable();
tb.string("slackUserId").notNullable();
tb.string("slackAppId").notNullable();
tb.binary("encryptedBotAccessToken").notNullable();
tb.string("slackBotId").notNullable();
tb.string("slackBotUserId").notNullable();
tb.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.SlackIntegrations);
}
if (!(await knex.schema.hasTable(TableName.ProjectSlackConfigs))) {
await knex.schema.createTable(TableName.ProjectSlackConfigs, (tb) => {
tb.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
tb.string("projectId").notNullable().unique();
tb.foreign("projectId").references("id").inTable(TableName.Project).onDelete("CASCADE");
tb.uuid("slackIntegrationId").notNullable();
tb.foreign("slackIntegrationId").references("id").inTable(TableName.SlackIntegrations).onDelete("CASCADE");
tb.boolean("isAccessRequestNotificationEnabled").notNullable().defaultTo(false);
tb.string("accessRequestChannels").notNullable().defaultTo("");
tb.boolean("isSecretRequestNotificationEnabled").notNullable().defaultTo(false);
tb.string("secretRequestChannels").notNullable().defaultTo("");
tb.timestamps(true, true, true);
});
await createOnUpdateTrigger(knex, TableName.ProjectSlackConfigs);
}
const doesSuperAdminHaveSlackClientId = await knex.schema.hasColumn(TableName.SuperAdmin, "encryptedSlackClientId");
const doesSuperAdminHaveSlackClientSecret = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedSlackClientSecret"
);
await knex.schema.alterTable(TableName.SuperAdmin, (tb) => {
if (!doesSuperAdminHaveSlackClientId) {
tb.binary("encryptedSlackClientId");
}
if (!doesSuperAdminHaveSlackClientSecret) {
tb.binary("encryptedSlackClientSecret");
}
});
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.dropTableIfExists(TableName.ProjectSlackConfigs);
await dropOnUpdateTrigger(knex, TableName.ProjectSlackConfigs);
await knex.schema.dropTableIfExists(TableName.SlackIntegrations);
await dropOnUpdateTrigger(knex, TableName.SlackIntegrations);
await knex.schema.dropTableIfExists(TableName.WorkflowIntegrations);
await dropOnUpdateTrigger(knex, TableName.WorkflowIntegrations);
const doesSuperAdminHaveSlackClientId = await knex.schema.hasColumn(TableName.SuperAdmin, "encryptedSlackClientId");
const doesSuperAdminHaveSlackClientSecret = await knex.schema.hasColumn(
TableName.SuperAdmin,
"encryptedSlackClientSecret"
);
await knex.schema.alterTable(TableName.SuperAdmin, (tb) => {
if (doesSuperAdminHaveSlackClientId) {
tb.dropColumn("encryptedSlackClientId");
}
if (doesSuperAdminHaveSlackClientSecret) {
tb.dropColumn("encryptedSlackClientSecret");
}
});
}
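Note: the migration above gives slack_integrations the same id as its parent workflow_integrations row (shared primary key), while project_slack_configs references slack_integrations per project. A plausible shape for the project-to-integration lookup that getIntegrationDetailsByProject (referenced in later hunks) would need — a hypothetical sketch, not the repository's implementation:

import { Knex } from "knex";
import { TableName } from "@app/db/schemas";

// Hypothetical join from a project's Slack config to its Slack integration row.
const findSlackDetailsForProject = (db: Knex, projectId: string) =>
  db(TableName.ProjectSlackConfigs)
    .join(
      TableName.SlackIntegrations,
      `${TableName.ProjectSlackConfigs}.slackIntegrationId`,
      `${TableName.SlackIntegrations}.id`
    )
    .where(`${TableName.ProjectSlackConfigs}.projectId`, projectId)
    .first();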

View File

@@ -0,0 +1,25 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
export async function up(knex: Knex): Promise<void> {
if (await knex.schema.hasTable(TableName.CertificateAuthority)) {
const hasRequireTemplateForIssuanceColumn = await knex.schema.hasColumn(
TableName.CertificateAuthority,
"requireTemplateForIssuance"
);
if (!hasRequireTemplateForIssuanceColumn) {
await knex.schema.alterTable(TableName.CertificateAuthority, (t) => {
t.boolean("requireTemplateForIssuance").notNullable().defaultTo(false);
});
}
}
}
export async function down(knex: Knex): Promise<void> {
if (await knex.schema.hasTable(TableName.CertificateAuthority)) {
await knex.schema.alterTable(TableName.CertificateAuthority, (t) => {
t.dropColumn("requireTemplateForIssuance");
});
}
}

View File

@@ -28,7 +28,8 @@ export const CertificateAuthoritiesSchema = z.object({
keyAlgorithm: z.string(),
notBefore: z.date().nullable().optional(),
notAfter: z.date().nullable().optional(),
activeCaCertId: z.string().uuid().nullable().optional()
activeCaCertId: z.string().uuid().nullable().optional(),
requireTemplateForIssuance: z.boolean().default(false)
});
export type TCertificateAuthorities = z.infer<typeof CertificateAuthoritiesSchema>;

View File

@@ -62,6 +62,7 @@ export * from "./project-environments";
export * from "./project-keys";
export * from "./project-memberships";
export * from "./project-roles";
export * from "./project-slack-configs";
export * from "./project-user-additional-privilege";
export * from "./project-user-membership-roles";
export * from "./projects";
@@ -101,6 +102,7 @@ export * from "./secret-versions-v2";
export * from "./secrets";
export * from "./secrets-v2";
export * from "./service-tokens";
export * from "./slack-integrations";
export * from "./super-admin";
export * from "./trusted-ips";
export * from "./user-actions";
@@ -109,3 +111,4 @@ export * from "./user-encryption-keys";
export * from "./user-group-membership";
export * from "./users";
export * from "./webhooks";
export * from "./workflow-integrations";

View File

@@ -114,7 +114,10 @@ export enum TableName {
InternalKms = "internal_kms",
InternalKmsKeyVersion = "internal_kms_key_version",
// @depreciated
KmsKeyVersion = "kms_key_versions"
KmsKeyVersion = "kms_key_versions",
WorkflowIntegrations = "workflow_integrations",
SlackIntegrations = "slack_integrations",
ProjectSlackConfigs = "project_slack_configs"
}
export type TImmutableDBKeys = "id" | "createdAt" | "updatedAt";

View File

@@ -0,0 +1,24 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const ProjectSlackConfigsSchema = z.object({
id: z.string().uuid(),
projectId: z.string(),
slackIntegrationId: z.string().uuid(),
isAccessRequestNotificationEnabled: z.boolean().default(false),
accessRequestChannels: z.string().default(""),
isSecretRequestNotificationEnabled: z.boolean().default(false),
secretRequestChannels: z.string().default(""),
createdAt: z.date(),
updatedAt: z.date()
});
export type TProjectSlackConfigs = z.infer<typeof ProjectSlackConfigsSchema>;
export type TProjectSlackConfigsInsert = Omit<z.input<typeof ProjectSlackConfigsSchema>, TImmutableDBKeys>;
export type TProjectSlackConfigsUpdate = Partial<Omit<z.input<typeof ProjectSlackConfigsSchema>, TImmutableDBKeys>>;

View File

@@ -21,8 +21,8 @@ export const SecretSharingSchema = z.object({
expiresAfterViews: z.number().nullable().optional(),
accessType: z.string().default("anyone"),
name: z.string().nullable().optional(),
password: z.string().nullable().optional(),
lastViewedAt: z.date().nullable().optional()
lastViewedAt: z.date().nullable().optional(),
password: z.string().nullable().optional()
});
export type TSecretSharing = z.infer<typeof SecretSharingSchema>;

View File

@@ -0,0 +1,27 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const SlackIntegrationsSchema = z.object({
id: z.string().uuid(),
teamId: z.string(),
teamName: z.string(),
slackUserId: z.string(),
slackAppId: z.string(),
encryptedBotAccessToken: zodBuffer,
slackBotId: z.string(),
slackBotUserId: z.string(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TSlackIntegrations = z.infer<typeof SlackIntegrationsSchema>;
export type TSlackIntegrationsInsert = Omit<z.input<typeof SlackIntegrationsSchema>, TImmutableDBKeys>;
export type TSlackIntegrationsUpdate = Partial<Omit<z.input<typeof SlackIntegrationsSchema>, TImmutableDBKeys>>;

View File

@@ -5,6 +5,8 @@
import { z } from "zod";
import { zodBuffer } from "@app/lib/zod";
import { TImmutableDBKeys } from "./models";
export const SuperAdminSchema = z.object({
@@ -19,7 +21,9 @@ export const SuperAdminSchema = z.object({
trustLdapEmails: z.boolean().default(false).nullable().optional(),
trustOidcEmails: z.boolean().default(false).nullable().optional(),
defaultAuthOrgId: z.string().uuid().nullable().optional(),
enabledLoginMethods: z.string().array().nullable().optional()
enabledLoginMethods: z.string().array().nullable().optional(),
encryptedSlackClientId: zodBuffer.nullable().optional(),
encryptedSlackClientSecret: zodBuffer.nullable().optional()
});
export type TSuperAdmin = z.infer<typeof SuperAdminSchema>;

View File

@@ -0,0 +1,22 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.
import { z } from "zod";
import { TImmutableDBKeys } from "./models";
export const WorkflowIntegrationsSchema = z.object({
id: z.string().uuid(),
integration: z.string(),
slug: z.string(),
orgId: z.string().uuid(),
description: z.string().nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});
export type TWorkflowIntegrations = z.infer<typeof WorkflowIntegrationsSchema>;
export type TWorkflowIntegrationsInsert = Omit<z.input<typeof WorkflowIntegrationsSchema>, TImmutableDBKeys>;
export type TWorkflowIntegrationsUpdate = Partial<Omit<z.input<typeof WorkflowIntegrationsSchema>, TImmutableDBKeys>>;

View File

@@ -122,6 +122,10 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
})
.merge(
z.object({
project: z.object({
name: z.string(),
slug: z.string()
}),
event: z.object({
type: z.string(),
metadata: z.any()
@@ -138,16 +142,20 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const auditLogs = await server.services.auditLog.listProjectAuditLogs({
const auditLogs = await server.services.auditLog.listAuditLogs({
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
projectId: req.params.workspaceId,
...req.query,
endDate: req.query.endDate,
startDate: req.query.startDate || getLastMidnightDateISO(),
auditLogActor: req.query.actor,
actor: req.permission.type
actor: req.permission.type,
filter: {
...req.query,
projectId: req.params.workspaceId,
endDate: req.query.endDate,
startDate: req.query.startDate || getLastMidnightDateISO(),
auditLogActorId: req.query.actor,
eventType: req.query.eventType ? [req.query.eventType] : undefined
}
});
return { auditLogs };
}

View File

@@ -103,6 +103,13 @@ export const registerSamlRouter = async (server: FastifyZodProvider) => {
const email = profile?.email ?? (profile?.emailAddress as string); // emailRippling is added because in Rippling the field `email` reserved
if (!email || !profile.firstName) {
logger.info(
{
err: new Error("Invalid saml request. Missing email or first name"),
profile
},
`email: ${email} firstName: ${profile.firstName as string}`
);
throw new BadRequestError({ message: "Invalid request. Missing email or first name" });
}

View File

@@ -5,9 +5,13 @@ import { ProjectMembershipRole } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError, UnauthorizedError } from "@app/lib/errors";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectEnvDALFactory } from "@app/services/project-env/project-env-dal";
import { TProjectMembershipDALFactory } from "@app/services/project-membership/project-membership-dal";
import { TProjectSlackConfigDALFactory } from "@app/services/slack/project-slack-config-dal";
import { triggerSlackNotification } from "@app/services/slack/slack-fns";
import { SlackTriggerFeature } from "@app/services/slack/slack-types";
import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TUserDALFactory } from "@app/services/user/user-dal";
@@ -33,7 +37,10 @@ type TSecretApprovalRequestServiceFactoryDep = {
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission">;
accessApprovalPolicyApproverDAL: Pick<TAccessApprovalPolicyApproverDALFactory, "find">;
projectEnvDAL: Pick<TProjectEnvDALFactory, "findOne">;
projectDAL: Pick<TProjectDALFactory, "checkProjectUpgradeStatus" | "findProjectBySlug">;
projectDAL: Pick<
TProjectDALFactory,
"checkProjectUpgradeStatus" | "findProjectBySlug" | "findProjectWithOrg" | "findById"
>;
accessApprovalRequestDAL: Pick<
TAccessApprovalRequestDALFactory,
| "create"
@@ -56,6 +63,8 @@ type TSecretApprovalRequestServiceFactoryDep = {
TUserDALFactory,
"findUserByProjectMembershipId" | "findUsersByProjectMembershipIds" | "find" | "findById"
>;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "getIntegrationDetailsByProject">;
};
export type TAccessApprovalRequestServiceFactory = ReturnType<typeof accessApprovalRequestServiceFactory>;
@@ -71,7 +80,9 @@ export const accessApprovalRequestServiceFactory = ({
accessApprovalPolicyApproverDAL,
additionalPrivilegeDAL,
smtpService,
userDAL
userDAL,
kmsService,
projectSlackConfigDAL
}: TSecretApprovalRequestServiceFactoryDep) => {
const createAccessApprovalRequest = async ({
isTemporary,
@@ -166,13 +177,36 @@ export const accessApprovalRequestServiceFactory = ({
tx
);
const requesterFullName = `${requestedByUser.firstName} ${requestedByUser.lastName}`;
const approvalUrl = `${cfg.SITE_URL}/project/${project.id}/approval`;
await triggerSlackNotification({
projectId: project.id,
projectSlackConfigDAL,
projectDAL,
kmsService,
notification: {
type: SlackTriggerFeature.ACCESS_REQUEST,
payload: {
projectName: project.name,
requesterFullName,
isTemporary,
requesterEmail: requestedByUser.email as string,
secretPath,
environment: envSlug,
permissions: accessTypes,
approvalUrl
}
}
});
await smtpService.sendMail({
recipients: approverUsers.filter((approver) => approver.email).map((approver) => approver.email!),
subjectLine: "Access Approval Request",
substitutions: {
projectName: project.name,
requesterFullName: `${requestedByUser.firstName} ${requestedByUser.lastName}`,
requesterFullName,
requesterEmail: requestedByUser.email,
isTemporary,
...(isTemporary && {
@@ -181,7 +215,7 @@ export const accessApprovalRequestServiceFactory = ({
secretPath,
environment: envSlug,
permissions: accessTypes,
approvalUrl: `${cfg.SITE_URL}/project/${project.id}/approval`
approvalUrl
},
template: SmtpTemplates.AccessApprovalRequest
});
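Note: triggerSlackNotification is introduced by this PR and its body is not shown in this hunk. Given the @slack/web-api dependency added in package.json, it presumably resolves the project's Slack config, decrypts the bot token, and posts to the configured channels. A hedged sketch of that last step only (not the repository's slack-fns implementation):

import { WebClient } from "@slack/web-api";

// Hedged sketch: post a message to each configured channel with the bot token.
// Channel IDs and text are placeholders.
const postSlackNotification = async (botToken: string, channelIds: string[], text: string) => {
  const client = new WebClient(botToken);
  await Promise.all(channelIds.map((channel) => client.chat.postMessage({ channel, text })));
};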

View File

@@ -1,11 +1,14 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { AuditLogsSchema, TableName } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, stripUndefinedInWhere } from "@app/lib/knex";
import { ormify, selectAllTableCols, stripUndefinedInWhere } from "@app/lib/knex";
import { logger } from "@app/lib/logger";
import { QueueName } from "@app/queue";
import { ActorType } from "@app/services/auth/auth-type";
import { EventType } from "./audit-log-types";
export type TAuditLogDALFactory = ReturnType<typeof auditLogDALFactory>;
@@ -25,7 +28,24 @@ export const auditLogDALFactory = (db: TDbClient) => {
const auditLogOrm = ormify(db, TableName.AuditLog);
const find = async (
{ orgId, projectId, userAgentType, startDate, endDate, limit = 20, offset = 0, actor, eventType }: TFindQuery,
{
orgId,
projectId,
userAgentType,
startDate,
endDate,
limit = 20,
offset = 0,
actorId,
actorType,
eventType,
eventMetadata
}: Omit<TFindQuery, "actor" | "eventType"> & {
actorId?: string;
actorType?: ActorType;
eventType?: EventType[];
eventMetadata?: Record<string, string>;
},
tx?: Knex
) => {
try {
@@ -33,23 +53,57 @@ export const auditLogDALFactory = (db: TDbClient) => {
.where(
stripUndefinedInWhere({
projectId,
orgId,
eventType,
actor,
[`${TableName.AuditLog}.orgId`]: orgId,
userAgentType
})
)
.leftJoin(TableName.Project, `${TableName.AuditLog}.projectId`, `${TableName.Project}.id`)
.select(selectAllTableCols(TableName.AuditLog))
.select(
db.ref("name").withSchema(TableName.Project).as("projectName"),
db.ref("slug").withSchema(TableName.Project).as("projectSlug")
)
.limit(limit)
.offset(offset)
.orderBy("createdAt", "desc");
.orderBy(`${TableName.AuditLog}.createdAt`, "desc");
if (actorId) {
void sqlQuery.whereRaw(`"actorMetadata"->>'userId' = ?`, [actorId]);
}
if (eventMetadata && Object.keys(eventMetadata).length) {
Object.entries(eventMetadata).forEach(([key, value]) => {
void sqlQuery.whereRaw(`"eventMetadata"->>'${key}' = ?`, [value]);
});
}
if (actorType) {
void sqlQuery.where("actor", actorType);
}
if (eventType?.length) {
void sqlQuery.whereIn("eventType", eventType);
}
if (startDate) {
void sqlQuery.where("createdAt", ">=", startDate);
void sqlQuery.where(`${TableName.AuditLog}.createdAt`, ">=", startDate);
}
if (endDate) {
void sqlQuery.where("createdAt", "<=", endDate);
void sqlQuery.where(`${TableName.AuditLog}.createdAt`, "<=", endDate);
}
const docs = await sqlQuery;
return docs;
return docs.map((doc) => ({
...AuditLogsSchema.parse(doc),
project: {
name: doc.projectName,
slug: doc.projectSlug
}
}));
} catch (error) {
throw new DatabaseError({ error });
}
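Note: the reworked find now accepts actorId, actorType, an eventType array, and arbitrary eventMetadata key/value pairs, the latter applied as Postgres JSONB text lookups ("eventMetadata"->>'key' = value). An illustrative call with assumed variable names (not code from this changeset):

import { EventType } from "./audit-log-types";
import { TAuditLogDALFactory } from "./audit-log-dal";

// Fetch the 20 most recent integration-sync events for one integration in an org.
const findRecentSyncLogs = (auditLogDAL: TAuditLogDALFactory, orgId: string, integrationId: string) =>
  auditLogDAL.find({
    orgId,
    eventType: [EventType.INTEGRATION_SYNCED],
    eventMetadata: { integrationId },
    limit: 20,
    offset: 0
  });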

View File

@@ -3,6 +3,7 @@ import { ForbiddenError } from "@casl/ability";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors";
import { OrgPermissionActions, OrgPermissionSubjects } from "../permission/org-permission";
import { TPermissionServiceFactory } from "../permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "../permission/project-permission";
import { TAuditLogDALFactory } from "./audit-log-dal";
@@ -11,7 +12,7 @@ import { EventType, TCreateAuditLogDTO, TListProjectAuditLogDTO } from "./audit-
type TAuditLogServiceFactoryDep = {
auditLogDAL: TAuditLogDALFactory;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission" | "getOrgPermission">;
auditLogQueue: TAuditLogQueueServiceFactory;
};
@@ -22,38 +23,47 @@ export const auditLogServiceFactory = ({
auditLogQueue,
permissionService
}: TAuditLogServiceFactoryDep) => {
const listProjectAuditLogs = async ({
userAgentType,
eventType,
offset,
limit,
endDate,
startDate,
actor,
actorId,
actorOrgId,
actorAuthMethod,
projectId,
auditLogActor
}: TListProjectAuditLogDTO) => {
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.AuditLogs);
const listAuditLogs = async ({ actorAuthMethod, actorId, actorOrgId, actor, filter }: TListProjectAuditLogDTO) => {
if (filter.projectId) {
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
filter.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.AuditLogs);
} else {
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
/**
* NOTE (dangtony98): Update this to organization-level audit log permission check once audit logs are moved
* to the organization level
*/
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Member);
}
// If project ID is not provided, then we need to return all the audit logs for the organization itself.
const auditLogs = await auditLogDAL.find({
startDate,
endDate,
limit,
offset,
eventType,
userAgentType,
actor: auditLogActor,
projectId
startDate: filter.startDate,
endDate: filter.endDate,
limit: filter.limit,
offset: filter.offset,
eventType: filter.eventType,
userAgentType: filter.userAgentType,
actorId: filter.auditLogActorId,
actorType: filter.actorType,
eventMetadata: filter.eventMetadata,
...(filter.projectId ? { projectId: filter.projectId } : { orgId: actorOrgId })
});
return auditLogs.map(({ eventType: logEventType, actor: eActor, actorMetadata, eventMetadata, ...el }) => ({
...el,
event: { type: logEventType, metadata: eventMetadata },
@@ -76,6 +86,6 @@ export const auditLogServiceFactory = ({
return {
createAuditLog,
listProjectAuditLogs
listAuditLogs
};
};

View File

@@ -5,19 +5,23 @@ import { TIdentityTrustedIp } from "@app/services/identity/identity-types";
import { PkiItemType } from "@app/services/pki-collection/pki-collection-types";
export type TListProjectAuditLogDTO = {
auditLogActor?: string;
projectId: string;
eventType?: string;
startDate?: string;
endDate?: string;
userAgentType?: string;
limit?: number;
offset?: number;
} & TProjectPermission;
filter: {
userAgentType?: UserAgentType;
eventType?: EventType[];
offset?: number;
limit: number;
endDate?: string;
startDate?: string;
projectId?: string;
auditLogActorId?: string;
actorType?: ActorType;
eventMetadata?: Record<string, string>;
};
} & Omit<TProjectPermission, "projectId">;
export type TCreateAuditLogDTO = {
event: Event;
actor: UserActor | IdentityActor | ServiceActor | ScimClientActor;
actor: UserActor | IdentityActor | ServiceActor | ScimClientActor | PlatformActor;
orgId?: string;
projectId?: string;
} & BaseAuthData;
@@ -140,6 +144,7 @@ export enum EventType {
GET_CA_CRLS = "get-certificate-authority-crls",
ISSUE_CERT = "issue-cert",
SIGN_CERT = "sign-cert",
GET_CA_CERTIFICATE_TEMPLATES = "get-ca-certificate-templates",
GET_CERT = "get-cert",
DELETE_CERT = "delete-cert",
REVOKE_CERT = "revoke-cert",
@@ -169,7 +174,15 @@ export enum EventType {
GET_CERTIFICATE_TEMPLATE = "get-certificate-template",
CREATE_CERTIFICATE_TEMPLATE_EST_CONFIG = "create-certificate-template-est-config",
UPDATE_CERTIFICATE_TEMPLATE_EST_CONFIG = "update-certificate-template-est-config",
GET_CERTIFICATE_TEMPLATE_EST_CONFIG = "get-certificate-template-est-config"
GET_CERTIFICATE_TEMPLATE_EST_CONFIG = "get-certificate-template-est-config",
ATTEMPT_CREATE_SLACK_INTEGRATION = "attempt-create-slack-integration",
ATTEMPT_REINSTALL_SLACK_INTEGRATION = "attempt-reinstall-slack-integration",
GET_SLACK_INTEGRATION = "get-slack-integration",
UPDATE_SLACK_INTEGRATION = "update-slack-integration",
DELETE_SLACK_INTEGRATION = "delete-slack-integration",
GET_PROJECT_SLACK_CONFIG = "get-project-slack-config",
UPDATE_PROJECT_SLACK_CONFIG = "update-project-slack-config",
INTEGRATION_SYNCED = "integration-synced"
}
interface UserActorMetadata {
@@ -190,6 +203,8 @@ interface IdentityActorMetadata {
interface ScimClientActorMetadata {}
interface PlatformActorMetadata {}
export interface UserActor {
type: ActorType.USER;
metadata: UserActorMetadata;
@@ -200,6 +215,11 @@ export interface ServiceActor {
metadata: ServiceActorMetadata;
}
export interface PlatformActor {
type: ActorType.PLATFORM;
metadata: PlatformActorMetadata;
}
export interface IdentityActor {
type: ActorType.IDENTITY;
metadata: IdentityActorMetadata;
@@ -210,7 +230,7 @@ export interface ScimClientActor {
metadata: ScimClientActorMetadata;
}
export type Actor = UserActor | ServiceActor | IdentityActor | ScimClientActor;
export type Actor = UserActor | ServiceActor | IdentityActor | ScimClientActor | PlatformActor;
interface GetSecretsEvent {
type: EventType.GET_SECRETS;
@@ -1192,6 +1212,14 @@ interface SignCert {
};
}
interface GetCaCertificateTemplates {
type: EventType.GET_CA_CERTIFICATE_TEMPLATES;
metadata: {
caId: string;
dn: string;
};
}
interface GetCert {
type: EventType.GET_CERT;
metadata: {
@@ -1446,6 +1474,73 @@ interface GetCertificateTemplateEstConfig {
};
}
interface AttemptCreateSlackIntegration {
type: EventType.ATTEMPT_CREATE_SLACK_INTEGRATION;
metadata: {
slug: string;
description?: string;
};
}
interface AttemptReinstallSlackIntegration {
type: EventType.ATTEMPT_REINSTALL_SLACK_INTEGRATION;
metadata: {
id: string;
};
}
interface UpdateSlackIntegration {
type: EventType.UPDATE_SLACK_INTEGRATION;
metadata: {
id: string;
slug: string;
description?: string;
};
}
interface DeleteSlackIntegration {
type: EventType.DELETE_SLACK_INTEGRATION;
metadata: {
id: string;
};
}
interface GetSlackIntegration {
type: EventType.GET_SLACK_INTEGRATION;
metadata: {
id: string;
};
}
interface UpdateProjectSlackConfig {
type: EventType.UPDATE_PROJECT_SLACK_CONFIG;
metadata: {
id: string;
slackIntegrationId: string;
isAccessRequestNotificationEnabled: boolean;
accessRequestChannels: string;
isSecretRequestNotificationEnabled: boolean;
secretRequestChannels: string;
};
}
interface GetProjectSlackConfig {
type: EventType.GET_PROJECT_SLACK_CONFIG;
metadata: {
id: string;
};
}
interface IntegrationSyncedEvent {
type: EventType.INTEGRATION_SYNCED;
metadata: {
integrationId: string;
lastSyncJobId: string;
lastUsed: Date;
syncMessage: string;
isSynced: boolean;
};
}
export type Event =
| GetSecretsEvent
| GetSecretEvent
@@ -1547,6 +1642,7 @@ export type Event =
| GetCaCrls
| IssueCert
| SignCert
| GetCaCertificateTemplates
| GetCert
| DeleteCert
| RevokeCert
@@ -1576,4 +1672,12 @@ export type Event =
| DeleteCertificateTemplate
| CreateCertificateTemplateEstConfig
| UpdateCertificateTemplateEstConfig
| GetCertificateTemplateEstConfig;
| GetCertificateTemplateEstConfig
| AttemptCreateSlackIntegration
| AttemptReinstallSlackIntegration
| UpdateSlackIntegration
| DeleteSlackIntegration
| GetSlackIntegration
| UpdateProjectSlackConfig
| GetProjectSlackConfig
| IntegrationSyncedEvent;
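Note: the new PlatformActor and INTEGRATION_SYNCED event let integration sync results be audit-logged without a user or identity actor. An illustrative payload (placeholder values; the BaseAuthData fields passed alongside it are omitted):

import { ActorType } from "@app/services/auth/auth-type";
import { Event, EventType, PlatformActor } from "./audit-log-types";

// Placeholder values for a platform-recorded integration sync entry.
const actor: PlatformActor = { type: ActorType.PLATFORM, metadata: {} };
const event: Event = {
  type: EventType.INTEGRATION_SYNCED,
  metadata: {
    integrationId: "<integration-id>",
    lastSyncJobId: "<queue-job-id>",
    lastUsed: new Date(),
    syncMessage: "",
    isSynced: true
  }
};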

View File

@@ -17,7 +17,7 @@ const generateUsername = () => {
return alphaNumericNanoId(32);
};
export const ElasticSearchDatabaseProvider = (): TDynamicProviderFns => {
export const ElasticSearchProvider = (): TDynamicProviderFns => {
const validateProviderInputs = async (inputs: unknown) => {
const appCfg = getConfig();
const isCloud = Boolean(appCfg.LICENSE_SERVER_KEY); // quick and dirty way to check if its cloud or not

View File

@@ -1,10 +1,11 @@
import { AwsElastiCacheDatabaseProvider } from "./aws-elasticache";
import { AwsIamProvider } from "./aws-iam";
import { CassandraProvider } from "./cassandra";
import { ElasticSearchDatabaseProvider } from "./elastic-search";
import { ElasticSearchProvider } from "./elastic-search";
import { DynamicSecretProviders } from "./models";
import { MongoAtlasProvider } from "./mongo-atlas";
import { MongoDBProvider } from "./mongo-db";
import { RabbitMqProvider } from "./rabbit-mq";
import { RedisDatabaseProvider } from "./redis";
import { SqlDatabaseProvider } from "./sql-database";
@@ -15,6 +16,7 @@ export const buildDynamicSecretProviders = () => ({
[DynamicSecretProviders.Redis]: RedisDatabaseProvider(),
[DynamicSecretProviders.AwsElastiCache]: AwsElastiCacheDatabaseProvider(),
[DynamicSecretProviders.MongoAtlas]: MongoAtlasProvider(),
[DynamicSecretProviders.ElasticSearch]: ElasticSearchDatabaseProvider(),
[DynamicSecretProviders.MongoDB]: MongoDBProvider()
[DynamicSecretProviders.MongoDB]: MongoDBProvider(),
[DynamicSecretProviders.ElasticSearch]: ElasticSearchProvider(),
[DynamicSecretProviders.RabbitMq]: RabbitMqProvider()
});

View File

@@ -56,6 +56,26 @@ export const DynamicSecretElasticSearchSchema = z.object({
ca: z.string().optional()
});
export const DynamicSecretRabbitMqSchema = z.object({
host: z.string().trim().min(1),
port: z.number(),
tags: z.array(z.string().trim()).default([]),
username: z.string().trim().min(1),
password: z.string().trim().min(1),
ca: z.string().optional(),
virtualHost: z.object({
name: z.string().trim().min(1),
permissions: z.object({
read: z.string().trim().min(1),
write: z.string().trim().min(1),
configure: z.string().trim().min(1)
})
})
});
export const DynamicSecretSqlDBSchema = z.object({
client: z.nativeEnum(SqlProviders),
host: z.string().trim().toLowerCase(),
@@ -154,7 +174,8 @@ export enum DynamicSecretProviders {
AwsElastiCache = "aws-elasticache",
MongoAtlas = "mongo-db-atlas",
ElasticSearch = "elastic-search",
MongoDB = "mongo-db"
MongoDB = "mongo-db",
RabbitMq = "rabbit-mq"
}
export const DynamicSecretProviderSchema = z.discriminatedUnion("type", [
@@ -165,7 +186,8 @@ export const DynamicSecretProviderSchema = z.discriminatedUnion("type", [
z.object({ type: z.literal(DynamicSecretProviders.AwsElastiCache), inputs: DynamicSecretAwsElastiCacheSchema }),
z.object({ type: z.literal(DynamicSecretProviders.MongoAtlas), inputs: DynamicSecretMongoAtlasSchema }),
z.object({ type: z.literal(DynamicSecretProviders.ElasticSearch), inputs: DynamicSecretElasticSearchSchema }),
z.object({ type: z.literal(DynamicSecretProviders.MongoDB), inputs: DynamicSecretMongoDBSchema })
z.object({ type: z.literal(DynamicSecretProviders.MongoDB), inputs: DynamicSecretMongoDBSchema }),
z.object({ type: z.literal(DynamicSecretProviders.RabbitMq), inputs: DynamicSecretRabbitMqSchema })
]);
export type TDynamicProviderFns = {

View File

@@ -0,0 +1,172 @@
import axios, { Axios } from "axios";
import https from "https";
import { customAlphabet } from "nanoid";
import { z } from "zod";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors";
import { removeTrailingSlash } from "@app/lib/fn";
import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { DynamicSecretRabbitMqSchema, TDynamicProviderFns } from "./models";
const generatePassword = () => {
const charset = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789-_.~!*$#";
return customAlphabet(charset, 64)();
};
const generateUsername = () => {
return alphaNumericNanoId(32);
};
type TCreateRabbitMQUser = {
axiosInstance: Axios;
createUser: {
username: string;
password: string;
tags: string[];
};
virtualHost: {
name: string;
permissions: {
read: string;
write: string;
configure: string;
};
};
};
type TDeleteRabbitMqUser = {
axiosInstance: Axios;
usernameToDelete: string;
};
async function createRabbitMqUser({ axiosInstance, createUser, virtualHost }: TCreateRabbitMQUser): Promise<void> {
try {
// Create user
const userUrl = `/users/${createUser.username}`;
const userData = {
password: createUser.password,
tags: createUser.tags.join(",")
};
await axiosInstance.put(userUrl, userData);
// Set permissions for the virtual host
if (virtualHost) {
const permissionData = {
configure: virtualHost.permissions.configure,
write: virtualHost.permissions.write,
read: virtualHost.permissions.read
};
await axiosInstance.put(
`/permissions/${encodeURIComponent(virtualHost.name)}/${createUser.username}`,
permissionData
);
}
} catch (error) {
logger.error(error, "Error creating RabbitMQ user");
throw error;
}
}
async function deleteRabbitMqUser({ axiosInstance, usernameToDelete }: TDeleteRabbitMqUser) {
await axiosInstance.delete(`users/${usernameToDelete}`);
return { username: usernameToDelete };
}
export const RabbitMqProvider = (): TDynamicProviderFns => {
const validateProviderInputs = async (inputs: unknown) => {
const appCfg = getConfig();
const isCloud = Boolean(appCfg.LICENSE_SERVER_KEY); // quick and dirty way to check if its cloud or not
const providerInputs = await DynamicSecretRabbitMqSchema.parseAsync(inputs);
if (
isCloud &&
// localhost
// internal ips
(providerInputs.host === "host.docker.internal" ||
providerInputs.host.match(/^10\.\d+\.\d+\.\d+/) ||
providerInputs.host.match(/^192\.168\.\d+\.\d+/))
) {
throw new BadRequestError({ message: "Invalid db host" });
}
if (providerInputs.host === "localhost" || providerInputs.host === "127.0.0.1") {
throw new BadRequestError({ message: "Invalid db host" });
}
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretRabbitMqSchema>) => {
const axiosInstance = axios.create({
baseURL: `${removeTrailingSlash(providerInputs.host)}:${providerInputs.port}/api`,
auth: {
username: providerInputs.username,
password: providerInputs.password
},
headers: {
"Content-Type": "application/json"
},
...(providerInputs.ca && {
httpsAgent: new https.Agent({ ca: providerInputs.ca, rejectUnauthorized: false })
})
});
return axiosInstance;
};
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const infoResponse = await connection.get("/whoami").then(() => true);
return infoResponse;
};
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
await createRabbitMqUser({
axiosInstance: connection,
virtualHost: providerInputs.virtualHost,
createUser: {
password,
username,
tags: [...(providerInputs.tags ?? []), "infisical-user"]
}
});
return { entityId: username, data: { DB_USERNAME: username, DB_PASSWORD: password } };
};
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
await deleteRabbitMqUser({ axiosInstance: connection, usernameToDelete: entityId });
return { entityId };
};
const renew = async (inputs: unknown, entityId: string) => {
// Do nothing
return { entityId };
};
return {
validateProviderInputs,
validateConnection,
create,
revoke,
renew
};
};
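For context, a minimal sketch of how the management API calls used by the provider above could be exercised against a local RabbitMQ instance. This assumes the management plugin on port 15672 with the default guest/guest admin account; the paths mirror the ones in createRabbitMqUser and deleteRabbitMqUser.
import axios from "axios";
const verifyRabbitMqLifecycle = async () => {
  // admin client against the RabbitMQ management API (assumed local defaults)
  const admin = axios.create({
    baseURL: "http://localhost:15672/api",
    auth: { username: "guest", password: "guest" },
    headers: { "Content-Type": "application/json" }
  });
  // create a user and grant it permissions on the default vhost, as the provider does
  await admin.put("/users/demo-user", { password: "demo-password", tags: "" });
  await admin.put(`/permissions/${encodeURIComponent("/")}/demo-user`, {
    configure: ".*",
    write: ".*",
    read: ".*"
  });
  // confirm the generated credentials work by calling /whoami as that user,
  // which is the same check validateConnection performs
  const asNewUser = axios.create({
    baseURL: "http://localhost:15672/api",
    auth: { username: "demo-user", password: "demo-password" }
  });
  const { data } = await asNewUser.get("/whoami");
  console.log("authenticated as", data.name);
  // clean up, mirroring deleteRabbitMqUser
  await admin.delete("/users/demo-user");
};
verifyRabbitMqLifecycle().catch(console.error);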

View File

@@ -41,10 +41,9 @@ export const userGroupMembershipDALFactory = (db: TDbClient) => {
};
// special query
const findUserGroupMembershipsInProject = async (usernames: string[], projectId: string) => {
const findUserGroupMembershipsInProject = async (usernames: string[], projectId: string, tx?: Knex) => {
try {
const usernameDocs: string[] = await db
.replicaNode()(TableName.UserGroupMembership)
const usernameDocs: string[] = await (tx || db.replicaNode())(TableName.UserGroupMembership)
.join(
TableName.GroupProjectMembership,
`${TableName.UserGroupMembership}.groupId`,
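The optional tx parameter added above follows the usual Knex convention of preferring a caller-supplied transaction and otherwise falling back to a read-replica connection. A generic sketch of that pattern, under the assumption that db is a plain Knex instance:
import { Knex } from "knex";
// prefer the caller's transaction, otherwise use the default connection
const findRows = async (db: Knex, table: string, tx?: Knex.Transaction) => {
  return (tx || db)(table).select("*").limit(10);
};
// usage: the second call runs inside an existing transaction
// await findRows(db, "user_group_memberships");
// await db.transaction(async (tx) => findRows(db, "user_group_memberships", tx));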

View File

@@ -47,6 +47,9 @@ import {
} from "@app/services/secret-v2-bridge/secret-v2-bridge-fns";
import { TSecretVersionV2DALFactory } from "@app/services/secret-v2-bridge/secret-version-dal";
import { TSecretVersionV2TagDALFactory } from "@app/services/secret-v2-bridge/secret-version-tag-dal";
import { TProjectSlackConfigDALFactory } from "@app/services/slack/project-slack-config-dal";
import { triggerSlackNotification } from "@app/services/slack/slack-fns";
import { SlackTriggerFeature } from "@app/services/slack/slack-types";
import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TUserDALFactory } from "@app/services/user/user-dal";
@@ -89,7 +92,7 @@ type TSecretApprovalRequestServiceFactoryDep = {
secretVersionDAL: Pick<TSecretVersionDALFactory, "findLatestVersionMany" | "insertMany">;
secretVersionTagDAL: Pick<TSecretVersionTagDALFactory, "insertMany">;
smtpService: Pick<TSmtpService, "sendMail">;
userDAL: Pick<TUserDALFactory, "find" | "findOne">;
userDAL: Pick<TUserDALFactory, "find" | "findOne" | "findById">;
projectEnvDAL: Pick<TProjectEnvDALFactory, "findOne">;
projectDAL: Pick<
TProjectDALFactory,
@@ -104,6 +107,7 @@ type TSecretApprovalRequestServiceFactoryDep = {
secretVersionV2BridgeDAL: Pick<TSecretVersionV2DALFactory, "insertMany" | "findLatestVersionMany">;
secretVersionTagV2BridgeDAL: Pick<TSecretVersionV2TagDALFactory, "insertMany">;
secretApprovalPolicyDAL: Pick<TSecretApprovalPolicyDALFactory, "findById">;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "getIntegrationDetailsByProject">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
};
@@ -132,7 +136,8 @@ export const secretApprovalRequestServiceFactory = ({
secretV2BridgeDAL,
secretVersionV2BridgeDAL,
secretVersionTagV2BridgeDAL,
licenseService
licenseService,
projectSlackConfigDAL
}: TSecretApprovalRequestServiceFactoryDep) => {
const requestCount = async ({ projectId, actor, actorId, actorOrgId, actorAuthMethod }: TApprovalRequestCountDTO) => {
if (actor === ActorType.SERVICE) throw new BadRequestError({ message: "Cannot use service token" });
@@ -1069,6 +1074,25 @@ export const secretApprovalRequestServiceFactory = ({
return { ...doc, commits: approvalCommits };
});
const env = await projectEnvDAL.findOne({ id: policy.envId });
const user = await userDAL.findById(secretApprovalRequest.committerUserId);
await triggerSlackNotification({
projectId,
projectDAL,
kmsService,
projectSlackConfigDAL,
notification: {
type: SlackTriggerFeature.SECRET_APPROVAL,
payload: {
userEmail: user.email as string,
environment: env.name,
secretPath,
projectId,
requestId: secretApprovalRequest.id
}
}
});
await sendApprovalEmailsFn({
projectDAL,
secretApprovalPolicyDAL,
@@ -1331,6 +1355,25 @@ export const secretApprovalRequestServiceFactory = ({
return { ...doc, commits: approvalCommits };
});
const user = await userDAL.findById(secretApprovalRequest.committerUserId);
const env = await projectEnvDAL.findOne({ id: policy.envId });
await triggerSlackNotification({
projectId,
projectDAL,
kmsService,
projectSlackConfigDAL,
notification: {
type: SlackTriggerFeature.SECRET_APPROVAL,
payload: {
userEmail: user.email as string,
environment: env.name,
secretPath,
projectId,
requestId: secretApprovalRequest.id
}
}
});
await sendApprovalEmailsFn({
projectDAL,
secretApprovalPolicyDAL,

View File

@@ -447,7 +447,9 @@ export const PROJECT_USERS = {
INVITE_MEMBER: {
projectId: "The ID of the project to invite the member to.",
emails: "A list of organization member emails to invite to the project.",
usernames: "A list of usernames to invite to the project."
usernames: "A list of usernames to invite to the project.",
roleSlugs:
"A list of role slugs to assign to the newly created project membership. If nothing is provided, it will default to the Member role."
},
REMOVE_MEMBER: {
projectId: "The ID of the project to remove the member from.",
@@ -1037,14 +1039,18 @@ export const CERTIFICATE_AUTHORITIES = {
maxPathLength:
"The maximum number of intermediate CAs that may follow this CA in the certificate / CA chain. A maxPathLength of -1 implies no path limit on the chain.",
keyAlgorithm:
"The type of public key algorithm and size, in bits, of the key pair for the CA; when you create an intermediate CA, you must use a key algorithm supported by the parent CA."
"The type of public key algorithm and size, in bits, of the key pair for the CA; when you create an intermediate CA, you must use a key algorithm supported by the parent CA.",
requireTemplateForIssuance:
"Whether or not certificates for this CA can only be issued through certificate templates."
},
GET: {
caId: "The ID of the CA to get"
},
UPDATE: {
caId: "The ID of the CA to update",
status: "The status of the CA to update to. This can be one of active or disabled"
status: "The status of the CA to update to. This can be one of active or disabled",
requireTemplateForIssuance:
"Whether or not certificates for this CA can only be issued through certificate templates."
},
DELETE: {
caId: "The ID of the CA to delete"

View File

@@ -146,7 +146,9 @@ const envSchema = z
PLAIN_API_KEY: zpStr(z.string().optional()),
PLAIN_WISH_LABEL_IDS: zpStr(z.string().optional()),
DISABLE_AUDIT_LOG_GENERATION: zodStrBool.default("false"),
SSL_CLIENT_CERTIFICATE_HEADER_KEY: zpStr(z.string().optional()).default("x-ssl-client-cert")
SSL_CLIENT_CERTIFICATE_HEADER_KEY: zpStr(z.string().optional()).default("x-ssl-client-cert"),
WORKFLOW_SLACK_CLIENT_ID: zpStr(z.string()).optional(),
WORKFLOW_SLACK_CLIENT_SECRET: zpStr(z.string()).optional()
})
.transform((data) => ({
...data,

View File

@@ -5,6 +5,7 @@ import nacl from "tweetnacl";
import tweetnacl from "tweetnacl-util";
import { TUserEncryptionKeys } from "@app/db/schemas";
import { UserEncryption } from "@app/services/user/user-types";
import { decryptSymmetric128BitHexKeyUTF8, encryptAsymmetric, encryptSymmetric } from "./encryption";
@@ -36,12 +37,16 @@ export const srpCheckClientProof = async (
// Ghost user related:
// This functionality is intended for ghost user logic. This happens on the frontend when a user is being created.
// We replicate the same functionality on the backend when creating a ghost user.
export const generateUserSrpKeys = async (email: string, password: string) => {
export const generateUserSrpKeys = async (
email: string,
password: string,
customKeys?: { publicKey: string; privateKey: string }
) => {
const pair = nacl.box.keyPair();
const secretKeyUint8Array = pair.secretKey;
const publicKeyUint8Array = pair.publicKey;
const privateKey = tweetnacl.encodeBase64(secretKeyUint8Array);
const publicKey = tweetnacl.encodeBase64(publicKeyUint8Array);
const privateKey = customKeys?.privateKey || tweetnacl.encodeBase64(secretKeyUint8Array);
const publicKey = customKeys?.publicKey || tweetnacl.encodeBase64(publicKeyUint8Array);
// eslint-disable-next-line
const client = new jsrp.client();
@@ -111,7 +116,7 @@ export const getUserPrivateKey = async (
| "encryptionVersion"
>
) => {
if (user.encryptionVersion === 1) {
if (user.encryptionVersion === UserEncryption.V1) {
return decryptSymmetric128BitHexKeyUTF8({
ciphertext: user.encryptedPrivateKey,
iv: user.iv,
@@ -119,7 +124,12 @@ export const getUserPrivateKey = async (
key: password.slice(0, 32).padStart(32 + (password.slice(0, 32).length - new Blob([password]).size), "0")
});
}
if (user.encryptionVersion === 2 && user.protectedKey && user.protectedKeyIV && user.protectedKeyTag) {
if (
user.encryptionVersion === UserEncryption.V2 &&
user.protectedKey &&
user.protectedKeyIV &&
user.protectedKeyTag
) {
const derivedKey = await argon2.hash(password, {
salt: Buffer.from(user.salt),
memoryCost: 65536,
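The UserEncryption enum referenced above is imported from user-types and, judging by the numeric literals it replaces, presumably maps V1 and V2 onto the previous encryptionVersion values. A rough sketch of that assumption, plus how the new optional customKeys argument would be used to re-wrap an existing key pair under a different password:
// assumed shape of the enum that replaces the bare 1 / 2 literals
enum UserEncryption {
  V1 = 1,
  V2 = 2
}
// with the new third argument, callers can keep an existing key pair and only
// change the password-derived wrapping (values here are placeholders):
// const encKeys = await generateUserSrpKeys(email, newPassword, {
//   publicKey: existingPublicKey,
//   privateKey: existingPrivateKey
// });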

View File

@@ -1,10 +1,16 @@
import fs from "fs/promises";
import path from "path";
export const isDisposableEmail = async (email: string) => {
const emailDomain = email.split("@")[1];
export const isDisposableEmail = async (emails: string | string[]) => {
const disposableEmails = await fs.readFile(path.join(__dirname, "disposable_emails.txt"), "utf8");
if (Array.isArray(emails)) {
return emails.some((email) => {
const emailDomain = email.split("@")[1];
return disposableEmails.split("\n").includes(emailDomain);
});
}
const emailDomain = emails.split("@")[1];
if (disposableEmails.split("\n").includes(emailDomain)) return true;
return false;
};
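A quick usage note on the widened signature: with an array the helper now answers "is at least one of these disposable?", while the single-string form behaves as before. A small sketch, assuming the helper is in scope; whether a given domain counts as disposable depends on the bundled disposable_emails.txt blocklist.
const checkInvitees = async () => {
  // single address: true only if its domain is on the blocklist
  const single = await isDisposableEmail("alice@example.com");
  // multiple addresses: true if any one of them uses a blocklisted domain
  const anyDisposable = await isDisposableEmail(["alice@example.com", "bob@mailinator.com"]);
  return { single, anyDisposable };
};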

View File

@@ -91,6 +91,8 @@ export type TQueueJobTypes = {
[QueueName.IntegrationSync]: {
name: QueueJobs.IntegrationSync;
payload: {
isManual?: boolean;
actorId?: string;
projectId: string;
environment: string;
secretPath: string;

View File

@@ -182,6 +182,9 @@ import { secretVersionV2BridgeDALFactory } from "@app/services/secret-v2-bridge/
import { secretVersionV2TagBridgeDALFactory } from "@app/services/secret-v2-bridge/secret-version-tag-dal";
import { serviceTokenDALFactory } from "@app/services/service-token/service-token-dal";
import { serviceTokenServiceFactory } from "@app/services/service-token/service-token-service";
import { projectSlackConfigDALFactory } from "@app/services/slack/project-slack-config-dal";
import { slackIntegrationDALFactory } from "@app/services/slack/slack-integration-dal";
import { slackServiceFactory } from "@app/services/slack/slack-service";
import { TSmtpService } from "@app/services/smtp/smtp-service";
import { superAdminDALFactory } from "@app/services/super-admin/super-admin-dal";
import { getServerCfg, superAdminServiceFactory } from "@app/services/super-admin/super-admin-service";
@@ -194,6 +197,8 @@ import { userAliasDALFactory } from "@app/services/user-alias/user-alias-dal";
import { userEngagementServiceFactory } from "@app/services/user-engagement/user-engagement-service";
import { webhookDALFactory } from "@app/services/webhook/webhook-dal";
import { webhookServiceFactory } from "@app/services/webhook/webhook-service";
import { workflowIntegrationDALFactory } from "@app/services/workflow-integration/workflow-integration-dal";
import { workflowIntegrationServiceFactory } from "@app/services/workflow-integration/workflow-integration-service";
import { injectAuditLogInfo } from "../plugins/audit-log";
import { injectIdentity } from "../plugins/auth/inject-identity";
@@ -322,6 +327,10 @@ export const registerRoutes = async (
const externalKmsDAL = externalKmsDALFactory(db);
const kmsRootConfigDAL = kmsRootConfigDALFactory(db);
const slackIntegrationDAL = slackIntegrationDALFactory(db);
const projectSlackConfigDAL = projectSlackConfigDALFactory(db);
const workflowIntegrationDAL = workflowIntegrationDALFactory(db);
const permissionService = permissionServiceFactory({
permissionDAL,
orgRoleDAL,
@@ -464,6 +473,8 @@ export const registerRoutes = async (
userAliasDAL,
orgMembershipDAL,
tokenService,
permissionService,
groupProjectDAL,
smtpService,
projectMembershipDAL
});
@@ -488,6 +499,7 @@ export const registerRoutes = async (
tokenService,
projectUserAdditionalPrivilegeDAL,
projectUserMembershipRoleDAL,
projectRoleDAL,
projectDAL,
projectMembershipDAL,
orgMembershipDAL,
@@ -520,8 +532,10 @@ export const registerRoutes = async (
serverCfgDAL: superAdminDAL,
orgService,
keyStore,
licenseService
licenseService,
kmsService
});
const orgAdminService = orgAdminServiceFactory({
projectDAL,
permissionService,
@@ -721,7 +735,9 @@ export const registerRoutes = async (
keyStore,
kmsService,
projectBotDAL,
certificateTemplateDAL
certificateTemplateDAL,
projectSlackConfigDAL,
slackIntegrationDAL
});
const projectEnvService = projectEnvServiceFactory({
@@ -795,6 +811,8 @@ export const registerRoutes = async (
projectEnvDAL,
webhookDAL,
orgDAL,
auditLogService,
userDAL,
projectMembershipDAL,
smtpService,
projectDAL,
@@ -872,7 +890,8 @@ export const registerRoutes = async (
smtpService,
projectEnvDAL,
userDAL,
licenseService
licenseService,
projectSlackConfigDAL
});
const secretService = secretServiceFactory({
@@ -922,7 +941,9 @@ export const registerRoutes = async (
projectEnvDAL,
userDAL,
smtpService,
accessApprovalPolicyApproverDAL
accessApprovalPolicyApproverDAL,
projectSlackConfigDAL,
kmsService
});
const secretReplicationService = secretReplicationServiceFactory({
@@ -1150,6 +1171,18 @@ export const registerRoutes = async (
userDAL
});
const slackService = slackServiceFactory({
permissionService,
kmsService,
slackIntegrationDAL,
workflowIntegrationDAL
});
const workflowIntegrationService = workflowIntegrationServiceFactory({
permissionService,
workflowIntegrationDAL
});
await superAdminService.initServerCfg();
//
// setup the communication with license key server
@@ -1231,7 +1264,9 @@ export const registerRoutes = async (
secretSharing: secretSharingService,
userEngagement: userEngagementService,
externalKms: externalKmsService,
orgAdmin: orgAdminService
orgAdmin: orgAdminService,
slack: slackService,
workflowIntegration: workflowIntegrationService
});
const cronJobs: CronJob[] = [];

View File

@@ -21,7 +21,12 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
schema: {
response: {
200: z.object({
config: SuperAdminSchema.omit({ createdAt: true, updatedAt: true }).extend({
config: SuperAdminSchema.omit({
createdAt: true,
updatedAt: true,
encryptedSlackClientId: true,
encryptedSlackClientSecret: true
}).extend({
isMigrationModeOn: z.boolean(),
defaultAuthOrgSlug: z.string().nullable(),
isSecretScanningDisabled: z.boolean()
@@ -62,7 +67,9 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
.optional()
.refine((methods) => !methods || methods.length > 0, {
message: "At least one login method should be enabled."
})
}),
slackClientId: z.string().optional(),
slackClientSecret: z.string().optional()
}),
response: {
200: z.object({
@@ -123,6 +130,32 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
}
});
server.route({
method: "GET",
url: "/integrations/slack/config",
config: {
rateLimit: readLimit
},
schema: {
response: {
200: z.object({
clientId: z.string(),
clientSecret: z.string()
})
}
},
onRequest: (req, res, done) => {
verifyAuth([AuthMode.JWT])(req, res, () => {
verifySuperAdmin(req, res, done);
});
},
handler: async () => {
const adminSlackConfig = await server.services.superAdmin.getAdminSlackConfig();
return adminSlackConfig;
}
});
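A hedged sketch of how an instance admin could use this from a client. The /api/v1/admin prefix and the PATCH /config update route are assumptions (only the body additions of the update route are visible in this diff), and the credential values are placeholders.
const configureSlackApp = async (token: string) => {
  const adminBase = "http://localhost:8080/api/v1/admin"; // assumed mount point
  const headers = { Authorization: `Bearer ${token}`, "Content-Type": "application/json" };
  // store the Slack app credentials on the server config via the update route above
  await fetch(`${adminBase}/config`, {
    method: "PATCH",
    headers,
    body: JSON.stringify({
      slackClientId: "placeholder-client-id",
      slackClientSecret: "placeholder-client-secret"
    })
  });
  // read them back through the new GET endpoint
  const res = await fetch(`${adminBase}/integrations/slack/config`, { headers });
  const { clientId, clientSecret } = (await res.json()) as { clientId: string; clientSecret: string };
  return { clientId, clientSecret };
};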
server.route({
method: "DELETE",
url: "/user-management/users/:userId",

View File

@@ -1,7 +1,7 @@
import ms from "ms";
import { z } from "zod";
import { CertificateAuthoritiesSchema } from "@app/db/schemas";
import { CertificateAuthoritiesSchema, CertificateTemplatesSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { CERTIFICATE_AUTHORITIES } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
@@ -42,7 +42,11 @@ export const registerCaRouter = async (server: FastifyZodProvider) => {
keyAlgorithm: z
.nativeEnum(CertKeyAlgorithm)
.default(CertKeyAlgorithm.RSA_2048)
.describe(CERTIFICATE_AUTHORITIES.CREATE.keyAlgorithm)
.describe(CERTIFICATE_AUTHORITIES.CREATE.keyAlgorithm),
requireTemplateForIssuance: z
.boolean()
.default(false)
.describe(CERTIFICATE_AUTHORITIES.CREATE.requireTemplateForIssuance)
})
.refine(
(data) => {
@@ -148,7 +152,11 @@ export const registerCaRouter = async (server: FastifyZodProvider) => {
caId: z.string().trim().describe(CERTIFICATE_AUTHORITIES.UPDATE.caId)
}),
body: z.object({
status: z.enum([CaStatus.ACTIVE, CaStatus.DISABLED]).optional().describe(CERTIFICATE_AUTHORITIES.UPDATE.status)
status: z.enum([CaStatus.ACTIVE, CaStatus.DISABLED]).optional().describe(CERTIFICATE_AUTHORITIES.UPDATE.status),
requireTemplateForIssuance: z
.boolean()
.optional()
.describe(CERTIFICATE_AUTHORITIES.UPDATE.requireTemplateForIssuance)
}),
response: {
200: z.object({
@@ -700,6 +708,51 @@ export const registerCaRouter = async (server: FastifyZodProvider) => {
}
});
server.route({
method: "GET",
url: "/:caId/certificate-templates",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
schema: {
description: "Get list of certificate templates for the CA",
params: z.object({
caId: z.string().trim().describe(CERTIFICATE_AUTHORITIES.SIGN_CERT.caId)
}),
response: {
200: z.object({
certificateTemplates: CertificateTemplatesSchema.array()
})
}
},
handler: async (req) => {
const { certificateTemplates, ca } = await server.services.certificateAuthority.getCaCertificateTemplates({
caId: req.params.caId,
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: ca.projectId,
event: {
type: EventType.GET_CA_CERTIFICATE_TEMPLATES,
metadata: {
caId: ca.id,
dn: ca.dn
}
}
});
return {
certificateTemplates
};
}
});
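A minimal client-side sketch of the new listing route; caRouterBase stands in for wherever this router is mounted, which is not visible in this diff. Per the service changes further down, a CA with requireTemplateForIssuance enabled will refuse to issue or sign certificates unless one of these templates is supplied.
const listCaCertificateTemplates = async (caRouterBase: string, caId: string, token: string) => {
  const res = await fetch(`${caRouterBase}/${caId}/certificate-templates`, {
    headers: { Authorization: `Bearer ${token}` }
  });
  const { certificateTemplates } = await res.json();
  return certificateTemplates;
};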
server.route({
method: "GET",
url: "/:caId/crls",

View File

@@ -29,11 +29,13 @@ import { registerSecretFolderRouter } from "./secret-folder-router";
import { registerSecretImportRouter } from "./secret-import-router";
import { registerSecretSharingRouter } from "./secret-sharing-router";
import { registerSecretTagRouter } from "./secret-tag-router";
import { registerSlackRouter } from "./slack-router";
import { registerSsoRouter } from "./sso-router";
import { registerUserActionRouter } from "./user-action-router";
import { registerUserEngagementRouter } from "./user-engagement-router";
import { registerUserRouter } from "./user-router";
import { registerWebhookRouter } from "./webhook-router";
import { registerWorkflowIntegrationRouter } from "./workflow-integration-router";
export const registerV1Routes = async (server: FastifyZodProvider) => {
await server.register(registerSsoRouter, { prefix: "/sso" });
@@ -61,6 +63,14 @@ export const registerV1Routes = async (server: FastifyZodProvider) => {
await server.register(registerSecretImportRouter, { prefix: "/secret-imports" });
await server.register(registerSecretFolderRouter, { prefix: "/folders" });
await server.register(
async (workflowIntegrationRouter) => {
await workflowIntegrationRouter.register(registerWorkflowIntegrationRouter);
await workflowIntegrationRouter.register(registerSlackRouter, { prefix: "/slack" });
},
{ prefix: "/workflow-integrations" }
);
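For orientation, the nesting above gives the Slack routes a shared workflow-integrations prefix; the /api/v1 prefix is an assumption, since the mount point of registerV1Routes is not shown here.
// GET    /api/v1/workflow-integrations                      -> registerWorkflowIntegrationRouter (list all)
// GET    /api/v1/workflow-integrations/slack/install        -> registerSlackRouter
// GET    /api/v1/workflow-integrations/slack/reinstall
// GET    /api/v1/workflow-integrations/slack/oauth_redirect
// GET    /api/v1/workflow-integrations/slack/:id/channels
// GET | PATCH | DELETE /api/v1/workflow-integrations/slack/:id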
await server.register(
async (projectRouter) => {
await projectRouter.register(registerProjectRouter);

View File

@@ -4,7 +4,7 @@ import { IntegrationsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { INTEGRATION } from "@app/lib/api-docs";
import { removeTrailingSlash, shake } from "@app/lib/fn";
import { writeLimit } from "@app/server/config/rateLimiter";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { getTelemetryDistinctId } from "@app/server/lib/telemetry";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
@@ -154,6 +154,48 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
}
});
server.route({
method: "GET",
url: "/:integrationId",
config: {
rateLimit: readLimit
},
schema: {
description: "Get an integration by integration id",
security: [
{
bearerAuth: []
}
],
params: z.object({
integrationId: z.string().trim().describe(INTEGRATION.UPDATE.integrationId)
}),
response: {
200: z.object({
integration: IntegrationsSchema.extend({
environment: z.object({
slug: z.string().trim(),
name: z.string().trim(),
id: z.string().trim()
})
})
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const integration = await server.services.integration.getIntegration({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.integrationId
});
return { integration };
}
});
server.route({
method: "DELETE",
url: "/:integrationId",

View File

@@ -18,9 +18,14 @@ export const registerInviteOrgRouter = async (server: FastifyZodProvider) => {
body: z.object({
inviteeEmails: z.array(z.string().trim().email()),
organizationId: z.string().trim(),
projectIds: z.array(z.string().trim()).optional(),
projectRoleSlug: z.nativeEnum(ProjectMembershipRole).optional(),
organizationRoleSlug: z.nativeEnum(OrgMembershipRole)
projects: z
.object({
id: z.string(),
projectRoleSlug: z.string().array().default([ProjectMembershipRole.Member])
})
.array()
.optional(),
organizationRoleSlug: z.string().default(OrgMembershipRole.Member)
}),
response: {
200: z.object({
@@ -40,12 +45,12 @@ export const registerInviteOrgRouter = async (server: FastifyZodProvider) => {
handler: async (req) => {
if (req.auth.actor !== ActorType.USER) return;
const completeInviteLinks = await server.services.org.inviteUserToOrganization({
const { signupTokens: completeInviteLinks } = await server.services.org.inviteUserToOrganization({
orgId: req.body.organizationId,
userId: req.permission.id,
actor: req.permission.type,
actorId: req.permission.id,
inviteeEmails: req.body.inviteeEmails,
projectIds: req.body.projectIds,
projectRoleSlug: req.body.projectRoleSlug,
projects: req.body.projects,
organizationRoleSlug: req.body.organizationRoleSlug,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
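The body now carries a projects array instead of the old projectIds/projectRoleSlug pair, so each project can be assigned its own set of role slugs. A sketch of a body matching the schema above; the IDs are placeholders, and the "member" slugs assume the usual values behind OrgMembershipRole.Member and ProjectMembershipRole.Member.
const inviteBody = {
  inviteeEmails: ["new.user@example.com"],
  organizationId: "org-id-placeholder",
  organizationRoleSlug: "member",
  projects: [
    {
      id: "project-id-placeholder",
      projectRoleSlug: ["member"] // defaults to [ProjectMembershipRole.Member] when omitted
    }
  ]
};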

View File

@@ -1,6 +1,7 @@
import { z } from "zod";
import {
AuditLogsSchema,
GroupsSchema,
IncidentContactsSchema,
OrganizationsSchema,
@@ -8,10 +9,12 @@ import {
OrgRolesSchema,
UsersSchema
} from "@app/db/schemas";
import { ORGANIZATIONS } from "@app/lib/api-docs";
import { EventType, UserAgentType } from "@app/ee/services/audit-log/audit-log-types";
import { AUDIT_LOGS, ORGANIZATIONS } from "@app/lib/api-docs";
import { getLastMidnightDateISO } from "@app/lib/fn";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { ActorType, AuthMode } from "@app/services/auth/auth-type";
export const registerOrgRouter = async (server: FastifyZodProvider) => {
server.route({
@@ -62,6 +65,101 @@ export const registerOrgRouter = async (server: FastifyZodProvider) => {
}
});
server.route({
method: "GET",
url: "/audit-logs",
config: {
rateLimit: readLimit
},
schema: {
description: "Get all audit logs for an organization",
querystring: z.object({
projectId: z.string().optional(),
actorType: z.nativeEnum(ActorType).optional(),
// eventType is comma-separated when multiple values are passed, so transform it into an array
eventType: z
.string()
.optional()
.transform((val) => (val ? val.split(",") : undefined)),
userAgentType: z.nativeEnum(UserAgentType).optional().describe(AUDIT_LOGS.EXPORT.userAgentType),
eventMetadata: z
.string()
.optional()
.transform((val) => {
if (!val) {
return undefined;
}
const pairs = val.split(",");
return pairs.reduce(
(acc, pair) => {
const [key, value] = pair.split("=");
if (key && value) {
acc[key] = value;
}
return acc;
},
{} as Record<string, string>
);
}),
startDate: z.string().datetime().optional().describe(AUDIT_LOGS.EXPORT.startDate),
endDate: z.string().datetime().optional().describe(AUDIT_LOGS.EXPORT.endDate),
offset: z.coerce.number().default(0).describe(AUDIT_LOGS.EXPORT.offset),
limit: z.coerce.number().default(20).describe(AUDIT_LOGS.EXPORT.limit),
actor: z.string().optional().describe(AUDIT_LOGS.EXPORT.actor)
}),
response: {
200: z.object({
auditLogs: AuditLogsSchema.omit({
eventMetadata: true,
eventType: true,
actor: true,
actorMetadata: true
})
.merge(
z.object({
project: z.object({
name: z.string(),
slug: z.string()
}),
event: z.object({
type: z.string(),
metadata: z.any()
}),
actor: z.object({
type: z.string(),
metadata: z.any()
})
})
)
.array()
})
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const auditLogs = await server.services.auditLog.listAuditLogs({
filter: {
...req.query,
endDate: req.query.endDate,
projectId: req.query.projectId,
startDate: req.query.startDate || getLastMidnightDateISO(),
auditLogActorId: req.query.actor,
actorType: req.query.actorType,
eventType: req.query.eventType as EventType[] | undefined
},
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type
});
return { auditLogs };
}
});
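Both eventType and eventMetadata arrive as plain strings and are unpacked by the transforms above: a comma-separated list, and comma-separated key=value pairs respectively. A hedged sketch of building that query from a client, with an assumed /api/v1/organization mount point and illustrative event names and metadata keys:
const fetchOrgAuditLogs = async (token: string) => {
  const params = new URLSearchParams({
    eventType: ["create-secret", "delete-secret"].join(","), // illustrative event names
    eventMetadata: Object.entries({ environment: "dev", secretPath: "/" })
      .map(([key, value]) => `${key}=${value}`)
      .join(","),
    limit: "50"
  });
  const res = await fetch(`/api/v1/organization/audit-logs?${params.toString()}`, {
    headers: { Authorization: `Bearer ${token}` }
  });
  const { auditLogs } = await res.json();
  return auditLogs;
};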
server.route({
method: "GET",
url: "/:organizationId/users",

View File

@@ -4,14 +4,17 @@ import {
IntegrationsSchema,
ProjectMembershipsSchema,
ProjectRolesSchema,
ProjectSlackConfigsSchema,
UserEncryptionKeysSchema,
UsersSchema
} from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { PROJECTS } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { ProjectFilterType } from "@app/services/project/project-types";
import { validateSlackChannelsField } from "@app/services/slack/slack-auth-validators";
import { integrationAuthPubSchema, SanitizedProjectSchema } from "../sanitizedSchemas";
import { sanitizedServiceTokenSchema } from "../v2/service-token-router";
@@ -542,4 +545,111 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
return { serviceTokenData };
}
});
server.route({
method: "GET",
url: "/:workspaceId/slack-config",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim()
}),
response: {
200: ProjectSlackConfigsSchema.pick({
id: true,
slackIntegrationId: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackConfig = await server.services.project.getProjectSlackConfig({
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
projectId: req.params.workspaceId
});
if (slackConfig) {
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.GET_PROJECT_SLACK_CONFIG,
metadata: {
id: slackConfig.id
}
}
});
}
return slackConfig;
}
});
server.route({
method: "PUT",
url: "/:workspaceId/slack-config",
config: {
rateLimit: writeLimit
},
schema: {
params: z.object({
workspaceId: z.string().trim()
}),
body: z.object({
slackIntegrationId: z.string(),
isAccessRequestNotificationEnabled: z.boolean(),
accessRequestChannels: validateSlackChannelsField,
isSecretRequestNotificationEnabled: z.boolean(),
secretRequestChannels: validateSlackChannelsField
}),
response: {
200: ProjectSlackConfigsSchema.pick({
id: true,
slackIntegrationId: true,
isAccessRequestNotificationEnabled: true,
accessRequestChannels: true,
isSecretRequestNotificationEnabled: true,
secretRequestChannels: true
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackConfig = await server.services.project.updateProjectSlackConfig({
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
projectId: req.params.workspaceId,
...req.body
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: req.params.workspaceId,
event: {
type: EventType.UPDATE_PROJECT_SLACK_CONFIG,
metadata: {
id: slackConfig.id,
slackIntegrationId: slackConfig.slackIntegrationId,
isAccessRequestNotificationEnabled: slackConfig.isAccessRequestNotificationEnabled,
accessRequestChannels: slackConfig.accessRequestChannels,
isSecretRequestNotificationEnabled: slackConfig.isSecretRequestNotificationEnabled,
secretRequestChannels: slackConfig.secretRequestChannels
}
}
});
return slackConfig;
}
});
};
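A hedged sketch of configuring a project's Slack notifications through the PUT route above. The workspace and integration IDs are placeholders, the base URL is an assumption, and the channel values are whatever format validateSlackChannelsField accepts.
const updateProjectSlackConfig = async (baseUrl: string, workspaceId: string, token: string) => {
  const res = await fetch(`${baseUrl}/${workspaceId}/slack-config`, {
    method: "PUT",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      slackIntegrationId: "slack-integration-id-placeholder",
      isAccessRequestNotificationEnabled: true,
      accessRequestChannels: "access-requests", // format validated by validateSlackChannelsField
      isSecretRequestNotificationEnabled: true,
      secretRequestChannels: "secret-approvals"
    })
  });
  return res.json();
};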

View File

@@ -0,0 +1,355 @@
import slugify from "@sindresorhus/slugify";
import { z } from "zod";
import { SlackIntegrationsSchema, WorkflowIntegrationsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { getConfig } from "@app/lib/config/env";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
const sanitizedSlackIntegrationSchema = WorkflowIntegrationsSchema.pick({
id: true,
description: true,
slug: true,
integration: true
}).merge(
SlackIntegrationsSchema.pick({
teamName: true
})
);
export const registerSlackRouter = async (server: FastifyZodProvider) => {
const appCfg = getConfig();
server.route({
method: "GET",
url: "/install",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
querystring: z.object({
slug: z
.string()
.trim()
.refine((v) => slugify(v) === v, {
message: "Slug must be a valid slug"
}),
description: z.string().optional()
}),
response: {
200: z.string()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const url = await server.services.slack.getInstallUrl({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
...req.query
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.ATTEMPT_CREATE_SLACK_INTEGRATION,
metadata: {
slug: req.query.slug,
description: req.query.description
}
}
});
return url;
}
});
server.route({
method: "GET",
url: "/reinstall",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
querystring: z.object({
id: z.string()
}),
response: {
200: z.string()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const url = await server.services.slack.getReinstallUrl({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.query.id
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: req.permission.orgId,
event: {
type: EventType.ATTEMPT_REINSTALL_SLACK_INTEGRATION,
metadata: {
id: req.query.id
}
}
});
return url;
}
});
server.route({
method: "GET",
url: "/",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
response: {
200: sanitizedSlackIntegrationSchema.array()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackIntegrations = await server.services.slack.getSlackIntegrationsByOrg({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
return slackIntegrations;
}
});
server.route({
method: "DELETE",
url: "/:id",
config: {
rateLimit: writeLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
response: {
200: sanitizedSlackIntegrationSchema
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const deletedSlackIntegration = await server.services.slack.deleteSlackIntegration({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: deletedSlackIntegration.orgId,
event: {
type: EventType.DELETE_SLACK_INTEGRATION,
metadata: {
id: deletedSlackIntegration.id
}
}
});
return deletedSlackIntegration;
}
});
server.route({
method: "GET",
url: "/:id",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
response: {
200: sanitizedSlackIntegrationSchema
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackIntegration = await server.services.slack.getSlackIntegrationById({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: slackIntegration.orgId,
event: {
type: EventType.GET_SLACK_INTEGRATION,
metadata: {
id: slackIntegration.id
}
}
});
return slackIntegration;
}
});
server.route({
method: "GET",
url: "/:id/channels",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
response: {
200: z
.object({
name: z.string(),
id: z.string()
})
.array()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackChannels = await server.services.slack.getSlackIntegrationChannels({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id
});
return slackChannels;
}
});
server.route({
method: "PATCH",
url: "/:id",
config: {
rateLimit: writeLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
params: z.object({
id: z.string()
}),
body: z.object({
slug: z
.string()
.trim()
.refine((v) => slugify(v) === v, {
message: "Slug must be a valid slug"
})
.optional(),
description: z.string().optional()
}),
response: {
200: sanitizedSlackIntegrationSchema
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const slackIntegration = await server.services.slack.updateSlackIntegration({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.id,
...req.body
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: slackIntegration.orgId,
event: {
type: EventType.UPDATE_SLACK_INTEGRATION,
metadata: {
id: slackIntegration.id,
slug: slackIntegration.slug,
description: slackIntegration.description as string
}
}
});
return slackIntegration;
}
});
server.route({
method: "GET",
url: "/oauth_redirect",
config: {
rateLimit: readLimit
},
handler: async (req, res) => {
const installer = await server.services.slack.getSlackInstaller();
return installer.handleCallback(req.raw, res.raw, {
failureAsync: async () => {
return res.redirect(appCfg.SITE_URL as string);
},
successAsync: async (installation) => {
const metadata = JSON.parse(installation.metadata || "") as {
orgId: string;
};
return res.redirect(`${appCfg.SITE_URL}/org/${metadata.orgId}/settings?selectedTab=workflow-integrations`);
}
});
}
});
};
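The install flow above is a two-step OAuth handshake: the client asks GET /install for an authorization URL (audited as ATTEMPT_CREATE_SLACK_INTEGRATION), sends the browser there, and Slack eventually calls back into GET /oauth_redirect, which hands control to the installer and redirects back to the org's workflow-integrations settings tab. A hedged sketch of the browser-side half, with the base path assumed from the v1 index router shown earlier:
const startSlackInstall = async (token: string, slug: string) => {
  const query = new URLSearchParams({ slug, description: "notifications for approvals" });
  const res = await fetch(`/api/v1/workflow-integrations/slack/install?${query.toString()}`, {
    headers: { Authorization: `Bearer ${token}` }
  });
  // the route's response schema is a plain string containing the authorization URL
  const authorizeUrl: string = await res.json();
  // hand off to Slack; the oauth_redirect route above completes the installation
  window.location.assign(authorizeUrl);
};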

View File

@@ -134,4 +134,39 @@ export const registerUserRouter = async (server: FastifyZodProvider) => {
);
}
});
server.route({
method: "GET",
url: "/me/:username/groups",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
username: z.string().trim()
}),
response: {
200: z
.object({
id: z.string(),
name: z.string(),
slug: z.string(),
orgId: z.string()
})
.array()
}
},
onRequest: verifyAuth([AuthMode.JWT]),
handler: async (req) => {
const groupMemberships = await server.services.user.listUserGroups({
username: req.params.username,
actorOrgId: req.permission.orgId,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actor: req.permission.type
});
return groupMemberships;
}
});
};

View File

@@ -0,0 +1,42 @@
import { WorkflowIntegrationsSchema } from "@app/db/schemas";
import { readLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
const sanitizedWorkflowIntegrationSchema = WorkflowIntegrationsSchema.pick({
id: true,
description: true,
slug: true,
integration: true
});
export const registerWorkflowIntegrationRouter = async (server: FastifyZodProvider) => {
server.route({
method: "GET",
url: "/",
config: {
rateLimit: readLimit
},
schema: {
security: [
{
bearerAuth: []
}
],
response: {
200: sanitizedWorkflowIntegrationSchema.array()
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const workflowIntegrations = await server.services.workflowIntegration.getIntegrationsByOrg({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId
});
return workflowIntegrations;
}
});
};

View File

@@ -1,6 +1,6 @@
import { z } from "zod";
import { ProjectMembershipsSchema } from "@app/db/schemas";
import { OrgMembershipRole, ProjectMembershipRole, ProjectMembershipsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { PROJECT_USERS } from "@app/lib/api-docs";
import { writeLimit } from "@app/server/config/rateLimiter";
@@ -26,7 +26,8 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
}),
body: z.object({
emails: z.string().email().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.emails),
usernames: z.string().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.usernames)
usernames: z.string().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.usernames),
roleSlugs: z.string().array().optional().describe(PROJECT_USERS.INVITE_MEMBER.roleSlugs)
}),
response: {
200: z.object({
@@ -36,14 +37,21 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const memberships = await server.services.projectMembership.addUsersToProjectNonE2EE({
projectId: req.params.projectId,
const usernamesAndEmails = [...req.body.emails, ...req.body.usernames];
const { projectMemberships: memberships } = await server.services.org.inviteUserToOrganization({
actorAuthMethod: req.permission.authMethod,
actorId: req.permission.id,
actorOrgId: req.permission.orgId,
actor: req.permission.type,
emails: req.body.emails,
usernames: req.body.usernames
inviteeEmails: usernamesAndEmails,
orgId: req.permission.orgId,
organizationRoleSlug: OrgMembershipRole.NoAccess,
projects: [
{
id: req.params.projectId,
projectRoleSlug: [ProjectMembershipRole.Member]
}
]
});
await server.services.auditLog.createAuditLog({

View File

@@ -1,15 +1,15 @@
import bcrypt from "bcrypt";
import jwt from "jsonwebtoken";
import { OrgMembershipStatus, TableName } from "@app/db/schemas";
import { OrgMembershipStatus, SecretKeyEncoding, TableName } from "@app/db/schemas";
import { convertPendingGroupAdditionsToGroupMemberships } from "@app/ee/services/group/group-fns";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { isAuthMethodSaml } from "@app/ee/services/permission/permission-fns";
import { getConfig } from "@app/lib/config/env";
import { infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { getUserPrivateKey } from "@app/lib/crypto/srp";
import { BadRequestError, UnauthorizedError } from "@app/lib/errors";
import { infisicalSymmetricDecrypt, infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { generateUserSrpKeys, getUserPrivateKey } from "@app/lib/crypto/srp";
import { BadRequestError } from "@app/lib/errors";
import { isDisposableEmail } from "@app/lib/validator";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
@@ -17,14 +17,14 @@ import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal
import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
import { TAuthTokenServiceFactory } from "../auth-token/auth-token-service";
import { TokenMetadataType, TokenType, TTokenMetadata } from "../auth-token/auth-token-types";
import { TokenType } from "../auth-token/auth-token-types";
import { TOrgDALFactory } from "../org/org-dal";
import { TOrgServiceFactory } from "../org/org-service";
import { TProjectMembershipDALFactory } from "../project-membership/project-membership-dal";
import { addMembersToProject } from "../project-membership/project-membership-fns";
import { TProjectUserMembershipRoleDALFactory } from "../project-membership/project-user-membership-role-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { TUserDALFactory } from "../user/user-dal";
import { UserEncryption } from "../user/user-types";
import { TAuthDALFactory } from "./auth-dal";
import { validateProviderAuthToken, validateSignUpAuthorization } from "./auth-fns";
import { TCompleteAccountInviteDTO, TCompleteAccountSignupDTO } from "./auth-signup-type";
@@ -67,8 +67,6 @@ export const authSignupServiceFactory = ({
smtpService,
orgService,
orgDAL,
projectMembershipDAL,
projectUserMembershipRoleDAL,
licenseService
}: TAuthSignupDep) => {
// first step of signup. create user and send email
@@ -177,32 +175,88 @@ export const authSignupServiceFactory = ({
encryptedPrivateKey,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
encryptionVersion: 2
encryptionVersion: UserEncryption.V2
});
const { tag, encoding, ciphertext, iv } = infisicalSymmetricEncypt(privateKey);
const updateduser = await authDAL.transaction(async (tx) => {
const us = await userDAL.updateById(user.id, { firstName, lastName, isAccepted: true }, tx);
if (!us) throw new Error("User not found");
const userEncKey = await userDAL.upsertUserEncryptionKey(
us.id,
{
salt,
verifier,
publicKey,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
hashedPassword,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
const systemGeneratedUserEncryptionKey = await userDAL.findUserEncKeyByUserId(us.id, tx);
let userEncKey;
// if the below condition is true, these are system generated credentials:
// the "private key" password stored server-side was generated by the system,
// so we re-encrypt that system generated private key under the password the user just chose.
// akhilmhdh: you may wonder why. The reason is simple: we are moving away from e2ee and these are remaining pieces of it;
// without a dummy key in place some things (and backward compatibility) would break. In 2025 we will be removing all of this.
if (
systemGeneratedUserEncryptionKey &&
!systemGeneratedUserEncryptionKey.hashedPassword &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKey &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyTag &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyIV &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyEncoding
) {
// get server generated password
const serverGeneratedPassword = infisicalSymmetricDecrypt({
iv: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyIV,
tag: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyTag,
ciphertext: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKey,
keyEncoding: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyEncoding as SecretKeyEncoding
});
const serverGeneratedPrivateKey = await getUserPrivateKey(serverGeneratedPassword, {
...systemGeneratedUserEncryptionKey
});
const encKeys = await generateUserSrpKeys(email, password, {
publicKey: systemGeneratedUserEncryptionKey.publicKey,
privateKey: serverGeneratedPrivateKey
});
// now reencrypt server generated key with user provided password
userEncKey = await userDAL.upsertUserEncryptionKey(
us.id,
{
encryptionVersion: UserEncryption.V2,
protectedKey: encKeys.protectedKey,
protectedKeyIV: encKeys.protectedKeyIV,
protectedKeyTag: encKeys.protectedKeyTag,
publicKey: encKeys.publicKey,
encryptedPrivateKey: encKeys.encryptedPrivateKey,
iv: encKeys.encryptedPrivateKeyIV,
tag: encKeys.encryptedPrivateKeyTag,
salt: encKeys.salt,
verifier: encKeys.verifier,
hashedPassword,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
} else {
userEncKey = await userDAL.upsertUserEncryptionKey(
us.id,
{
encryptionVersion: UserEncryption.V2,
salt,
verifier,
publicKey,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
hashedPassword,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
}
// If it's SAML Auth and the organization ID is present, we should check if the user has a pending invite for this org, and accept it
if (
(isAuthMethodSaml(authMethod) || [AuthMethod.LDAP, AuthMethod.OIDC].includes(authMethod as AuthMethod)) &&
@@ -312,8 +366,7 @@ export const authSignupServiceFactory = ({
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
authorization,
tokenMetadata
authorization
}: TCompleteAccountInviteDTO) => {
const user = await userDAL.findUserByUsername(email);
if (!user || (user && user.isAccepted)) {
@@ -348,65 +401,76 @@ export const authSignupServiceFactory = ({
const updateduser = await authDAL.transaction(async (tx) => {
const us = await userDAL.updateById(user.id, { firstName, lastName, isAccepted: true }, tx);
if (!us) throw new Error("User not found");
const userEncKey = await userDAL.upsertUserEncryptionKey(
us.id,
{
salt,
encryptionVersion: 2,
verifier,
publicKey,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
hashedPassword,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
if (tokenMetadata) {
const metadataObj = jwt.verify(tokenMetadata, appCfg.AUTH_SECRET) as TTokenMetadata;
if (
metadataObj?.payload?.userId !== user.id ||
metadataObj?.payload?.orgId !== orgMembership.orgId ||
metadataObj?.type !== TokenMetadataType.InviteToProjects
) {
throw new UnauthorizedError({
message: "Malformed or invalid metadata token"
});
}
for await (const projectId of metadataObj.payload.projectIds) {
await addMembersToProject({
orgDAL,
projectDAL,
projectMembershipDAL,
projectKeyDAL,
userGroupMembershipDAL,
projectBotDAL,
projectUserMembershipRoleDAL,
smtpService
}).addMembersToNonE2EEProject(
{
emails: [user.email!],
usernames: [],
projectId,
projectMembershipRole: metadataObj.payload.projectRoleSlug,
sendEmails: false
},
{
tx,
throwOnProjectNotFound: false
}
);
}
const systemGeneratedUserEncryptionKey = await userDAL.findUserEncKeyByUserId(us.id, tx);
let userEncKey;
// these are system generated credentials, so replace the system generated private key
// with one re-encrypted under the password the user just chose
if (
systemGeneratedUserEncryptionKey &&
!systemGeneratedUserEncryptionKey.hashedPassword &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKey &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyTag &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyIV &&
systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyEncoding
) {
// get server generated password
const serverGeneratedPassword = infisicalSymmetricDecrypt({
iv: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyIV,
tag: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyTag,
ciphertext: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKey,
keyEncoding: systemGeneratedUserEncryptionKey.serverEncryptedPrivateKeyEncoding as SecretKeyEncoding
});
const serverGeneratedPrivateKey = await getUserPrivateKey(serverGeneratedPassword, {
...systemGeneratedUserEncryptionKey
});
const encKeys = await generateUserSrpKeys(email, password, {
publicKey: systemGeneratedUserEncryptionKey.publicKey,
privateKey: serverGeneratedPrivateKey
});
// now reencrypt server generated key with user provided password
userEncKey = await userDAL.upsertUserEncryptionKey(
us.id,
{
encryptionVersion: 2,
protectedKey: encKeys.protectedKey,
protectedKeyIV: encKeys.protectedKeyIV,
protectedKeyTag: encKeys.protectedKeyTag,
publicKey: encKeys.publicKey,
encryptedPrivateKey: encKeys.encryptedPrivateKey,
iv: encKeys.encryptedPrivateKeyIV,
tag: encKeys.encryptedPrivateKeyTag,
salt: encKeys.salt,
verifier: encKeys.verifier,
hashedPassword,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
} else {
userEncKey = await userDAL.upsertUserEncryptionKey(
us.id,
{
encryptionVersion: UserEncryption.V2,
salt,
verifier,
publicKey,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
hashedPassword,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
}
const updatedMembersips = await orgDAL.updateMembership(

View File

@@ -34,6 +34,7 @@ export enum AuthMode {
}
export enum ActorType { // would extend to AWS, Azure, ...
PLATFORM = "platform", // Useful for when we want to perform logging on automated actions such as integration syncs.
USER = "user", // userIdentity
SERVICE = "service",
IDENTITY = "identity",

View File

@@ -41,6 +41,7 @@ import {
TCreateCaDTO,
TDeleteCaDTO,
TGetCaCertDTO,
TGetCaCertificateTemplatesDTO,
TGetCaCertsDTO,
TGetCaCsrDTO,
TGetCaDTO,
@@ -64,7 +65,7 @@ type TCertificateAuthorityServiceFactoryDep = {
>;
certificateAuthoritySecretDAL: Pick<TCertificateAuthoritySecretDALFactory, "create" | "findOne">;
certificateAuthorityCrlDAL: Pick<TCertificateAuthorityCrlDALFactory, "create" | "findOne" | "update">;
certificateTemplateDAL: Pick<TCertificateTemplateDALFactory, "getById">;
certificateTemplateDAL: Pick<TCertificateTemplateDALFactory, "getById" | "find">;
certificateAuthorityQueue: TCertificateAuthorityQueueFactory; // TODO: Pick
certificateDAL: Pick<TCertificateDALFactory, "transaction" | "create" | "find">;
certificateBodyDAL: Pick<TCertificateBodyDALFactory, "create">;
@@ -108,6 +109,7 @@ export const certificateAuthorityServiceFactory = ({
notAfter,
maxPathLength,
keyAlgorithm,
requireTemplateForIssuance,
actorId,
actorAuthMethod,
actor,
@@ -170,7 +172,8 @@ export const certificateAuthorityServiceFactory = ({
notBefore: notBeforeDate,
notAfter: notAfterDate,
serialNumber
})
}),
requireTemplateForIssuance
},
tx
);
@@ -302,7 +305,15 @@ export const certificateAuthorityServiceFactory = ({
* Update CA with id [caId].
* Note: Used to enable/disable CA
*/
const updateCaById = async ({ caId, status, actorId, actorAuthMethod, actor, actorOrgId }: TUpdateCaDTO) => {
const updateCaById = async ({
caId,
status,
requireTemplateForIssuance,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TUpdateCaDTO) => {
const ca = await certificateAuthorityDAL.findById(caId);
if (!ca) throw new BadRequestError({ message: "CA not found" });
@@ -319,7 +330,7 @@ export const certificateAuthorityServiceFactory = ({
ProjectPermissionSub.CertificateAuthorities
);
const updatedCa = await certificateAuthorityDAL.updateById(caId, { status });
const updatedCa = await certificateAuthorityDAL.updateById(caId, { status, requireTemplateForIssuance });
return updatedCa;
};
@@ -1077,6 +1088,9 @@ export const certificateAuthorityServiceFactory = ({
if (ca.status === CaStatus.DISABLED) throw new BadRequestError({ message: "CA is disabled" });
if (!ca.activeCaCertId) throw new BadRequestError({ message: "CA does not have a certificate installed" });
if (ca.requireTemplateForIssuance && !certificateTemplate) {
throw new BadRequestError({ message: "Certificate template is required for issuance" });
}
const caCert = await certificateAuthorityCertDAL.findById(ca.activeCaCertId);
if (ca.notAfter && new Date() > new Date(ca.notAfter)) {
@@ -1347,6 +1361,9 @@ export const certificateAuthorityServiceFactory = ({
if (ca.status === CaStatus.DISABLED) throw new BadRequestError({ message: "CA is disabled" });
if (!ca.activeCaCertId) throw new BadRequestError({ message: "CA does not have a certificate installed" });
if (ca.requireTemplateForIssuance && !certificateTemplate) {
throw new BadRequestError({ message: "Certificate template is required for issuance" });
}
const caCert = await certificateAuthorityCertDAL.findById(ca.activeCaCertId);
@@ -1568,6 +1585,40 @@ export const certificateAuthorityServiceFactory = ({
};
};
/**
* Return list of certificate templates for CA with id [caId].
*/
const getCaCertificateTemplates = async ({
caId,
actorId,
actorAuthMethod,
actor,
actorOrgId
}: TGetCaCertificateTemplatesDTO) => {
const ca = await certificateAuthorityDAL.findById(caId);
if (!ca) throw new BadRequestError({ message: "CA not found" });
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
ca.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Read,
ProjectPermissionSub.CertificateTemplates
);
const certificateTemplates = await certificateTemplateDAL.find({ caId });
return {
certificateTemplates,
ca
};
};
return {
createCa,
getCaById,
@@ -1580,6 +1631,7 @@ export const certificateAuthorityServiceFactory = ({
signIntermediate,
importCertToCa,
issueCertFromCa,
signCertFromCa
signCertFromCa,
getCaCertificateTemplates
};
};

View File

@@ -38,6 +38,7 @@ export type TCreateCaDTO = {
notAfter?: string;
maxPathLength: number;
keyAlgorithm: CertKeyAlgorithm;
requireTemplateForIssuance: boolean;
} & Omit<TProjectPermission, "projectId">;
export type TGetCaDTO = {
@@ -47,6 +48,7 @@ export type TGetCaDTO = {
export type TUpdateCaDTO = {
caId: string;
status?: CaStatus;
requireTemplateForIssuance?: boolean;
} & Omit<TProjectPermission, "projectId">;
export type TDeleteCaDTO = {
@@ -125,6 +127,10 @@ export type TSignCertFromCaDTO =
notAfter?: string;
} & Omit<TProjectPermission, "projectId">);
export type TGetCaCertificateTemplatesDTO = {
caId: string;
} & Omit<TProjectPermission, "projectId">;
export type TDNParts = {
commonName?: string;
organization?: string;

View File

@@ -7,7 +7,7 @@ const isValidDate = (dateString: string) => {
export const validateCaDateField = z.string().trim().refine(isValidDate, { message: "Invalid date format" });
export const hostnameRegex = /^(?!:\/\/)([a-zA-Z0-9-_]{1,63}\.?)+(?!:\/\/)([a-zA-Z]{2,63})$/;
export const hostnameRegex = /^(?!:\/\/)(\*\.)?([a-zA-Z0-9-_]{1,63}\.?)+(?!:\/\/)([a-zA-Z]{2,63})$/;
export const validateAltNamesField = z
.string()
.trim()
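The only change to hostnameRegex is the optional (\*\.)? prefix, which lets a single leading wildcard label pass validation while ordinary hostnames keep matching as before. A small check of that behaviour (regex copied from above):
const hostnameRegex = /^(?!:\/\/)(\*\.)?([a-zA-Z0-9-_]{1,63}\.?)+(?!:\/\/)([a-zA-Z]{2,63})$/;
console.log(hostnameRegex.test("app.example.com"));     // true, unchanged behaviour
console.log(hostnameRegex.test("*.example.com"));       // true, newly allowed wildcard
console.log(hostnameRegex.test("https://example.com")); // false, scheme-prefixed values are still rejected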

View File

@@ -95,6 +95,30 @@ export const groupProjectDALFactory = (db: TDbClient) => {
}
};
const findByUserId = async (userId: string, orgId: string, tx?: Knex) => {
try {
const docs = await (tx || db.replicaNode())(TableName.UserGroupMembership)
.where(`${TableName.UserGroupMembership}.userId`, userId)
.join(TableName.Groups, function () {
this.on(`${TableName.UserGroupMembership}.groupId`, "=", `${TableName.Groups}.id`).andOn(
`${TableName.Groups}.orgId`,
"=",
db.raw("?", [orgId])
);
})
.select(
db.ref("id").withSchema(TableName.Groups),
db.ref("name").withSchema(TableName.Groups),
db.ref("slug").withSchema(TableName.Groups),
db.ref("orgId").withSchema(TableName.Groups)
);
return docs;
} catch (error) {
throw new DatabaseError({ error, name: "FindByUserId" });
}
};
// The GroupProjectMembership table has a reference to the project (projectId) AND the group (groupId).
// We need to join the GroupProjectMembership table with the Groups table to get the group name and slug.
// We also need to join the GroupProjectMembershipRole table to get the role of the group in the project.
@@ -197,5 +221,5 @@ export const groupProjectDALFactory = (db: TDbClient) => {
return members;
};
return { ...groupProjectOrm, findByProjectId, findAllProjectGroupMembers };
return { ...groupProjectOrm, findByProjectId, findByUserId, findAllProjectGroupMembers };
};

View File

@@ -0,0 +1,4 @@
import picomatch from "picomatch";
export const doesFieldValueMatchOidcPolicy = (fieldValue: string, policyValue: string) =>
policyValue === fieldValue || picomatch.isMatch(fieldValue, policyValue);
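Illustrative calls (values are made up) showing that exact values still match while glob-style policies are now honoured through picomatch:
doesFieldValueMatchOidcPolicy("sts.amazonaws.com", "sts.amazonaws.com"); // true – exact equality
doesFieldValueMatchOidcPolicy("job-1234", "job-*");                      // true – glob match
doesFieldValueMatchOidcPolicy("job-1234", "build-*");                    // false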

View File

@@ -28,6 +28,7 @@ import { TIdentityAccessTokenDALFactory } from "../identity-access-token/identit
import { TIdentityAccessTokenJwtPayload } from "../identity-access-token/identity-access-token-types";
import { TOrgBotDALFactory } from "../org/org-bot-dal";
import { TIdentityOidcAuthDALFactory } from "./identity-oidc-auth-dal";
import { doesFieldValueMatchOidcPolicy } from "./identity-oidc-auth-fns";
import {
TAttachOidcAuthDTO,
TGetOidcAuthDTO,
@@ -123,7 +124,7 @@ export const identityOidcAuthServiceFactory = ({
}) as Record<string, string>;
if (identityOidcAuth.boundSubject) {
if (tokenData.sub !== identityOidcAuth.boundSubject) {
if (!doesFieldValueMatchOidcPolicy(tokenData.sub, identityOidcAuth.boundSubject)) {
throw new ForbiddenRequestError({
message: "Access denied: OIDC subject not allowed."
});
@@ -131,7 +132,11 @@ export const identityOidcAuthServiceFactory = ({
}
if (identityOidcAuth.boundAudiences) {
if (!identityOidcAuth.boundAudiences.split(", ").includes(tokenData.aud)) {
if (
!identityOidcAuth.boundAudiences
.split(", ")
.some((policyValue) => doesFieldValueMatchOidcPolicy(tokenData.aud, policyValue))
) {
throw new ForbiddenRequestError({
message: "Access denied: OIDC audience not allowed."
});
@@ -142,7 +147,9 @@ export const identityOidcAuthServiceFactory = ({
Object.keys(identityOidcAuth.boundClaims).forEach((claimKey) => {
const claimValue = (identityOidcAuth.boundClaims as Record<string, string>)[claimKey];
// handle both single and multi-valued claims
if (!claimValue.split(", ").some((claimEntry) => tokenData[claimKey] === claimEntry)) {
if (
!claimValue.split(", ").some((claimEntry) => doesFieldValueMatchOidcPolicy(tokenData[claimKey], claimEntry))
) {
throw new ForbiddenRequestError({
message: "Access denied: OIDC claim not allowed."
});

View File

@@ -567,8 +567,8 @@ const syncSecretsAWSParameterStore = async ({
});
ssm.config.update(config);
const metadata = z.record(z.any()).parse(integration.metadata || {});
const awsParameterStoreSecretsObj: Record<string, AWS.SSM.Parameter> = {};
const metadata = IntegrationMetadataSchema.parse(integration.metadata);
const awsParameterStoreSecretsObj: Record<string, AWS.SSM.Parameter & { KeyId?: string }> = {};
logger.info(
`getIntegrationSecrets: integration sync triggered for ssm with [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [shouldDisableDelete=${metadata.shouldDisableDelete}]`
);
@@ -598,18 +598,57 @@ const syncSecretsAWSParameterStore = async ({
nextToken = parameters.NextToken;
}
logger.info(
`getIntegrationSecrets: all fetched keys from AWS SSM [projectId=${projectId}] [environment=${
integration.environment.slug
}] [secretPath=${integration.secretPath}] [awsParameterStoreSecretsObj=${Object.keys(
awsParameterStoreSecretsObj
).join(",")}]`
);
logger.info(
`getIntegrationSecrets: all secrets from Infisical to send to AWS SSM [projectId=${projectId}] [environment=${
integration.environment.slug
}] [secretPath=${integration.secretPath}] [secrets=${Object.keys(secrets).join(",")}]`
);
let areParametersKmsKeysFetched = false;
if (metadata.kmsKeyId) {
// we put this inside a try catch so that existing integrations without the ssm:DescribeParameters
// AWS permission will not break
try {
let hasNextDescribePage = true;
let describeNextToken: string | undefined;
while (hasNextDescribePage) {
const parameters = await ssm
.describeParameters({
MaxResults: 10,
NextToken: describeNextToken,
ParameterFilters: [
{
Key: "Path",
Option: "OneLevel",
Values: [integration.path as string]
}
]
})
.promise();
if (parameters.Parameters) {
parameters.Parameters.forEach((parameter) => {
if (parameter.Name) {
const secKey = parameter.Name.substring((integration.path as string).length);
awsParameterStoreSecretsObj[secKey].KeyId = parameter.KeyId;
}
});
}
areParametersKmsKeysFetched = true;
hasNextDescribePage = Boolean(parameters.NextToken);
describeNextToken = parameters.NextToken;
}
} catch (error) {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
if ((error as any).code === "AccessDeniedException") {
logger.error(
`AWS Parameter Store Error [integration=${integration.id}]: double check AWS account permissions (refer to the Infisical docs)`
);
}
response = {
isSynced: false,
syncMessage: (error as AWSError)?.message || "Error syncing with AWS Parameter Store"
};
}
}
// Identify secrets to create
// don't use Promise.all() and promise map here
// it will cause rate limit
@@ -620,7 +659,7 @@ const syncSecretsAWSParameterStore = async ({
// -> create secret
if (secrets[key].value) {
logger.info(
`getIntegrationSecrets: create secret in AWS SSM for [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}]`
`getIntegrationSecrets: create secret in AWS SSM for [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}]`
);
await ssm
.putParameter({
@@ -648,7 +687,7 @@ const syncSecretsAWSParameterStore = async ({
} catch (err) {
logger.error(
err,
`getIntegrationSecrets: create secret in AWS SSM for failed [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}]`
`getIntegrationSecrets: create secret in AWS SSM for failed [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}]`
);
// eslint-disable-next-line @typescript-eslint/no-explicit-any
if ((err as any).code === "AccessDeniedException") {
@@ -667,16 +706,23 @@ const syncSecretsAWSParameterStore = async ({
// case: secret exists in AWS parameter store
} else {
logger.info(
`getIntegrationSecrets: update secret in AWS SSM for [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}]`
`getIntegrationSecrets: update secret in AWS SSM for [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}]`
);
// -> update secret
if (awsParameterStoreSecretsObj[key].Value !== secrets[key].value) {
const shouldUpdateKms =
areParametersKmsKeysFetched &&
Boolean(metadata.kmsKeyId) &&
awsParameterStoreSecretsObj[key].KeyId !== metadata.kmsKeyId;
// we ensure that the KMS key configured in the integration is applied for ALL parameters on AWS
if (shouldUpdateKms || awsParameterStoreSecretsObj[key].Value !== secrets[key].value) {
await ssm
.putParameter({
Name: `${integration.path}${key}`,
Type: "SecureString",
Value: secrets[key].value,
Overwrite: true
Overwrite: true,
...(metadata.kmsKeyId && { KeyId: metadata.kmsKeyId })
})
.promise();
}
@@ -698,7 +744,7 @@ const syncSecretsAWSParameterStore = async ({
} catch (err) {
logger.error(
err,
`getIntegrationSecrets: update secret in AWS SSM for failed [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}]`
`getIntegrationSecrets: update secret in AWS SSM for failed [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}]`
);
// eslint-disable-next-line @typescript-eslint/no-explicit-any
if ((err as any).code === "AccessDeniedException") {
@@ -728,11 +774,11 @@ const syncSecretsAWSParameterStore = async ({
for (const key in awsParameterStoreSecretsObj) {
if (Object.hasOwn(awsParameterStoreSecretsObj, key)) {
logger.info(
`getIntegrationSecrets: inside of shouldDisableDelete AWS SSM [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}] [step=2]`
`getIntegrationSecrets: inside of shouldDisableDelete AWS SSM [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [step=2]`
);
if (!(key in secrets)) {
logger.info(
`getIntegrationSecrets: inside of shouldDisableDelete AWS SSM [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}] [step=3]`
`getIntegrationSecrets: inside of shouldDisableDelete AWS SSM [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [step=3]`
);
// case:
// -> delete secret
@@ -742,7 +788,7 @@ const syncSecretsAWSParameterStore = async ({
})
.promise();
logger.info(
`getIntegrationSecrets: inside of shouldDisableDelete AWS SSM [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [key=${key}] [step=4]`
`getIntegrationSecrets: inside of shouldDisableDelete AWS SSM [projectId=${projectId}] [environment=${integration.environment.slug}] [secretPath=${integration.secretPath}] [step=4]`
);
}
await new Promise((resolve) => {

View File

@@ -2,7 +2,7 @@ import { ForbiddenError, subject } from "@casl/ability";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { BadRequestError } from "@app/lib/errors";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { TProjectPermission } from "@app/lib/types";
import { TIntegrationAuthDALFactory } from "../integration-auth/integration-auth-dal";
@@ -19,6 +19,7 @@ import { TIntegrationDALFactory } from "./integration-dal";
import {
TCreateIntegrationDTO,
TDeleteIntegrationDTO,
TGetIntegrationDTO,
TSyncIntegrationDTO,
TUpdateIntegrationDTO
} from "./integration-types";
@@ -180,6 +181,27 @@ export const integrationServiceFactory = ({
return updatedIntegration;
};
const getIntegration = async ({ id, actor, actorAuthMethod, actorId, actorOrgId }: TGetIntegrationDTO) => {
const integration = await integrationDAL.findById(id);
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
integration?.projectId || "",
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
if (!integration) {
throw new NotFoundError({
message: "Integration not found"
});
}
return { ...integration, envId: integration.environment.id };
};
const deleteIntegration = async ({
actorId,
id,
@@ -276,6 +298,8 @@ export const integrationServiceFactory = ({
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
await secretQueueService.syncIntegrations({
isManual: true,
actorId,
environment: integration.environment.slug,
secretPath: integration.secretPath,
projectId: integration.projectId
@@ -289,6 +313,7 @@ export const integrationServiceFactory = ({
updateIntegration,
deleteIntegration,
listIntegrationByProject,
getIntegration,
syncIntegration
};
};
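A minimal sketch of calling the new getIntegration method from a route handler; the service wiring and request shape are assumptions, not shown in this diff. Note that the returned object also surfaces envId, which the new integration details page relies on:
const integration = await server.services.integration.getIntegration({
  id: req.params.integrationId,
  actor: req.permission.type,
  actorId: req.permission.id,
  actorAuthMethod: req.permission.authMethod,
  actorOrgId: req.permission.orgId
});
// integration.envId === integration.environment.id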

View File

@@ -39,6 +39,10 @@ export type TCreateIntegrationDTO = {
};
} & Omit<TProjectPermission, "projectId">;
export type TGetIntegrationDTO = {
id: string;
} & Omit<TProjectPermission, "projectId">;
export type TUpdateIntegrationDTO = {
id: string;
app?: string;

View File

@@ -208,6 +208,23 @@ export const kmsServiceFactory = ({
return org.kmsDefaultKeyId;
};
const encryptWithRootKey = async () => {
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
return ({ plainText }: { plainText: Buffer }) => {
const encryptedPlainTextBlob = cipher.encrypt(plainText, ROOT_ENCRYPTION_KEY);
return Promise.resolve({ cipherTextBlob: encryptedPlainTextBlob });
};
};
const decryptWithRootKey = async () => {
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
return ({ cipherTextBlob }: { cipherTextBlob: Buffer }) => {
const decryptedBlob = cipher.decrypt(cipherTextBlob, ROOT_ENCRYPTION_KEY);
return Promise.resolve(decryptedBlob);
};
};
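A hedged round-trip sketch of the new root-key helpers: callers await the factory once to obtain an encrypt/decrypt function, then pass buffers through it:
const encryptor = await kmsService.encryptWithRootKey();
const decryptor = await kmsService.decryptWithRootKey();
const { cipherTextBlob } = await encryptor({ plainText: Buffer.from("bot-access-token") });
const plainText = await decryptor({ cipherTextBlob }); // Buffer containing "bot-access-token"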
const decryptWithKmsKey = async ({
kmsId,
depth = 0
@@ -808,6 +825,8 @@ export const kmsServiceFactory = ({
decryptWithKmsKey,
encryptWithInputKey,
decryptWithInputKey,
encryptWithRootKey,
decryptWithRootKey,
getOrgKmsKeyId,
getProjectSecretManagerKmsKeyId,
updateProjectSecretManagerKmsKey,

View File

@@ -153,7 +153,6 @@ export const orgAdminServiceFactory = ({
members: [
{
orgMembershipId: membership.id,
projectMembershipRole: ProjectMembershipRole.Admin,
userPublicKey: userEncryptionKey.publicKey
}
]

View File

@@ -9,7 +9,10 @@ import {
OrgMembershipStatus,
ProjectMembershipRole,
ProjectVersion,
SecretKeyEncoding,
TableName,
TProjectMemberships,
TProjectUserMembershipRolesInsert,
TUsers
} from "@app/db/schemas";
import { TProjects } from "@app/db/schemas/projects";
@@ -18,13 +21,15 @@ import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-grou
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { TProjectUserAdditionalPrivilegeDALFactory } from "@app/ee/services/project-user-additional-privilege/project-user-additional-privilege-dal";
import { TSamlConfigDALFactory } from "@app/ee/services/saml-config/saml-config-dal";
import { getConfig } from "@app/lib/config/env";
import { generateAsymmetricKeyPair } from "@app/lib/crypto";
import { generateSymmetricKey, infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { generateSymmetricKey, infisicalSymmetricDecrypt, infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { generateUserSrpKeys } from "@app/lib/crypto/srp";
import { BadRequestError, NotFoundError, UnauthorizedError } from "@app/lib/errors";
import { groupBy } from "@app/lib/fn";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { isDisposableEmail } from "@app/lib/validator";
import { TOrgMembershipDALFactory } from "@app/services/org-membership/org-membership-dal";
@@ -32,14 +37,14 @@ import { TUserAliasDALFactory } from "@app/services/user-alias/user-alias-dal";
import { ActorAuthMethod, ActorType, AuthMethod, AuthTokenType } from "../auth/auth-type";
import { TAuthTokenServiceFactory } from "../auth-token/auth-token-service";
import { TokenMetadataType, TokenType, TTokenMetadata } from "../auth-token/auth-token-types";
import { TokenType } from "../auth-token/auth-token-types";
import { TProjectDALFactory } from "../project/project-dal";
import { verifyProjectVersions } from "../project/project-fns";
import { assignWorkspaceKeysToMembers } from "../project/project-fns";
import { TProjectBotDALFactory } from "../project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "../project-key/project-key-dal";
import { TProjectMembershipDALFactory } from "../project-membership/project-membership-dal";
import { addMembersToProject } from "../project-membership/project-membership-fns";
import { TProjectUserMembershipRoleDALFactory } from "../project-membership/project-user-membership-role-dal";
import { TProjectRoleDALFactory } from "../project-role/project-role-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { TUserDALFactory } from "../user/user-dal";
import { TIncidentContactsDALFactory } from "./incident-contacts-dal";
@@ -84,6 +89,7 @@ type TOrgServiceFactoryDep = {
"getPlan" | "updateSubscriptionOrgMemberCount" | "generateOrgCustomerId" | "removeOrgCustomer"
>;
projectUserAdditionalPrivilegeDAL: Pick<TProjectUserAdditionalPrivilegeDALFactory, "delete">;
projectRoleDAL: Pick<TProjectRoleDALFactory, "find">;
userGroupMembershipDAL: Pick<TUserGroupMembershipDALFactory, "findUserGroupMembershipsInProject">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
projectUserMembershipRoleDAL: Pick<TProjectUserMembershipRoleDALFactory, "insertMany">;
@@ -108,6 +114,7 @@ export const orgServiceFactory = ({
tokenService,
orgBotDAL,
licenseService,
projectRoleDAL,
samlConfigDAL,
userGroupMembershipDAL,
projectBotDAL,
@@ -440,17 +447,17 @@ export const orgServiceFactory = ({
*/
const inviteUserToOrganization = async ({
orgId,
userId,
actorId,
actor,
inviteeEmails,
organizationRoleSlug,
projectRoleSlug,
projectIds,
projects: invitedProjects,
actorAuthMethod,
actorOrgId
}: TInviteUserToOrgDTO) => {
const appCfg = getConfig();
const { permission } = await permissionService.getUserOrgPermission(userId, orgId, actorAuthMethod, actorOrgId);
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Member);
const org = await orgDAL.findOrgById(orgId);
@@ -461,6 +468,13 @@ export const orgServiceFactory = ({
});
}
const isEmailInvalid = await isDisposableEmail(inviteeEmails);
if (isEmailInvalid) {
throw new BadRequestError({
message: "Provided a disposable email",
name: "Org invite"
});
}
const plan = await licenseService.getPlan(orgId);
if (plan?.memberLimit && plan.membersUsed >= plan.memberLimit) {
// limit imposed on number of members allowed / number of members used exceeds the number of members allowed
@@ -475,205 +489,331 @@ export const orgServiceFactory = ({
message: "Failed to invite member due to member limit reached. Upgrade plan to invite more members."
});
}
if (projectIds?.length) {
const projects = await projectDAL.find({
orgId,
$in: {
id: projectIds
}
});
// if it's not v3, throw an error
if (!verifyProjectVersions(projects, ProjectVersion.V3)) {
const isCustomOrgRole = !Object.values(OrgMembershipRole).includes(organizationRoleSlug as OrgMembershipRole);
if (isCustomOrgRole) {
if (!plan?.rbac)
throw new BadRequestError({
message: "One or more selected projects are not compatible with this operation. Please upgrade your projects."
message: "Failed to assign custom role due to RBAC restriction. Upgrade plan to assign custom role to member."
});
}
}
const inviteeUsers = await orgDAL.transaction(async (tx) => {
const users: Pick<
TUsers & { orgId: string },
"id" | "firstName" | "lastName" | "email" | "orgId" | "username"
>[] = [];
const projectsToInvite = invitedProjects?.length
? await projectDAL.find({
orgId,
$in: {
id: invitedProjects?.map(({ id }) => id)
}
})
: [];
if (projectsToInvite.length !== invitedProjects?.length) {
throw new UnauthorizedError({
message: "One or more project doesn't have access to"
});
}
if (projectsToInvite.some((el) => el.version !== ProjectVersion.V3)) {
throw new BadRequestError({
message: "One or more selected projects are not compatible with this operation. Please upgrade your projects."
});
}
const mailsForOrgInvitation: { email: string; userId: string; firstName: string; lastName: string }[] = [];
const mailsForProjectInvitaion: { email: string[]; projectName: string }[] = [];
const newProjectMemberships: TProjectMemberships[] = [];
await orgDAL.transaction(async (tx) => {
const users: Pick<TUsers, "id" | "firstName" | "lastName" | "email" | "username">[] = [];
for await (const inviteeEmail of inviteeEmails) {
const inviteeUser = await userDAL.findUserByUsername(inviteeEmail, tx);
let inviteeUser = await userDAL.findUserByUsername(inviteeEmail, tx);
if (inviteeUser) {
// if the user already exists it means they are already part of Infisical
// Thus the signup flow is not needed anymore
const [inviteeMembership] = await orgDAL.findMembership(
// if the user doesn't exist we create the user with the email
if (!inviteeUser) {
inviteeUser = await userDAL.create(
{
[`${TableName.OrgMembership}.orgId` as "orgId"]: orgId,
[`${TableName.OrgMembership}.userId` as "userId"]: inviteeUser.id
isAccepted: false,
email: inviteeEmail,
username: inviteeEmail,
authMethods: [AuthMethod.EMAIL],
isGhost: false
},
{ tx }
tx
);
if (inviteeMembership && inviteeMembership.status === OrgMembershipStatus.Accepted) {
throw new BadRequestError({
message: `Failed to invite members because ${inviteeEmail} is already part of the organization`,
name: "Invite user to org"
});
}
if (!inviteeMembership) {
await orgDAL.createMembership(
{
userId: inviteeUser.id,
inviteEmail: inviteeEmail,
orgId,
role: OrgMembershipRole.Member,
status: OrgMembershipStatus.Invited,
isActive: true
},
tx
);
if (projectIds?.length) {
if (
organizationRoleSlug === OrgMembershipRole.Custom ||
projectRoleSlug === ProjectMembershipRole.Custom
) {
throw new BadRequestError({
message: "Custom roles are not supported for inviting users to projects and organizations"
});
}
if (!projectRoleSlug) {
throw new BadRequestError({
message: "Selecting a project role is required to invite users to projects"
});
}
await projectMembershipDAL.insertMany(
projectIds.map((id) => ({ projectId: id, userId: inviteeUser.id })),
tx
);
for await (const projectId of projectIds) {
await addMembersToProject({
orgDAL,
projectDAL,
projectMembershipDAL,
projectKeyDAL,
userGroupMembershipDAL,
projectBotDAL,
projectUserMembershipRoleDAL,
smtpService
}).addMembersToNonE2EEProject(
{
emails: [inviteeEmail],
usernames: [],
projectId,
projectMembershipRole: projectRoleSlug,
sendEmails: false
},
{
tx
}
);
}
}
}
return [{ ...inviteeUser, orgId }];
}
const isEmailInvalid = await isDisposableEmail(inviteeEmail);
if (isEmailInvalid) {
throw new BadRequestError({
message: "Provided a disposable email",
name: "Org invite"
const inviteeUserId = inviteeUser?.id;
const existingEncrytionKey = await userDAL.findUserEncKeyByUserId(inviteeUserId, tx);
// when the user is missing the encryption keys
// this could happen either if the user doesn't exist or hasn't finished step 3 of generating the SRP encryption keys
// so we generate a random secure password and use it to create the user's encryption key pair
// then when the user signs in (as login is not possible while isAccepted is false) we re-encrypt the private key with the user's password
if (!inviteeUser || (inviteeUser && !inviteeUser?.isAccepted && !existingEncrytionKey)) {
const serverGeneratedPassword = crypto.randomBytes(32).toString("hex");
const { tag, encoding, ciphertext, iv } = infisicalSymmetricEncypt(serverGeneratedPassword);
const encKeys = await generateUserSrpKeys(inviteeEmail, serverGeneratedPassword);
await userDAL.createUserEncryption(
{
userId: inviteeUserId,
encryptionVersion: 2,
protectedKey: encKeys.protectedKey,
protectedKeyIV: encKeys.protectedKeyIV,
protectedKeyTag: encKeys.protectedKeyTag,
publicKey: encKeys.publicKey,
encryptedPrivateKey: encKeys.encryptedPrivateKey,
iv: encKeys.encryptedPrivateKeyIV,
tag: encKeys.encryptedPrivateKeyTag,
salt: encKeys.salt,
verifier: encKeys.verifier,
serverEncryptedPrivateKeyEncoding: encoding,
serverEncryptedPrivateKeyTag: tag,
serverEncryptedPrivateKeyIV: iv,
serverEncryptedPrivateKey: ciphertext
},
tx
);
}
const [inviteeMembership] = await orgDAL.findMembership(
{
[`${TableName.OrgMembership}.orgId` as "orgId"]: orgId,
[`${TableName.OrgMembership}.userId` as "userId"]: inviteeUserId
},
{ tx }
);
// if no org membership exists yet we set it as given by the request
if (!inviteeMembership) {
let roleId;
const orgRole = isCustomOrgRole ? OrgMembershipRole.Custom : organizationRoleSlug;
if (isCustomOrgRole) {
const customRole = await orgRoleDAL.findOne({ slug: organizationRoleSlug, orgId });
if (!customRole)
throw new BadRequestError({ name: "Invite membership", message: "Organization role not found" });
roleId = customRole.id;
}
await orgDAL.createMembership(
{
userId: inviteeUser.id,
inviteEmail: inviteeEmail,
orgId,
role: orgRole,
status: OrgMembershipStatus.Invited,
isActive: true,
roleId
},
tx
);
mailsForOrgInvitation.push({
email: inviteeEmail,
userId: inviteeUser.id,
firstName: inviteeUser?.firstName || "",
lastName: inviteeUser.lastName || ""
});
}
// not invited before
const user = await userDAL.create(
{
username: inviteeEmail,
email: inviteeEmail,
isAccepted: false,
authMethods: [AuthMethod.EMAIL],
isGhost: false
},
tx
);
await orgDAL.createMembership(
{
inviteEmail: inviteeEmail,
orgId,
userId: user.id,
role: organizationRoleSlug,
status: OrgMembershipStatus.Invited,
isActive: true
},
tx
);
users.push({
...user,
orgId
users.push(inviteeUser);
}
const userIds = users.map(({ id }) => id);
const usernames = users.map((el) => el.username);
const userEncryptionKeys = await userDAL.findUserEncKeyByUserIdsBatch({ userIds }, tx);
// we don't want to spam users with emails; anyone receiving an org invitation doesn't need a separate project invitation email
const userIdsWithOrgInvitation = new Set(mailsForOrgInvitation.map((el) => el.userId));
// if no project membership exists yet we set it as given by the request
for await (const project of projectsToInvite) {
const projectId = project.id;
const { permission: projectPermission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(projectPermission).throwUnlessCan(
ProjectPermissionActions.Create,
ProjectPermissionSub.Member
);
const existingMembers = await projectMembershipDAL.find(
{
projectId: project.id,
$in: { userId: userIds }
},
{ tx }
);
const existingMembersGroupByUserId = groupBy(existingMembers, (i) => i.userId);
const userIdsToExcludeAsPartOfGroup = new Set(
await userGroupMembershipDAL.findUserGroupMembershipsInProject(usernames, projectId, tx)
);
const userWithEncryptionKeyInvitedToProject = userEncryptionKeys.filter(
(user) => !existingMembersGroupByUserId?.[user.userId] && !userIdsToExcludeAsPartOfGroup.has(user.userId)
);
// eslint-disable-next-line no-continue
if (!userWithEncryptionKeyInvitedToProject.length) continue;
// validate custom project role
const invitedProjectRoles = invitedProjects.find((el) => el.id === project.id)?.projectRoleSlug || [
ProjectMembershipRole.Member
];
const customProjectRoles = invitedProjectRoles.filter(
(role) => !Object.values(ProjectMembershipRole).includes(role as ProjectMembershipRole)
);
const hasCustomRole = Boolean(customProjectRoles.length);
if (hasCustomRole) {
if (!plan?.rbac)
throw new BadRequestError({
message:
"Failed to assign custom role due to RBAC restriction. Upgrade plan to assign custom role to member."
});
}
const customRoles = hasCustomRole
? await projectRoleDAL.find({
projectId,
$in: { slug: customProjectRoles.map((role) => role) }
})
: [];
if (customRoles.length !== customProjectRoles.length)
throw new BadRequestError({ message: "Custom role not found" });
const customRolesGroupBySlug = groupBy(customRoles, ({ slug }) => slug);
const ghostUser = await projectDAL.findProjectGhostUser(projectId, tx);
if (!ghostUser) {
throw new BadRequestError({
message: "Failed to find sudo user"
});
}
const ghostUserLatestKey = await projectKeyDAL.findLatestProjectKey(ghostUser.id, projectId, tx);
if (!ghostUserLatestKey) {
throw new BadRequestError({
message: "Failed to find sudo user latest key"
});
}
const bot = await projectBotDAL.findOne({ projectId }, tx);
if (!bot) {
throw new BadRequestError({
message: "Failed to find bot"
});
}
const botPrivateKey = infisicalSymmetricDecrypt({
keyEncoding: bot.keyEncoding as SecretKeyEncoding,
iv: bot.iv,
tag: bot.tag,
ciphertext: bot.encryptedPrivateKey
});
const newWsMembers = assignWorkspaceKeysToMembers({
decryptKey: ghostUserLatestKey,
userPrivateKey: botPrivateKey,
members: userWithEncryptionKeyInvitedToProject.map((userEnc) => ({
orgMembershipId: userEnc.userId,
projectMembershipRole: ProjectMembershipRole.Admin,
userPublicKey: userEnc.publicKey
}))
});
const projectMemberships = await projectMembershipDAL.insertMany(
userWithEncryptionKeyInvitedToProject.map((userEnc) => ({
projectId,
userId: userEnc.userId
})),
tx
);
newProjectMemberships.push(...projectMemberships);
const sanitizedProjectMembershipRoles: TProjectUserMembershipRolesInsert[] = [];
invitedProjectRoles.forEach((projectRole) => {
const isCustomRole = Boolean(customRolesGroupBySlug?.[projectRole]?.[0]);
projectMemberships.forEach((membership) => {
sanitizedProjectMembershipRoles.push({
projectMembershipId: membership.id,
role: isCustomRole ? ProjectMembershipRole.Custom : projectRole,
customRoleId: customRolesGroupBySlug[projectRole] ? customRolesGroupBySlug[projectRole][0].id : null
});
});
});
await projectUserMembershipRoleDAL.insertMany(sanitizedProjectMembershipRoles, tx);
await projectKeyDAL.insertMany(
newWsMembers.map((el) => ({
encryptedKey: el.workspaceEncryptedKey,
nonce: el.workspaceEncryptedNonce,
senderId: ghostUser.id,
receiverId: el.orgMembershipId,
projectId
})),
tx
);
mailsForProjectInvitaion.push({
email: userWithEncryptionKeyInvitedToProject
.filter((el) => !userIdsWithOrgInvitation.has(el.userId))
.map((el) => el.email || el.username),
projectName: project.name
});
}
return users;
});
const user = await userDAL.findById(userId);
await licenseService.updateSubscriptionOrgMemberCount(orgId);
const signupTokens: { email: string; link: string }[] = [];
if (inviteeUsers) {
for await (const invitee of inviteeUsers) {
// send org invite mail
await Promise.allSettled(
mailsForOrgInvitation.map(async (el) => {
const token = await tokenService.createTokenForUser({
type: TokenType.TOKEN_EMAIL_ORG_INVITATION,
userId: invitee.id,
userId: el.userId,
orgId
});
let inviteMetadata: string = "";
if (projectIds && projectIds?.length > 0) {
inviteMetadata = jwt.sign(
{
type: TokenMetadataType.InviteToProjects,
payload: {
projectIds,
projectRoleSlug: projectRoleSlug!, // Implicitly checked inside transaction if projectRoleSlug is undefined
userId: invitee.id,
orgId
}
} satisfies TTokenMetadata,
appCfg.AUTH_SECRET,
{
expiresIn: appCfg.JWT_INVITE_LIFETIME
}
);
}
signupTokens.push({
email: invitee.email || invitee.username,
link: `${appCfg.SITE_URL}/signupinvite?token=${token}${
inviteMetadata ? `&metadata=${inviteMetadata}` : ""
}&to=${invitee.email || invitee.username}&organization_id=${org?.id}`
email: el.email,
link: `${appCfg.SITE_URL}/signupinvite?token=${token}&to=${el.email}&organization_id=${org?.id}`
});
await smtpService.sendMail({
return smtpService.sendMail({
template: SmtpTemplates.OrgInvite,
subjectLine: "Infisical organization invitation",
recipients: [invitee.email || invitee.username],
recipients: [el.email],
substitutions: {
metadata: inviteMetadata,
inviterFirstName: user.firstName,
inviterUsername: user.username,
inviterFirstName: el.firstName,
inviterUsername: el.email,
organizationName: org?.name,
email: invitee.email || invitee.username,
email: el.email,
organizationId: org?.id.toString(),
token,
callback_url: `${appCfg.SITE_URL}/signupinvite`
}
});
}
}
await licenseService.updateSubscriptionOrgMemberCount(orgId);
})
);
await Promise.allSettled(
mailsForProjectInvitaion
.filter((el) => Boolean(el.email.length))
.map(async (el) => {
return smtpService.sendMail({
template: SmtpTemplates.WorkspaceInvite,
subjectLine: "Infisical project invitation",
recipients: el.email,
substitutions: {
workspaceName: el.projectName,
callback_url: `${appCfg.SITE_URL}/login`
}
});
})
);
if (!appCfg.isSmtpConfigured) {
return signupTokens;
return { signupTokens, projectMemberships: newProjectMemberships };
}
return { signupTokens: undefined, projectMemberships: newProjectMemberships };
};
/**

View File

@@ -1,4 +1,3 @@
import { OrgMembershipRole, ProjectMembershipRole } from "@app/db/schemas";
import { TOrgPermission } from "@app/lib/types";
import { ActorAuthMethod, ActorType } from "../auth/auth-type";
@@ -26,14 +25,17 @@ export type TDeleteOrgMembershipDTO = {
};
export type TInviteUserToOrgDTO = {
userId: string;
actorId: string;
actor: ActorType;
orgId: string;
actorOrgId: string | undefined;
actorAuthMethod: ActorAuthMethod;
inviteeEmails: string[];
organizationRoleSlug: OrgMembershipRole;
projectIds?: string[];
projectRoleSlug?: ProjectMembershipRole;
organizationRoleSlug: string;
projects?: {
id: string;
projectRoleSlug?: string[];
}[];
};
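An illustrative payload for the reshaped DTO; ids and slugs are placeholders, and the actor* fields follow the usual permission plumbing:
const invite = {
  inviteeEmails: ["dev-one@example.com", "dev-two@example.com"],
  organizationRoleSlug: "member",
  projects: [{ id: "project_123", projectRoleSlug: ["viewer", "custom-deployer"] }]
} satisfies Partial<TInviteUserToOrgDTO>;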
export type TVerifyUserToOrgDTO = {

View File

@@ -1,190 +0,0 @@
import { Knex } from "knex";
import { ProjectMembershipRole, SecretKeyEncoding, TProjectMemberships } from "@app/db/schemas";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
import { getConfig } from "@app/lib/config/env";
import { infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { BadRequestError } from "@app/lib/errors";
import { groupBy } from "@app/lib/fn";
import { TOrgDALFactory } from "../org/org-dal";
import { TProjectDALFactory } from "../project/project-dal";
import { assignWorkspaceKeysToMembers } from "../project/project-fns";
import { TProjectBotDALFactory } from "../project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "../project-key/project-key-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { TProjectMembershipDALFactory } from "./project-membership-dal";
import { TProjectUserMembershipRoleDALFactory } from "./project-user-membership-role-dal";
type TAddMembersToProjectArg = {
orgDAL: Pick<TOrgDALFactory, "findMembership" | "findOrgMembersByUsername">;
projectMembershipDAL: Pick<TProjectMembershipDALFactory, "find" | "transaction" | "insertMany">;
projectDAL: Pick<TProjectDALFactory, "findProjectById" | "findProjectGhostUser">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "findLatestProjectKey" | "insertMany">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
userGroupMembershipDAL: Pick<TUserGroupMembershipDALFactory, "findUserGroupMembershipsInProject">;
projectUserMembershipRoleDAL: Pick<TProjectUserMembershipRoleDALFactory, "insertMany">;
smtpService: Pick<TSmtpService, "sendMail">;
};
type AddMembersToNonE2EEProjectDTO = {
emails: string[];
usernames: string[];
projectId: string;
projectMembershipRole: ProjectMembershipRole;
sendEmails?: boolean;
};
type AddMembersToNonE2EEProjectOptions = {
tx?: Knex;
throwOnProjectNotFound?: boolean;
};
export const addMembersToProject = ({
orgDAL,
projectDAL,
projectMembershipDAL,
projectKeyDAL,
projectBotDAL,
userGroupMembershipDAL,
projectUserMembershipRoleDAL,
smtpService
}: TAddMembersToProjectArg) => {
// Can create multiple memberships for a singular project, based on user email / username
const addMembersToNonE2EEProject = async (
{ emails, usernames, projectId, projectMembershipRole, sendEmails }: AddMembersToNonE2EEProjectDTO,
options: AddMembersToNonE2EEProjectOptions = { throwOnProjectNotFound: true }
) => {
const processTransaction = async (tx: Knex) => {
const usernamesAndEmails = [...emails, ...usernames];
const project = await projectDAL.findProjectById(projectId);
if (!project) {
if (options.throwOnProjectNotFound) {
throw new BadRequestError({ message: "Project not found when attempting to add user to project" });
}
return [];
}
const orgMembers = await orgDAL.findOrgMembersByUsername(
project.orgId,
[...new Set(usernamesAndEmails.map((element) => element.toLowerCase()))],
tx
);
if (orgMembers.length !== usernamesAndEmails.length)
throw new BadRequestError({ message: "Some users are not part of org" });
if (!orgMembers.length) return [];
const existingMembers = await projectMembershipDAL.find({
projectId,
$in: { userId: orgMembers.map(({ user }) => user.id).filter(Boolean) }
});
if (existingMembers.length) throw new BadRequestError({ message: "Some users are already part of project" });
const ghostUser = await projectDAL.findProjectGhostUser(projectId);
if (!ghostUser) {
throw new BadRequestError({
message: "Failed to find sudo user"
});
}
const ghostUserLatestKey = await projectKeyDAL.findLatestProjectKey(ghostUser.id, projectId);
if (!ghostUserLatestKey) {
throw new BadRequestError({
message: "Failed to find sudo user latest key"
});
}
const bot = await projectBotDAL.findOne({ projectId });
if (!bot) {
throw new BadRequestError({
message: "Failed to find bot"
});
}
const botPrivateKey = infisicalSymmetricDecrypt({
keyEncoding: bot.keyEncoding as SecretKeyEncoding,
iv: bot.iv,
tag: bot.tag,
ciphertext: bot.encryptedPrivateKey
});
const newWsMembers = assignWorkspaceKeysToMembers({
decryptKey: ghostUserLatestKey,
userPrivateKey: botPrivateKey,
members: orgMembers.map((membership) => ({
orgMembershipId: membership.id,
projectMembershipRole,
userPublicKey: membership.user.publicKey
}))
});
const members: TProjectMemberships[] = [];
const userIdsToExcludeForProjectKeyAddition = new Set(
await userGroupMembershipDAL.findUserGroupMembershipsInProject(usernamesAndEmails, projectId)
);
const projectMemberships = await projectMembershipDAL.insertMany(
orgMembers.map(({ user }) => ({
projectId,
userId: user.id
})),
tx
);
await projectUserMembershipRoleDAL.insertMany(
projectMemberships.map(({ id }) => ({ projectMembershipId: id, role: projectMembershipRole })),
tx
);
members.push(...projectMemberships);
const encKeyGroupByOrgMembId = groupBy(newWsMembers, (i) => i.orgMembershipId);
await projectKeyDAL.insertMany(
orgMembers
.filter(({ user }) => !userIdsToExcludeForProjectKeyAddition.has(user.id))
.map(({ user, id }) => ({
encryptedKey: encKeyGroupByOrgMembId[id][0].workspaceEncryptedKey,
nonce: encKeyGroupByOrgMembId[id][0].workspaceEncryptedNonce,
senderId: ghostUser.id,
receiverId: user.id,
projectId
})),
tx
);
if (sendEmails) {
const recipients = orgMembers.filter((i) => i.user.email).map((i) => i.user.email as string);
const appCfg = getConfig();
if (recipients.length) {
await smtpService.sendMail({
template: SmtpTemplates.WorkspaceInvite,
subjectLine: "Infisical project invitation",
recipients: orgMembers.filter((i) => i.user.email).map((i) => i.user.email as string),
substitutions: {
workspaceName: project.name,
callback_url: `${appCfg.SITE_URL}/login`
}
});
}
}
return members;
};
if (options.tx) {
return processTransaction(options.tx);
}
return projectMembershipDAL.transaction(processTransaction);
};
return {
addMembersToNonE2EEProject
};
};

View File

@@ -22,11 +22,9 @@ import { TProjectRoleDALFactory } from "../project-role/project-role-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { TUserDALFactory } from "../user/user-dal";
import { TProjectMembershipDALFactory } from "./project-membership-dal";
import { addMembersToProject } from "./project-membership-fns";
import {
ProjectUserMembershipTemporaryMode,
TAddUsersToWorkspaceDTO,
TAddUsersToWorkspaceNonE2EEDTO,
TDeleteProjectMembershipOldDTO,
TDeleteProjectMembershipsDTO,
TGetProjectMembershipByUsernameDTO,
@@ -44,7 +42,7 @@ type TProjectMembershipServiceFactoryDep = {
projectUserMembershipRoleDAL: Pick<TProjectUserMembershipRoleDALFactory, "insertMany" | "find" | "delete">;
userDAL: Pick<TUserDALFactory, "findById" | "findOne" | "findUserByProjectMembershipId" | "find">;
userGroupMembershipDAL: TUserGroupMembershipDALFactory;
projectRoleDAL: Pick<TProjectRoleDALFactory, "find">;
projectRoleDAL: Pick<TProjectRoleDALFactory, "find" | "findOne">;
orgDAL: Pick<TOrgDALFactory, "findMembership" | "findOrgMembersByUsername">;
projectDAL: Pick<TProjectDALFactory, "findById" | "findProjectGhostUser" | "transaction" | "findProjectById">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "findLatestProjectKey" | "delete" | "insertMany">;
@@ -61,7 +59,6 @@ export const projectMembershipServiceFactory = ({
projectUserMembershipRoleDAL,
smtpService,
projectRoleDAL,
projectBotDAL,
orgDAL,
projectUserAdditionalPrivilegeDAL,
userDAL,
@@ -214,52 +211,6 @@ export const projectMembershipServiceFactory = ({
return orgMembers;
};
const addUsersToProjectNonE2EE = async ({
projectId,
actorId,
actorAuthMethod,
actor,
actorOrgId,
emails,
usernames,
sendEmails = true
}: TAddUsersToWorkspaceNonE2EEDTO) => {
const project = await projectDAL.findById(projectId);
if (!project) throw new BadRequestError({ message: "Project not found" });
if (project.version === ProjectVersion.V1) {
throw new BadRequestError({ message: "Please upgrade your project on your dashboard" });
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Create, ProjectPermissionSub.Member);
const members = await addMembersToProject({
orgDAL,
projectDAL,
projectMembershipDAL,
projectKeyDAL,
userGroupMembershipDAL,
projectBotDAL,
projectUserMembershipRoleDAL,
smtpService
}).addMembersToNonE2EEProject({
emails,
usernames,
projectId,
projectMembershipRole: ProjectMembershipRole.Member,
sendEmails
});
return members;
};
const updateProjectMembership = async ({
actorId,
actor,
@@ -530,7 +481,6 @@ export const projectMembershipServiceFactory = ({
getProjectMemberships,
getProjectMembershipByUsername,
updateProjectMembership,
addUsersToProjectNonE2EE,
deleteProjectMemberships,
deleteProjectMembership, // TODO: Remove this
addUsersToProject,

View File

@@ -53,4 +53,5 @@ export type TAddUsersToWorkspaceNonE2EEDTO = {
sendEmails?: boolean;
emails: string[];
usernames: string[];
roleSlugs?: string[];
} & TProjectPermission;

View File

@@ -16,7 +16,7 @@ export const assignWorkspaceKeysToMembers = ({ members, decryptKey, userPrivateK
privateKey: userPrivateKey
});
const newWsMembers = members.map(({ orgMembershipId, userPublicKey, projectMembershipRole }) => {
const newWsMembers = members.map(({ orgMembershipId, userPublicKey }) => {
const { ciphertext: inviteeCipherText, nonce: inviteeNonce } = encryptAsymmetric(
plaintextProjectKey,
userPublicKey,
@@ -25,7 +25,6 @@ export const assignWorkspaceKeysToMembers = ({ members, decryptKey, userPrivateK
return {
orgMembershipId,
projectRole: projectMembershipRole,
workspaceEncryptedKey: inviteeCipherText,
workspaceEncryptedNonce: inviteeNonce
};

View File

@@ -300,8 +300,7 @@ export const projectQueueFactory = ({
members: [
{
userPublicKey: user.publicKey,
orgMembershipId: orgMembership.id,
projectMembershipRole: ProjectMembershipRole.Admin
orgMembershipId: orgMembership.id
}
]
});

View File

@@ -34,6 +34,8 @@ import { TProjectUserMembershipRoleDALFactory } from "../project-membership/proj
import { TProjectRoleDALFactory } from "../project-role/project-role-dal";
import { getPredefinedRoles } from "../project-role/project-role-fns";
import { ROOT_FOLDER_NAME, TSecretFolderDALFactory } from "../secret-folder/secret-folder-dal";
import { TProjectSlackConfigDALFactory } from "../slack/project-slack-config-dal";
import { TSlackIntegrationDALFactory } from "../slack/slack-integration-dal";
import { TUserDALFactory } from "../user/user-dal";
import { TProjectDALFactory } from "./project-dal";
import { assignWorkspaceKeysToMembers, createProjectKey } from "./project-fns";
@@ -43,6 +45,7 @@ import {
TDeleteProjectDTO,
TGetProjectDTO,
TGetProjectKmsKey,
TGetProjectSlackConfig,
TListProjectAlertsDTO,
TListProjectCasDTO,
TListProjectCertificateTemplatesDTO,
@@ -54,6 +57,7 @@ import {
TUpdateProjectDTO,
TUpdateProjectKmsDTO,
TUpdateProjectNameDTO,
TUpdateProjectSlackConfig,
TUpdateProjectVersionLimitDTO,
TUpgradeProjectDTO
} from "./project-types";
@@ -76,6 +80,8 @@ type TProjectServiceFactoryDep = {
identityProjectMembershipRoleDAL: Pick<TIdentityProjectMembershipRoleDALFactory, "create">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "create" | "findLatestProjectKey" | "delete" | "find" | "insertMany">;
projectMembershipDAL: Pick<TProjectMembershipDALFactory, "create" | "findProjectGhostUser" | "findOne">;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "findOne" | "transaction" | "updateById" | "create">;
slackIntegrationDAL: Pick<TSlackIntegrationDALFactory, "findById" | "findByIdWithWorkflowIntegrationDetails">;
projectUserMembershipRoleDAL: Pick<TProjectUserMembershipRoleDALFactory, "create">;
certificateAuthorityDAL: Pick<TCertificateAuthorityDALFactory, "find">;
certificateDAL: Pick<TCertificateDALFactory, "find" | "countCertificatesInProject">;
@@ -126,7 +132,9 @@ export const projectServiceFactory = ({
pkiAlertDAL,
keyStore,
kmsService,
projectBotDAL
projectBotDAL,
projectSlackConfigDAL,
slackIntegrationDAL
}: TProjectServiceFactoryDep) => {
/*
* Create workspace. Make user the admin
@@ -269,8 +277,7 @@ export const projectServiceFactory = ({
members: [
{
userPublicKey: user.publicKey,
orgMembershipId: orgMembership.id,
projectMembershipRole: ProjectMembershipRole.Admin
orgMembershipId: orgMembership.id
}
]
});
@@ -284,7 +291,7 @@ export const projectServiceFactory = ({
tx
);
await projectUserMembershipRoleDAL.create(
{ projectMembershipId: userProjectMembership.id, role: projectAdmin.projectRole },
{ projectMembershipId: userProjectMembership.id, role: ProjectMembershipRole.Admin },
tx
);
@@ -909,6 +916,113 @@ export const projectServiceFactory = ({
return { secretManagerKmsKey: kmsKey };
};
const getProjectSlackConfig = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
projectId
}: TGetProjectSlackConfig) => {
const project = await projectDAL.findById(projectId);
if (!project) {
throw new NotFoundError({
message: "Project not found"
});
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Settings);
return projectSlackConfigDAL.findOne({
projectId: project.id
});
};
const updateProjectSlackConfig = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
projectId,
slackIntegrationId,
isAccessRequestNotificationEnabled,
accessRequestChannels,
isSecretRequestNotificationEnabled,
secretRequestChannels
}: TUpdateProjectSlackConfig) => {
const project = await projectDAL.findById(projectId);
if (!project) {
throw new NotFoundError({
message: "Project not found"
});
}
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(slackIntegrationId);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found"
});
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Edit, ProjectPermissionSub.Settings);
if (slackIntegration.orgId !== project.orgId) {
throw new BadRequestError({
message: "Selected slack integration is not in the same organization"
});
}
return projectSlackConfigDAL.transaction(async (tx) => {
const slackConfig = await projectSlackConfigDAL.findOne(
{
projectId
},
tx
);
if (slackConfig) {
return projectSlackConfigDAL.updateById(
slackConfig.id,
{
slackIntegrationId,
isAccessRequestNotificationEnabled,
accessRequestChannels,
isSecretRequestNotificationEnabled,
secretRequestChannels
},
tx
);
}
return projectSlackConfigDAL.create(
{
projectId,
slackIntegrationId,
isAccessRequestNotificationEnabled,
accessRequestChannels,
isSecretRequestNotificationEnabled,
secretRequestChannels
},
tx
);
});
};
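A hedged example of calling the upsert-style config update from a route handler; channel ids and the permission plumbing are placeholders:
await server.services.project.updateProjectSlackConfig({
  projectId,
  slackIntegrationId,
  isAccessRequestNotificationEnabled: true,
  accessRequestChannels: "C01234567, C07654321",
  isSecretRequestNotificationEnabled: false,
  secretRequestChannels: "",
  actor: req.permission.type,
  actorId: req.permission.id,
  actorAuthMethod: req.permission.authMethod,
  actorOrgId: req.permission.orgId
});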
return {
createProject,
deleteProject,
@@ -929,6 +1043,8 @@ export const projectServiceFactory = ({
updateProjectKmsKey,
getProjectKmsBackup,
loadProjectKmsBackup,
getProjectKmsKeys
getProjectKmsKeys,
getProjectSlackConfig,
updateProjectSlackConfig
};
};

View File

@@ -1,4 +1,4 @@
import { ProjectMembershipRole, TProjectKeys } from "@app/db/schemas";
import { TProjectKeys } from "@app/db/schemas";
import { TProjectPermission } from "@app/lib/types";
import { ActorAuthMethod, ActorType } from "../auth/auth-type";
@@ -88,7 +88,6 @@ export type AddUserToWsDTO = {
userPrivateKey: string;
members: {
orgMembershipId: string;
projectMembershipRole: ProjectMembershipRole;
userPublicKey: string;
}[];
};
@@ -123,3 +122,13 @@ export type TLoadProjectKmsBackupDTO = {
export type TGetProjectKmsKey = TProjectPermission;
export type TListProjectCertificateTemplatesDTO = TProjectPermission;
export type TGetProjectSlackConfig = TProjectPermission;
export type TUpdateProjectSlackConfig = {
slackIntegrationId: string;
isAccessRequestNotificationEnabled: boolean;
accessRequestChannels: string;
isSecretRequestNotificationEnabled: boolean;
secretRequestChannels: string;
} & TProjectPermission;

View File

@@ -444,7 +444,9 @@ export const expandSecretReferencesFactory = ({
depth: depth + 1
});
}
expandedValue = expandedValue.replaceAll(interpolationSyntax, referedValue);
if (referedValue) {
expandedValue = expandedValue.replaceAll(interpolationSyntax, referedValue);
}
} else {
const secretReferenceEnvironment = entities[0];
const secretReferencePath = path.join("/", ...entities.slice(1, entities.length - 1));
@@ -463,7 +465,9 @@ export const expandSecretReferencesFactory = ({
});
}
expandedValue = expandedValue.replaceAll(interpolationSyntax, referedValue);
if (referedValue) {
expandedValue = expandedValue.replaceAll(interpolationSyntax, referedValue);
}
}
}
}
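The added guards keep the original placeholder in place whenever a referenced secret cannot be resolved to a non-empty value, instead of blanking it out or coercing undefined into the literal string "undefined". A simplified illustration, not the actual factory:
const interpolationSyntax = "${DB_PASSWORD}";
let expandedValue = "postgres://user:${DB_PASSWORD}@db:5432/app";
const referedValue = ""; // referenced secret missing or resolved to nothing
if (referedValue) {
  expandedValue = expandedValue.replaceAll(interpolationSyntax, referedValue);
}
// expandedValue still contains the "${DB_PASSWORD}" placeholder for the caller to see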

View File

@@ -2,6 +2,8 @@
import { AxiosError } from "axios";
import { ProjectUpgradeStatus, ProjectVersion, TSecretSnapshotSecretsV2, TSecretVersionsV2 } from "@app/db/schemas";
import { TAuditLogServiceFactory } from "@app/ee/services/audit-log/audit-log-service";
import { Actor, EventType } from "@app/ee/services/audit-log/audit-log-types";
import { TSecretApprovalRequestDALFactory } from "@app/ee/services/secret-approval-request/secret-approval-request-dal";
import { TSecretRotationDALFactory } from "@app/ee/services/secret-rotation/secret-rotation-dal";
import { TSnapshotDALFactory } from "@app/ee/services/secret-snapshot/snapshot-dal";
@@ -21,6 +23,7 @@ import { TSecretVersionTagDALFactory } from "@app/services/secret/secret-version
import { TSecretBlindIndexDALFactory } from "@app/services/secret-blind-index/secret-blind-index-dal";
import { TSecretTagDALFactory } from "@app/services/secret-tag/secret-tag-dal";
import { ActorType } from "../auth/auth-type";
import { TIntegrationDALFactory } from "../integration/integration-dal";
import { TIntegrationAuthDALFactory } from "../integration-auth/integration-auth-dal";
import { TIntegrationAuthServiceFactory } from "../integration-auth/integration-auth-service";
@@ -40,6 +43,7 @@ import { expandSecretReferencesFactory, getAllNestedSecretReferences } from "../
import { TSecretVersionV2DALFactory } from "../secret-v2-bridge/secret-version-dal";
import { TSecretVersionV2TagDALFactory } from "../secret-v2-bridge/secret-version-tag-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { TUserDALFactory } from "../user/user-dal";
import { TWebhookDALFactory } from "../webhook/webhook-dal";
import { fnTriggerWebhook } from "../webhook/webhook-fns";
import { TSecretDALFactory } from "./secret-dal";
@@ -71,6 +75,7 @@ type TSecretQueueFactoryDep = {
secretVersionDAL: TSecretVersionDALFactory;
secretBlindIndexDAL: TSecretBlindIndexDALFactory;
secretTagDAL: TSecretTagDALFactory;
userDAL: Pick<TUserDALFactory, "findById">;
secretVersionTagDAL: TSecretVersionTagDALFactory;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
secretV2BridgeDAL: TSecretV2BridgeDALFactory;
@@ -81,6 +86,7 @@ type TSecretQueueFactoryDep = {
snapshotDAL: Pick<TSnapshotDALFactory, "findNSecretV1SnapshotByFolderId" | "deleteSnapshotsAboveLimit">;
snapshotSecretV2BridgeDAL: Pick<TSnapshotSecretV2DALFactory, "insertMany" | "batchInsert">;
keyStore: Pick<TKeyStoreFactory, "acquireLock" | "setItemWithExpiry" | "getItem">;
auditLogService: Pick<TAuditLogServiceFactory, "createAuditLog">;
};
export type TGetSecrets = {
@@ -106,6 +112,7 @@ export const secretQueueFactory = ({
secretDAL,
secretImportDAL,
folderDAL,
userDAL,
webhookDAL,
projectEnvDAL,
orgDAL,
@@ -125,7 +132,8 @@ export const secretQueueFactory = ({
snapshotDAL,
snapshotSecretV2BridgeDAL,
secretApprovalRequestDAL,
keyStore
keyStore,
auditLogService
}: TSecretQueueFactoryDep) => {
const removeSecretReminder = async (dto: TRemoveSecretReminderDTO) => {
const appCfg = getConfig();
@@ -430,7 +438,9 @@ export const secretQueueFactory = ({
return content;
};
const syncIntegrations = async (dto: TGetSecrets & { deDupeQueue?: Record<string, boolean> }) => {
const syncIntegrations = async (
dto: TGetSecrets & { isManual?: boolean; actorId?: string; deDupeQueue?: Record<string, boolean> }
) => {
await queueService.queue(QueueName.IntegrationSync, QueueJobs.IntegrationSync, dto, {
attempts: 3,
delay: 1000,
@@ -528,7 +538,7 @@ export const secretQueueFactory = ({
}
}
);
await syncIntegrations({ secretPath, projectId, environment, deDupeQueue });
await syncIntegrations({ secretPath, projectId, environment, deDupeQueue, isManual: false });
if (!excludeReplication) {
await replicateSecrets({
_deDupeReplicationQueue: deDupeReplicationQueue,
@@ -544,7 +554,7 @@ export const secretQueueFactory = ({
});
queueService.start(QueueName.IntegrationSync, async (job) => {
const { environment, projectId, secretPath, depth = 1, deDupeQueue = {} } = job.data;
const { environment, actorId, isManual, projectId, secretPath, depth = 1, deDupeQueue = {} } = job.data;
if (depth > MAX_SYNC_SECRET_DEPTH) return;
const folder = await folderDAL.findBySecretPath(projectId, environment, secretPath);
@@ -693,6 +703,30 @@ export const secretQueueFactory = ({
});
}
const generateActor = async (): Promise<Actor> => {
if (isManual && actorId) {
const user = await userDAL.findById(actorId);
if (!user) {
throw new Error("User not found");
}
return {
type: ActorType.USER,
metadata: {
email: user.email,
username: user.username,
userId: user.id
}
};
}
return {
type: ActorType.PLATFORM,
metadata: {}
};
};
// akhilmhdh: this try catch is for lock release
try {
const secrets = shouldUseSecretV2Bridge
@@ -778,6 +812,21 @@ export const secretQueueFactory = ({
}
});
await auditLogService.createAuditLog({
projectId,
actor: await generateActor(),
event: {
type: EventType.INTEGRATION_SYNCED,
metadata: {
integrationId: integration.id,
isSynced: response?.isSynced ?? true,
lastSyncJobId: job?.id ?? "",
lastUsed: new Date(),
syncMessage: response?.syncMessage ?? ""
}
}
});
await integrationDAL.updateById(integration.id, {
lastSyncJobId: job.id,
lastUsed: new Date(),
@@ -794,9 +843,23 @@ export const secretQueueFactory = ({
(err instanceof AxiosError ? JSON.stringify(err?.response?.data) : (err as Error)?.message) ||
"Unknown error occurred.";
await auditLogService.createAuditLog({
projectId,
actor: await generateActor(),
event: {
type: EventType.INTEGRATION_SYNCED,
metadata: {
integrationId: integration.id,
isSynced: false,
lastSyncJobId: job?.id ?? "",
lastUsed: new Date(),
syncMessage: message
}
}
});
await integrationDAL.updateById(integration.id, {
lastSyncJobId: job.id,
lastUsed: new Date(),
syncMessage: message,
isSynced: false
});

View File

@@ -0,0 +1,25 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TProjectSlackConfigDALFactory = ReturnType<typeof projectSlackConfigDALFactory>;
export const projectSlackConfigDALFactory = (db: TDbClient) => {
const projectSlackConfigOrm = ormify(db, TableName.ProjectSlackConfigs);
const getIntegrationDetailsByProject = (projectId: string, tx?: Knex) => {
return (tx || db.replicaNode())(TableName.ProjectSlackConfigs)
.join(
TableName.SlackIntegrations,
`${TableName.ProjectSlackConfigs}.slackIntegrationId`,
`${TableName.SlackIntegrations}.id`
)
.where("projectId", "=", projectId)
.select(selectAllTableCols(TableName.ProjectSlackConfigs), selectAllTableCols(TableName.SlackIntegrations))
.first();
};
return { ...projectSlackConfigOrm, getIntegrationDetailsByProject };
};

View File

@@ -0,0 +1,16 @@
import z from "zod";
export const validateSlackChannelsField = z
.string()
.trim()
.default("")
.transform((data) => {
if (data === "") return "";
return data
.split(",")
.map((id) => id.trim())
.join(", ");
})
.refine((data) => data.split(",").length <= 20, {
message: "You can only select up to 20 slack channels"
});
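Illustrative parses showing the normalisation and the 20-channel cap (channel ids are placeholders):
validateSlackChannelsField.parse("C01234567,C07654321 , C0AAAAAAA");
// -> "C01234567, C07654321, C0AAAAAAA"
validateSlackChannelsField.parse(""); // -> "" (no channels selected)
// a string with more than 20 comma-separated ids fails the refine step with a ZodError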

View File

@@ -0,0 +1,177 @@
import { WebClient } from "@slack/web-api";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors";
import { logger } from "@app/lib/logger";
import { TKmsServiceFactory } from "../kms/kms-service";
import { KmsDataKey } from "../kms/kms-types";
import { TProjectDALFactory } from "../project/project-dal";
import { TProjectSlackConfigDALFactory } from "./project-slack-config-dal";
import { SlackTriggerFeature, TSlackNotification } from "./slack-types";
export const fetchSlackChannels = async (botKey: string) => {
const slackChannels: {
name: string;
id: string;
}[] = [];
const slackWebClient = new WebClient(botKey);
let cursor;
do {
// eslint-disable-next-line no-await-in-loop
const response = await slackWebClient.conversations.list({
cursor,
limit: 1000,
types: "public_channel,private_channel"
});
response.channels?.forEach((channel) =>
slackChannels.push({
name: channel.name_normalized as string,
id: channel.id as string
})
);
// Set the cursor for the next page
cursor = response.response_metadata?.next_cursor;
} while (cursor); // Continue while there is a cursor
return slackChannels;
};
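A hedged usage sketch, assuming the caller has already decrypted the bot access token:
const channels = await fetchSlackChannels(botKey); // botKey: decrypted Slack bot access token
// -> [{ name: "general", id: "C01234567" }, ...] accumulated across every page of the listing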
const buildSlackPayload = (notification: TSlackNotification) => {
const appCfg = getConfig();
switch (notification.type) {
case SlackTriggerFeature.SECRET_APPROVAL: {
const { payload } = notification;
const messageBody = `A secret approval request has been opened by ${payload.userEmail}.
*Environment*: ${payload.environment}
*Secret path*: ${payload.secretPath || "/"}
View the complete details <${appCfg.SITE_URL}/project/${payload.projectId}/approval?requestId=${
payload.requestId
}|here>.`;
const payloadBlocks = [
{
type: "header",
text: {
type: "plain_text",
text: "Secret approval request",
emoji: true
}
},
{
type: "section",
text: {
type: "mrkdwn",
text: messageBody
}
}
];
return {
payloadMessage: messageBody,
payloadBlocks
};
}
case SlackTriggerFeature.ACCESS_REQUEST: {
const { payload } = notification;
const messageBody = `${payload.requesterFullName} (${payload.requesterEmail}) has requested ${
payload.isTemporary ? "temporary" : "permanent"
} access to ${payload.secretPath} in the ${payload.environment} environment of ${payload.projectName}.
The following permissions are requested: ${payload.permissions.join(", ")}
View the request and approve or deny it <${payload.approvalUrl}|here>.`;
const payloadBlocks = [
{
type: "header",
text: {
type: "plain_text",
text: "New access approval request pending for review",
emoji: true
}
},
{
type: "section",
text: {
type: "mrkdwn",
text: messageBody
}
}
];
return {
payloadMessage: messageBody,
payloadBlocks
};
}
default: {
throw new BadRequestError({
message: "Slack notification type not supported."
});
}
}
};
export const triggerSlackNotification = async ({
projectId,
notification,
projectSlackConfigDAL,
projectDAL,
kmsService
}: {
projectId: string;
notification: TSlackNotification;
projectSlackConfigDAL: Pick<TProjectSlackConfigDALFactory, "getIntegrationDetailsByProject">;
projectDAL: Pick<TProjectDALFactory, "findById">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey">;
}) => {
const { payloadMessage, payloadBlocks } = buildSlackPayload(notification);
const project = await projectDAL.findById(projectId);
const slackIntegration = await projectSlackConfigDAL.getIntegrationDetailsByProject(project.id);
if (!slackIntegration) {
return;
}
let targetChannelIds: string[] = [];
if (notification.type === SlackTriggerFeature.ACCESS_REQUEST) {
targetChannelIds = slackIntegration.accessRequestChannels?.split(", ") || [];
if (!targetChannelIds.length || !slackIntegration.isAccessRequestNotificationEnabled) {
return;
}
} else if (notification.type === SlackTriggerFeature.SECRET_APPROVAL) {
targetChannelIds = slackIntegration.secretRequestChannels?.split(", ") || [];
if (!targetChannelIds.length || !slackIntegration.isSecretRequestNotificationEnabled) {
return;
}
}
const { decryptor: orgDataKeyDecryptor } = await kmsService.createCipherPairWithDataKey({
type: KmsDataKey.Organization,
orgId: project.orgId
});
const botKey = orgDataKeyDecryptor({
cipherTextBlob: slackIntegration.encryptedBotAccessToken
}).toString("utf8");
const slackWebClient = new WebClient(botKey);
for await (const conversationId of targetChannelIds) {
// we send both text and blocks for compatibility with bare-bones clients
await slackWebClient.chat
.postMessage({
channel: conversationId,
text: payloadMessage,
blocks: payloadBlocks
})
.catch((err) => logger.error(err));
}
};
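As a hedged sketch of how this helper might be invoked from a secret-approval flow; the variable names (`secretApprovalRequest`, `committerUser`) and the injected DAL/KMS dependencies below are illustrative assumptions, not the actual call sites in this change:

```typescript
import { triggerSlackNotification } from "./slack-fns";
import { SlackTriggerFeature } from "./slack-types";

// Hypothetical caller: fan a secret-approval event out to the project's configured Slack channels.
// The dependencies are assumed to be injected in the same style as the other service factories above.
await triggerSlackNotification({
  projectId: secretApprovalRequest.projectId,
  notification: {
    type: SlackTriggerFeature.SECRET_APPROVAL,
    payload: {
      userEmail: committerUser.email,
      environment: "staging",
      secretPath: "/backend",
      requestId: secretApprovalRequest.id,
      projectId: secretApprovalRequest.projectId
    }
  },
  projectDAL,
  projectSlackConfigDAL,
  kmsService
});
```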

View File

@@ -0,0 +1,56 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName, TSlackIntegrations, TWorkflowIntegrations } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TSlackIntegrationDALFactory = ReturnType<typeof slackIntegrationDALFactory>;
export const slackIntegrationDALFactory = (db: TDbClient) => {
const slackIntegrationOrm = ormify(db, TableName.SlackIntegrations);
const findByIdWithWorkflowIntegrationDetails = async (id: string, tx?: Knex) => {
try {
return await (tx || db.replicaNode())(TableName.SlackIntegrations)
.join(
TableName.WorkflowIntegrations,
`${TableName.SlackIntegrations}.id`,
`${TableName.WorkflowIntegrations}.id`
)
.select(selectAllTableCols(TableName.SlackIntegrations))
.select(db.ref("orgId").withSchema(TableName.WorkflowIntegrations))
.select(db.ref("description").withSchema(TableName.WorkflowIntegrations))
.select(db.ref("integration").withSchema(TableName.WorkflowIntegrations))
.select(db.ref("slug").withSchema(TableName.WorkflowIntegrations))
.where(`${TableName.WorkflowIntegrations}.id`, "=", id)
.first();
} catch (error) {
throw new DatabaseError({ error, name: "Find by ID with Workflow integration details" });
}
};
const findWithWorkflowIntegrationDetails = async (
filter: Partial<TSlackIntegrations> & Partial<TWorkflowIntegrations>,
tx?: Knex
) => {
try {
return await (tx || db.replicaNode())(TableName.SlackIntegrations)
.join(
TableName.WorkflowIntegrations,
`${TableName.SlackIntegrations}.id`,
`${TableName.WorkflowIntegrations}.id`
)
.select(selectAllTableCols(TableName.SlackIntegrations))
.select(db.ref("orgId").withSchema(TableName.WorkflowIntegrations))
.select(db.ref("description").withSchema(TableName.WorkflowIntegrations))
.select(db.ref("integration").withSchema(TableName.WorkflowIntegrations))
.select(db.ref("slug").withSchema(TableName.WorkflowIntegrations))
.where(filter);
} catch (error) {
throw new DatabaseError({ error, name: "Find with Workflow integration details" });
}
};
return { ...slackIntegrationOrm, findByIdWithWorkflowIntegrationDetails, findWithWorkflowIntegrationDetails };
};

View File

@@ -0,0 +1,463 @@
import { ForbiddenError } from "@casl/ability";
import { InstallProvider } from "@slack/oauth";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { TKmsServiceFactory } from "../kms/kms-service";
import { KmsDataKey } from "../kms/kms-types";
import { getServerCfg } from "../super-admin/super-admin-service";
import { TWorkflowIntegrationDALFactory } from "../workflow-integration/workflow-integration-dal";
import { WorkflowIntegration } from "../workflow-integration/workflow-integration-types";
import { fetchSlackChannels } from "./slack-fns";
import { TSlackIntegrationDALFactory } from "./slack-integration-dal";
import {
TCompleteSlackIntegrationDTO,
TDeleteSlackIntegrationDTO,
TGetReinstallUrlDTO,
TGetSlackInstallUrlDTO,
TGetSlackIntegrationByIdDTO,
TGetSlackIntegrationByOrgDTO,
TGetSlackIntegrationChannelsDTO,
TReinstallSlackIntegrationDTO,
TUpdateSlackIntegrationDTO
} from "./slack-types";
type TSlackServiceFactoryDep = {
slackIntegrationDAL: Pick<
TSlackIntegrationDALFactory,
| "deleteById"
| "updateById"
| "create"
| "findByIdWithWorkflowIntegrationDetails"
| "findWithWorkflowIntegrationDetails"
>;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission" | "getOrgPermission">;
kmsService: Pick<TKmsServiceFactory, "createCipherPairWithDataKey" | "encryptWithRootKey" | "decryptWithRootKey">;
workflowIntegrationDAL: Pick<TWorkflowIntegrationDALFactory, "transaction" | "create" | "updateById" | "deleteById">;
};
export type TSlackServiceFactory = ReturnType<typeof slackServiceFactory>;
export const slackServiceFactory = ({
permissionService,
slackIntegrationDAL,
kmsService,
workflowIntegrationDAL
}: TSlackServiceFactoryDep) => {
const completeSlackIntegration = async ({
orgId,
slug,
description,
teamId,
teamName,
slackUserId,
slackAppId,
botAccessToken,
slackBotId,
slackBotUserId
}: TCompleteSlackIntegrationDTO) => {
const { encryptor: orgDataKeyEncryptor } = await kmsService.createCipherPairWithDataKey({
orgId,
type: KmsDataKey.Organization
});
const { cipherTextBlob: encryptedBotAccessToken } = orgDataKeyEncryptor({
plainText: Buffer.from(botAccessToken, "utf8")
});
await workflowIntegrationDAL.transaction(async (tx) => {
const workflowIntegration = await workflowIntegrationDAL.create(
{
description,
orgId,
slug,
integration: WorkflowIntegration.SLACK
},
tx
);
await slackIntegrationDAL.create(
{
// @ts-expect-error id is set explicitly because it must always equal the workflow integration ID
id: workflowIntegration.id,
teamId,
teamName,
slackUserId,
slackAppId,
slackBotId,
slackBotUserId,
encryptedBotAccessToken
},
tx
);
});
};
const reinstallSlackIntegration = async ({
id,
teamId,
teamName,
slackUserId,
slackAppId,
botAccessToken,
slackBotId,
slackBotUserId
}: TReinstallSlackIntegrationDTO) => {
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(id);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found"
});
}
const { encryptor: orgDataKeyEncryptor } = await kmsService.createCipherPairWithDataKey({
orgId: slackIntegration.orgId,
type: KmsDataKey.Organization
});
const { cipherTextBlob: encryptedBotAccessToken } = orgDataKeyEncryptor({
plainText: Buffer.from(botAccessToken, "utf8")
});
await slackIntegrationDAL.updateById(id, {
teamId,
teamName,
slackUserId,
slackAppId,
slackBotId,
slackBotUserId,
encryptedBotAccessToken
});
};
const getSlackInstaller = async () => {
const appCfg = getConfig();
const serverCfg = await getServerCfg();
let slackClientId = appCfg.WORKFLOW_SLACK_CLIENT_ID as string;
let slackClientSecret = appCfg.WORKFLOW_SLACK_CLIENT_SECRET as string;
const decrypt = await kmsService.decryptWithRootKey();
if (serverCfg.encryptedSlackClientId) {
slackClientId = (await decrypt({ cipherTextBlob: Buffer.from(serverCfg.encryptedSlackClientId) })).toString();
}
if (serverCfg.encryptedSlackClientSecret) {
slackClientSecret = (
await decrypt({ cipherTextBlob: Buffer.from(serverCfg.encryptedSlackClientSecret) })
).toString();
}
if (!slackClientId || !slackClientSecret) {
throw new BadRequestError({
message: `Invalid Slack configuration. ${
appCfg.isCloud
? "Please contact the Infisical team."
: "Contact your instance admin to setup Slack integration in the Admin settings. Your configuration is missing Slack client ID and secret."
}`
});
}
return new InstallProvider({
clientId: slackClientId,
clientSecret: slackClientSecret,
stateSecret: appCfg.AUTH_SECRET,
legacyStateVerification: true,
installationStore: {
storeInstallation: async (installation) => {
if (installation.isEnterpriseInstall && installation.enterprise?.id) {
throw new BadRequestError({
message: "Enterprise not yet supported"
});
}
const metadata = JSON.parse(installation.metadata || "") as {
id?: string;
orgId: string;
slug: string;
description?: string;
};
if (metadata.id) {
return reinstallSlackIntegration({
id: metadata.id,
teamId: installation.team?.id || "",
teamName: installation.team?.name || "",
slackUserId: installation.user.id,
slackAppId: installation.appId || "",
botAccessToken: installation.bot?.token || "",
slackBotId: installation.bot?.id || "",
slackBotUserId: installation.bot?.userId || ""
});
}
return completeSlackIntegration({
orgId: metadata.orgId,
slug: metadata.slug,
description: metadata.description,
teamId: installation.team?.id || "",
teamName: installation.team?.name || "",
slackUserId: installation.user.id,
slackAppId: installation.appId || "",
botAccessToken: installation.bot?.token || "",
slackBotId: installation.bot?.id || "",
slackBotUserId: installation.bot?.userId || ""
});
},
// for our use-case we don't need to implement this because this will only be used
// when listening for events from slack
fetchInstallation: () => {
return {} as never;
},
// for our use-case we don't need to implement this yet
deleteInstallation: () => {
return {} as never;
}
}
});
};
const getInstallUrl = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
slug,
description
}: TGetSlackInstallUrlDTO) => {
const appCfg = getConfig();
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Settings);
const installer = await getSlackInstaller();
const url = await installer.generateInstallUrl({
scopes: ["chat:write.public", "chat:write", "channels:read", "groups:read"],
metadata: JSON.stringify({
slug,
description,
orgId: actorOrgId
}),
redirectUri: `${appCfg.SITE_URL}/api/v1/workflow-integrations/slack/oauth_redirect`
});
return url;
};
const getReinstallUrl = async ({ actorId, actor, actorOrgId, actorAuthMethod, id }: TGetReinstallUrlDTO) => {
const appCfg = getConfig();
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(id);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found"
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
slackIntegration.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Settings);
const installer = await getSlackInstaller();
const url = await installer.generateInstallUrl({
scopes: ["chat:write.public", "chat:write", "channels:read", "groups:read"],
metadata: JSON.stringify({
id,
orgId: slackIntegration.orgId
}),
redirectUri: `${appCfg.SITE_URL}/api/v1/workflow-integrations/slack/oauth_redirect`
});
return url;
};
const getSlackIntegrationsByOrg = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod
}: TGetSlackIntegrationByOrgDTO) => {
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Settings);
const slackIntegrations = await slackIntegrationDAL.findWithWorkflowIntegrationDetails({
orgId: actorOrgId
});
return slackIntegrations;
};
const getSlackIntegrationById = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
id
}: TGetSlackIntegrationByIdDTO) => {
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(id);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found."
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
slackIntegration.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Settings);
return slackIntegration;
};
const getSlackIntegrationChannels = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
id
}: TGetSlackIntegrationChannelsDTO) => {
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(id);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found."
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
slackIntegration.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Settings);
const { decryptor: orgDataKeyDecryptor } = await kmsService.createCipherPairWithDataKey({
orgId: slackIntegration.orgId,
type: KmsDataKey.Organization
});
const botKey = orgDataKeyDecryptor({
cipherTextBlob: slackIntegration.encryptedBotAccessToken
}).toString("utf8");
return fetchSlackChannels(botKey);
};
const updateSlackIntegration = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
id,
slug,
description
}: TUpdateSlackIntegrationDTO) => {
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(id);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found"
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
slackIntegration.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
return workflowIntegrationDAL.transaction(async (tx) => {
await workflowIntegrationDAL.updateById(
slackIntegration.id,
{
slug,
description
},
tx
);
const updatedIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(
slackIntegration.id,
tx
);
return updatedIntegration!;
});
};
const deleteSlackIntegration = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
id
}: TDeleteSlackIntegrationDTO) => {
const slackIntegration = await slackIntegrationDAL.findByIdWithWorkflowIntegrationDetails(id);
if (!slackIntegration) {
throw new NotFoundError({
message: "Slack integration not found"
});
}
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
slackIntegration.orgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Delete, OrgPermissionSubjects.Settings);
await workflowIntegrationDAL.deleteById(id);
return slackIntegration;
};
return {
getInstallUrl,
getReinstallUrl,
getSlackIntegrationsByOrg,
getSlackIntegrationById,
completeSlackIntegration,
getSlackInstaller,
updateSlackIntegration,
deleteSlackIntegration,
getSlackIntegrationChannels
};
};
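For readers following the OAuth flow above: `getInstallUrl`/`getReinstallUrl` serialize a small metadata object into the Slack install state, and `storeInstallation` parses it back to decide between a first install and a reinstall. A minimal sketch of that contract, with made-up values (the type name below is an assumption for illustration):

```typescript
// Hypothetical illustration of the installer state metadata; field names mirror the parse above.
type TSlackInstallMetadata = {
  id?: string; // present only when reinstalling an existing integration
  orgId: string;
  slug: string;
  description?: string;
};

const metadata: TSlackInstallMetadata = JSON.parse(
  '{"orgId":"org_123","slug":"eng-alerts","description":"Engineering alerts"}'
);

// storeInstallation branches on the presence of `id`:
// reinstallSlackIntegration when set, completeSlackIntegration otherwise.
console.log(metadata.id ? "reinstall" : "new install");
```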

View File

@@ -0,0 +1,79 @@
import { TOrgPermission } from "@app/lib/types";
export type TGetSlackInstallUrlDTO = {
slug: string;
description?: string;
} & Omit<TOrgPermission, "orgId">;
export type TGetReinstallUrlDTO = {
id: string;
} & Omit<TOrgPermission, "orgId">;
export type TGetSlackIntegrationByOrgDTO = Omit<TOrgPermission, "orgId">;
export type TGetSlackIntegrationByIdDTO = { id: string } & Omit<TOrgPermission, "orgId">;
export type TGetSlackIntegrationChannelsDTO = { id: string } & Omit<TOrgPermission, "orgId">;
export type TUpdateSlackIntegrationDTO = { id: string; slug?: string; description?: string } & Omit<
TOrgPermission,
"orgId"
>;
export type TDeleteSlackIntegrationDTO = {
id: string;
} & Omit<TOrgPermission, "orgId">;
export type TCompleteSlackIntegrationDTO = {
orgId: string;
slug: string;
description?: string;
teamId: string;
teamName: string;
slackUserId: string;
slackAppId: string;
botAccessToken: string;
slackBotId: string;
slackBotUserId: string;
};
export type TReinstallSlackIntegrationDTO = {
id: string;
teamId: string;
teamName: string;
slackUserId: string;
slackAppId: string;
botAccessToken: string;
slackBotId: string;
slackBotUserId: string;
};
export enum SlackTriggerFeature {
SECRET_APPROVAL = "secret-approval",
ACCESS_REQUEST = "access-request"
}
export type TSlackNotification =
| {
type: SlackTriggerFeature.SECRET_APPROVAL;
payload: {
userEmail: string;
environment: string;
secretPath: string;
requestId: string;
projectId: string;
};
}
| {
type: SlackTriggerFeature.ACCESS_REQUEST;
payload: {
requesterFullName: string;
requesterEmail: string;
isTemporary: boolean;
secretPath: string;
environment: string;
projectName: string;
permissions: string[];
approvalUrl: string;
};
};

View File

@@ -6,10 +6,11 @@ import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { getUserPrivateKey } from "@app/lib/crypto/srp";
import { BadRequestError } from "@app/lib/errors";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { TAuthLoginFactory } from "../auth/auth-login-service";
import { AuthMethod } from "../auth/auth-type";
import { TKmsServiceFactory } from "../kms/kms-service";
import { TOrgServiceFactory } from "../org/org-service";
import { TUserDALFactory } from "../user/user-dal";
import { TSuperAdminDALFactory } from "./super-admin-dal";
@@ -19,6 +20,7 @@ type TSuperAdminServiceFactoryDep = {
serverCfgDAL: TSuperAdminDALFactory;
userDAL: TUserDALFactory;
authService: Pick<TAuthLoginFactory, "generateUserTokens">;
kmsService: Pick<TKmsServiceFactory, "encryptWithRootKey" | "decryptWithRootKey">;
orgService: Pick<TOrgServiceFactory, "createOrganization">;
keyStore: Pick<TKeyStoreFactory, "getItem" | "setItemWithExpiry" | "deleteItem">;
licenseService: Pick<TLicenseServiceFactory, "onPremFeatures">;
@@ -39,6 +41,7 @@ export const superAdminServiceFactory = ({
authService,
orgService,
keyStore,
kmsService,
licenseService
}: TSuperAdminServiceFactoryDep) => {
const initServerCfg = async () => {
@@ -82,7 +85,12 @@ export const superAdminServiceFactory = ({
return newCfg;
};
const updateServerCfg = async (data: TSuperAdminUpdate, userId: string) => {
const updateServerCfg = async (
data: TSuperAdminUpdate & { slackClientId?: string; slackClientSecret?: string },
userId: string
) => {
const updatedData = data;
if (data.enabledLoginMethods) {
const superAdminUser = await userDAL.findById(userId);
const loginMethodToAuthMethod = {
@@ -113,7 +121,27 @@ export const superAdminServiceFactory = ({
});
}
}
const updatedServerCfg = await serverCfgDAL.updateById(ADMIN_CONFIG_DB_UUID, data);
const encryptWithRoot = await kmsService.encryptWithRootKey();
if (data.slackClientId) {
const { cipherTextBlob: encryptedClientId } = await encryptWithRoot({
plainText: Buffer.from(data.slackClientId)
});
updatedData.encryptedSlackClientId = encryptedClientId;
updatedData.slackClientId = undefined;
}
if (data.slackClientSecret) {
const { cipherTextBlob: encryptedClientSecret } = await encryptWithRoot({
plainText: Buffer.from(data.slackClientSecret)
});
updatedData.encryptedSlackClientSecret = encryptedClientSecret;
updatedData.slackClientSecret = undefined;
}
const updatedServerCfg = await serverCfgDAL.updateById(ADMIN_CONFIG_DB_UUID, updatedData);
await keyStore.setItemWithExpiry(ADMIN_CONFIG_KEY, ADMIN_CONFIG_KEY_EXP, JSON.stringify(updatedServerCfg));
@@ -232,11 +260,38 @@ export const superAdminServiceFactory = ({
return user;
};
const getAdminSlackConfig = async () => {
const serverCfg = await serverCfgDAL.findById(ADMIN_CONFIG_DB_UUID);
if (!serverCfg) {
throw new NotFoundError({ name: "Admin config", message: "Admin config not found" });
}
let clientId = "";
let clientSecret = "";
const decrypt = await kmsService.decryptWithRootKey();
if (serverCfg.encryptedSlackClientId) {
clientId = (await decrypt({ cipherTextBlob: serverCfg.encryptedSlackClientId })).toString();
}
if (serverCfg.encryptedSlackClientSecret) {
clientSecret = (await decrypt({ cipherTextBlob: serverCfg.encryptedSlackClientSecret })).toString();
}
return {
clientId,
clientSecret
};
};
return {
initServerCfg,
updateServerCfg,
adminSignUp,
getUsers,
deleteUser
deleteUser,
getAdminSlackConfig
};
};

View File

@@ -82,10 +82,9 @@ export const userDALFactory = (db: TDbClient) => {
}
};
const findUserEncKeyByUserId = async (userId: string) => {
const findUserEncKeyByUserId = async (userId: string, tx?: Knex) => {
try {
const user = await db
.replicaNode()(TableName.Users)
const user = await (tx || db.replicaNode())(TableName.Users)
.where(`${TableName.Users}.id`, userId)
.join(TableName.UserEncryptionKey, `${TableName.Users}.id`, `${TableName.UserEncryptionKey}.userId`)
.first();

View File

@@ -4,18 +4,14 @@ import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TUserDALFactory } from "@app/services/user/user-dal";
export const normalizeUsername = async (username: string, userDAL: Pick<TUserDALFactory, "findOne">) => {
let attempt = slugify(`${username}-${alphaNumericNanoId(4)}`);
let attempt: string;
let user;
let user = await userDAL.findOne({ username: attempt });
if (!user) return attempt;
while (true) {
do {
attempt = slugify(`${username}-${alphaNumericNanoId(4)}`);
// eslint-disable-next-line no-await-in-loop
user = await userDAL.findOne({ username: attempt });
} while (user);
if (!user) {
return attempt;
}
}
return attempt;
};
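For illustration, a hedged sketch of the refactored retry behaviour with a stubbed DAL; the import path, the stub, and the sample usernames below are assumptions for the example only:

```typescript
import { normalizeUsername } from "./user-fns"; // hypothetical import path

// Minimal fake DAL: pretend one slug is already taken.
const taken = new Set(["jane-doe-ab12"]);
const fakeUserDAL = {
  findOne: async ({ username }: { username: string }) =>
    taken.has(username) ? { id: "user-1", username } : undefined
};

// Keeps generating "jane-doe-<4 char suffix>" candidates until one is unused.
const username = await normalizeUsername("jane doe", fakeUserDAL);
console.log(username); // e.g. "jane-doe-x7k2"
```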

View File

@@ -1,4 +1,8 @@
import { ForbiddenError } from "@casl/ability";
import { SecretKeyEncoding } from "@app/db/schemas";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { BadRequestError } from "@app/lib/errors";
import { TAuthTokenServiceFactory } from "@app/services/auth-token/auth-token-service";
@@ -8,8 +12,10 @@ import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TUserAliasDALFactory } from "@app/services/user-alias/user-alias-dal";
import { AuthMethod } from "../auth/auth-type";
import { TGroupProjectDALFactory } from "../group-project/group-project-dal";
import { TProjectMembershipDALFactory } from "../project-membership/project-membership-dal";
import { TUserDALFactory } from "./user-dal";
import { TListUserGroupsDTO } from "./user-types";
type TUserServiceFactoryDep = {
userDAL: Pick<
@@ -27,10 +33,12 @@ type TUserServiceFactoryDep = {
| "delete"
>;
userAliasDAL: Pick<TUserAliasDALFactory, "find" | "insertMany">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "findByUserId">;
orgMembershipDAL: Pick<TOrgMembershipDALFactory, "find" | "insertMany" | "findOne" | "updateById">;
tokenService: Pick<TAuthTokenServiceFactory, "createTokenForUser" | "validateTokenForUser">;
projectMembershipDAL: Pick<TProjectMembershipDALFactory, "find">;
smtpService: Pick<TSmtpService, "sendMail">;
permissionService: TPermissionServiceFactory;
};
export type TUserServiceFactory = ReturnType<typeof userServiceFactory>;
@@ -40,8 +48,10 @@ export const userServiceFactory = ({
userAliasDAL,
orgMembershipDAL,
projectMembershipDAL,
groupProjectDAL,
tokenService,
smtpService
smtpService,
permissionService
}: TUserServiceFactoryDep) => {
const sendEmailVerificationCode = async (username: string) => {
const user = await userDAL.findOne({ username });
@@ -295,6 +305,27 @@ export const userServiceFactory = ({
return updatedOrgMembership.projectFavorites;
};
const listUserGroups = async ({ username, actorOrgId, actor, actorId, actorAuthMethod }: TListUserGroupsDTO) => {
const user = await userDAL.findOne({
username
});
// This ensures a user can always read information about themselves, but reading anyone else requires the org-level Member Read permission.
if (user.id !== actorId) {
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Member);
}
const memberships = await groupProjectDAL.findByUserId(user.id, actorOrgId);
return memberships;
};
return {
sendEmailVerificationCode,
verifyEmailVerificationCode,
@@ -304,6 +335,7 @@ export const userServiceFactory = ({
deleteUser,
getMe,
createUserAction,
listUserGroups,
getUserAction,
unlockUser,
getUserPrivateKey,

View File

@@ -0,0 +1,10 @@
import { TOrgPermission } from "@app/lib/types";
export type TListUserGroupsDTO = {
username: string;
} & Omit<TOrgPermission, "orgId">;
export enum UserEncryption {
V1 = 1,
V2 = 2
}

View File

@@ -0,0 +1,11 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { ormify } from "@app/lib/knex";
export type TWorkflowIntegrationDALFactory = ReturnType<typeof workflowIntegrationDALFactory>;
export const workflowIntegrationDALFactory = (db: TDbClient) => {
const workflowIntegrationOrm = ormify(db, TableName.WorkflowIntegrations);
return workflowIntegrationOrm;
};

View File

@@ -0,0 +1,43 @@
import { ForbiddenError } from "@casl/ability";
import { OrgPermissionActions, OrgPermissionSubjects } from "@app/ee/services/permission/org-permission";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { TWorkflowIntegrationDALFactory } from "./workflow-integration-dal";
import { TGetWorkflowIntegrationsByOrg } from "./workflow-integration-types";
type TWorkflowIntegrationServiceFactoryDep = {
workflowIntegrationDAL: Pick<TWorkflowIntegrationDALFactory, "find">;
permissionService: Pick<TPermissionServiceFactory, "getProjectPermission" | "getOrgPermission">;
};
export type TWorkflowIntegrationServiceFactory = ReturnType<typeof workflowIntegrationServiceFactory>;
export const workflowIntegrationServiceFactory = ({
workflowIntegrationDAL,
permissionService
}: TWorkflowIntegrationServiceFactoryDep) => {
const getIntegrationsByOrg = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod
}: TGetWorkflowIntegrationsByOrg) => {
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
actorOrgId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Settings);
return workflowIntegrationDAL.find({
orgId: actorOrgId
});
};
return {
getIntegrationsByOrg
};
};

View File

@@ -0,0 +1,7 @@
import { TOrgPermission } from "@app/lib/types";
export enum WorkflowIntegration {
SLACK = "slack"
}
export type TGetWorkflowIntegrationsByOrg = Omit<TOrgPermission, "orgId">;

View File

@@ -4,22 +4,27 @@ Copyright (c) 2023 Infisical Inc.
package cmd
import (
"errors"
"fmt"
"os"
"os/exec"
"os/signal"
"runtime"
"strings"
"sync"
"syscall"
"time"
"github.com/Infisical/infisical-merge/packages/models"
"github.com/Infisical/infisical-merge/packages/util"
"github.com/fatih/color"
"github.com/posthog/posthog-go"
"github.com/rs/zerolog/log"
"github.com/spf13/cobra"
)
var ErrManualSignalInterrupt = errors.New("signal: interrupt")
var watcherWaitGroup = new(sync.WaitGroup)
// runCmd represents the run command
var runCmd = &cobra.Command{
Example: `
@@ -77,11 +82,35 @@ var runCmd = &cobra.Command{
util.HandleError(err, "Unable to parse flag")
}
command, err := cmd.Flags().GetString("command")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secretOverriding, err := cmd.Flags().GetBool("secret-overriding")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
watchMode, err := cmd.Flags().GetBool("watch")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
watchModeInterval, err := cmd.Flags().GetInt("watch-interval")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
// If the --watch flag has been set, the --watch-interval flag must be at least 5 seconds
if watchMode && watchModeInterval < 5 {
util.HandleError(fmt.Errorf("watch interval must be at least 5 seconds, you passed %d seconds", watchModeInterval))
}
shouldExpandSecrets, err := cmd.Flags().GetBool("expand")
if err != nil {
util.HandleError(err, "Unable to parse flag")
@@ -116,108 +145,50 @@ var runCmd = &cobra.Command{
Recursive: recursive,
}
if token != nil && token.Type == util.SERVICE_TOKEN_IDENTIFIER {
request.InfisicalToken = token.Token
} else if token != nil && token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER {
request.UniversalAuthAccessToken = token.Token
}
secrets, err := util.GetAllEnvironmentVariables(request, projectConfigDir)
injectableEnvironment, err := fetchAndFormatSecretsForShell(request, projectConfigDir, secretOverriding, shouldExpandSecrets, token)
if err != nil {
util.HandleError(err, "Could not fetch secrets", "If you are using a service token to fetch secrets, please ensure it is valid")
}
if secretOverriding {
secrets = util.OverrideSecrets(secrets, util.SECRET_TYPE_PERSONAL)
log.Debug().Msgf("injecting the following environment variables into shell: %v", injectableEnvironment.Variables)
if watchMode {
executeCommandWithWatchMode(command, args, watchModeInterval, request, projectConfigDir, shouldExpandSecrets, secretOverriding, token)
} else {
secrets = util.OverrideSecrets(secrets, util.SECRET_TYPE_SHARED)
}
if cmd.Flags().Changed("command") {
command := cmd.Flag("command").Value.String()
err = executeMultipleCommandWithEnvs(command, injectableEnvironment.SecretsCount, injectableEnvironment.Variables)
if err != nil {
fmt.Println(err)
os.Exit(1)
}
if shouldExpandSecrets {
authParams := models.ExpandSecretsAuthentication{}
if token != nil && token.Type == util.SERVICE_TOKEN_IDENTIFIER {
authParams.InfisicalToken = token.Token
} else if token != nil && token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER {
authParams.UniversalAuthAccessToken = token.Token
}
secrets = util.ExpandSecrets(secrets, authParams, projectConfigDir)
}
secretsByKey := getSecretsByKeys(secrets)
environmentVariables := make(map[string]string)
// add all existing environment vars
for _, s := range os.Environ() {
kv := strings.SplitN(s, "=", 2)
key := kv[0]
value := kv[1]
environmentVariables[key] = value
}
// check to see if there are any reserved key words in secrets to inject
filterReservedEnvVars(secretsByKey)
// now add infisical secrets
for k, v := range secretsByKey {
environmentVariables[k] = v.Value
}
// turn it back into a list of envs
var env []string
for key, value := range environmentVariables {
s := key + "=" + value
env = append(env, s)
}
log.Debug().Msgf("injecting the following environment variables into shell: %v", env)
Telemetry.CaptureEvent("cli-command:run",
posthog.NewProperties().
Set("secretsCount", len(secrets)).
Set("environment", environmentName).
Set("isUsingServiceToken", token != nil && token.Type == util.SERVICE_TOKEN_IDENTIFIER).
Set("isUsingUniversalAuthToken", token != nil && token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER).
Set("single-command", strings.Join(args, " ")).
Set("multi-command", cmd.Flag("command").Value.String()).
Set("version", util.CLI_VERSION))
if cmd.Flags().Changed("command") {
command := cmd.Flag("command").Value.String()
err = executeMultipleCommandWithEnvs(command, len(secretsByKey), env)
if err != nil {
fmt.Println(err)
os.Exit(1)
}
} else {
err = executeSingleCommandWithEnvs(args, len(secretsByKey), env)
if err != nil {
fmt.Println(err)
os.Exit(1)
} else {
err = executeSingleCommandWithEnvs(args, injectableEnvironment.SecretsCount, injectableEnvironment.Variables)
if err != nil {
fmt.Println(err)
os.Exit(1)
}
}
}
},
}
var (
reservedEnvVars = []string{
"HOME", "PATH", "PS1", "PS2",
"PWD", "EDITOR", "XAUTHORITY", "USER",
"TERM", "TERMINFO", "SHELL", "MAIL",
}
reservedEnvVarPrefixes = []string{
"XDG_",
"LC_",
}
)
func filterReservedEnvVars(env map[string]models.SingleEnvironmentVariable) {
var (
reservedEnvVars = []string{
"HOME", "PATH", "PS1", "PS2",
"PWD", "EDITOR", "XAUTHORITY", "USER",
"TERM", "TERMINFO", "SHELL", "MAIL",
}
reservedEnvVarPrefixes = []string{
"XDG_",
"LC_",
}
)
for _, reservedEnvName := range reservedEnvVars {
if _, ok := env[reservedEnvName]; ok {
delete(env, reservedEnvName)
@@ -237,13 +208,15 @@ func filterReservedEnvVars(env map[string]models.SingleEnvironmentVariable) {
func init() {
rootCmd.AddCommand(runCmd)
runCmd.Flags().String("token", "", "Fetch secrets using service token or machine identity access token")
runCmd.Flags().String("token", "", "fetch secrets using service token or machine identity access token")
runCmd.Flags().String("projectId", "", "manually set the project ID to fetch secrets from when using machine identity based auth")
runCmd.Flags().StringP("env", "e", "dev", "Set the environment (dev, prod, etc.) from which your secrets should be pulled from")
runCmd.Flags().Bool("expand", true, "Parse shell parameter expansions in your secrets")
runCmd.Flags().Bool("include-imports", true, "Import linked secrets ")
runCmd.Flags().Bool("recursive", false, "Fetch secrets from all sub-folders")
runCmd.Flags().Bool("secret-overriding", true, "Prioritizes personal secrets, if any, with the same name over shared secrets")
runCmd.Flags().StringP("env", "e", "dev", "set the environment (dev, prod, etc.) from which your secrets should be pulled from")
runCmd.Flags().Bool("expand", true, "parse shell parameter expansions in your secrets")
runCmd.Flags().Bool("include-imports", true, "import linked secrets ")
runCmd.Flags().Bool("recursive", false, "fetch secrets from all sub-folders")
runCmd.Flags().Bool("secret-overriding", true, "prioritizes personal secrets, if any, with the same name over shared secrets")
runCmd.Flags().Bool("watch", false, "enable reload of application when secrets change")
runCmd.Flags().Int("watch-interval", 10, "interval in seconds to check for secret changes")
runCmd.Flags().StringP("command", "c", "", "chained commands to execute (e.g. \"npm install && npm run dev; echo ...\")")
runCmd.Flags().StringP("tags", "t", "", "filter secrets by tag slugs ")
runCmd.Flags().String("path", "/", "get secrets within a folder path")
@@ -263,7 +236,7 @@ func executeSingleCommandWithEnvs(args []string, secretsCount int, env []string)
cmd.Stderr = os.Stderr
cmd.Env = env
return execCmd(cmd)
return execBasicCmd(cmd)
}
func executeMultipleCommandWithEnvs(fullCommand string, secretsCount int, env []string) error {
@@ -286,11 +259,10 @@ func executeMultipleCommandWithEnvs(fullCommand string, secretsCount int, env []
log.Info().Msgf(color.GreenString("Injecting %v Infisical secrets into your application process", secretsCount))
log.Debug().Msgf("executing command: %s %s %s \n", shell[0], shell[1], fullCommand)
return execCmd(cmd)
return execBasicCmd(cmd)
}
// Credit: inspired by AWS Vault
func execCmd(cmd *exec.Cmd) error {
func execBasicCmd(cmd *exec.Cmd) error {
sigChannel := make(chan os.Signal, 1)
signal.Notify(sigChannel)
@@ -314,3 +286,217 @@ func execCmd(cmd *exec.Cmd) error {
os.Exit(waitStatus.ExitStatus())
return nil
}
func waitForExitCommand(cmd *exec.Cmd) (int, error) {
if err := cmd.Wait(); err != nil {
// ignore errors
cmd.Process.Signal(os.Kill) // #nosec G104
if exitError, ok := err.(*exec.ExitError); ok {
return exitError.ExitCode(), exitError
}
return 2, err
}
waitStatus, ok := cmd.ProcessState.Sys().(syscall.WaitStatus)
if !ok {
return 2, fmt.Errorf("unexpected ProcessState type, expected syscall.WaitStatus, got %T", waitStatus)
}
return waitStatus.ExitStatus(), nil
}
func executeCommandWithWatchMode(commandFlag string, args []string, watchModeInterval int, request models.GetAllSecretsParameters, projectConfigDir string, expandSecrets bool, secretOverriding bool, token *models.TokenDetails) {
var cmd *exec.Cmd
var err error
var lastSecretsFetch time.Time
var lastUpdateEvent time.Time
var watchMutex sync.Mutex
var processMutex sync.Mutex
var beingTerminated = false
var currentETag string
runCommandWithWatcher := func(environmentVariables models.InjectableEnvironmentResult) {
currentETag = environmentVariables.ETag
secretsFetchedAt := time.Now()
if secretsFetchedAt.After(lastSecretsFetch) {
lastSecretsFetch = secretsFetchedAt
}
shouldRestartProcess := cmd != nil
// terminate the old process before starting a new one
if shouldRestartProcess {
log.Info().Msg(color.HiMagentaString("[HOT RELOAD] Environment changes detected. Reloading process..."))
beingTerminated = true
log.Debug().Msgf(color.HiMagentaString("[HOT RELOAD] Sending SIGTERM to PID %d", cmd.Process.Pid))
if e := cmd.Process.Signal(syscall.SIGTERM); e != nil {
log.Error().Err(e).Msg(color.HiMagentaString("[HOT RELOAD] Failed to send SIGTERM"))
}
// wait up to 10 sec for the process to exit
for i := 0; i < 10; i++ {
if !util.IsProcessRunning(cmd.Process) {
// process has been killed so we break out
break
}
if i == 5 {
log.Debug().Msg(color.HiMagentaString("[HOT RELOAD] Still waiting for process exit status"))
}
time.Sleep(time.Second)
}
// SIGTERM may not work on Windows so we try SIGKILL
if util.IsProcessRunning(cmd.Process) {
log.Debug().Msg(color.HiMagentaString("[HOT RELOAD] Process still hasn't fully exited, attempting SIGKILL"))
if e := cmd.Process.Kill(); e != nil {
log.Error().Err(e).Msg(color.HiMagentaString("[HOT RELOAD] Failed to send SIGKILL"))
}
}
cmd = nil
} else {
// If `cmd` is nil, we know this is the first time we are starting the process
log.Info().Msg(color.HiMagentaString("[HOT RELOAD] Watching for secret changes..."))
}
processMutex.Lock()
if lastUpdateEvent.After(secretsFetchedAt) {
processMutex.Unlock()
return
}
beingTerminated = false
watcherWaitGroup.Add(1)
// start the process
log.Info().Msgf(color.GreenString("Injecting %v Infisical secrets into your application process", environmentVariables.SecretsCount))
cmd, err = util.RunCommand(commandFlag, args, environmentVariables.Variables, false)
if err != nil {
defer watcherWaitGroup.Done()
util.HandleError(err)
}
go func() {
defer processMutex.Unlock()
defer watcherWaitGroup.Done()
exitCode, err := waitForExitCommand(cmd)
// ignore errors if we are being terminated
if !beingTerminated {
if err != nil {
if strings.HasPrefix(err.Error(), "exec") || strings.HasPrefix(err.Error(), "fork/exec") {
log.Error().Err(err).Msg("Failed to execute command")
}
if err.Error() != ErrManualSignalInterrupt.Error() {
log.Error().Err(err).Msg("Process exited with error")
}
}
os.Exit(exitCode)
}
}()
}
recheckSecretsChannel := make(chan bool, 1)
recheckSecretsChannel <- true
// a simple goroutine that triggers the recheckSecretsChan every watch interval (defaults to 10 seconds)
go func() {
for {
time.Sleep(time.Duration(watchModeInterval) * time.Second)
recheckSecretsChannel <- true
}
}()
for {
<-recheckSecretsChannel
watchMutex.Lock()
newEnvironmentVariables, err := fetchAndFormatSecretsForShell(request, projectConfigDir, secretOverriding, expandSecrets, token)
if err != nil {
log.Error().Err(err).Msg("[HOT RELOAD] Failed to fetch secrets")
continue
}
if newEnvironmentVariables.ETag != currentETag {
runCommandWithWatcher(newEnvironmentVariables)
} else {
log.Debug().Msg("[HOT RELOAD] No changes detected in secrets, not reloading process")
}
watchMutex.Unlock()
}
}
func fetchAndFormatSecretsForShell(request models.GetAllSecretsParameters, projectConfigDir string, secretOverriding bool, shouldExpandSecrets bool, token *models.TokenDetails) (models.InjectableEnvironmentResult, error) {
if token != nil && token.Type == util.SERVICE_TOKEN_IDENTIFIER {
request.InfisicalToken = token.Token
} else if token != nil && token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER {
request.UniversalAuthAccessToken = token.Token
}
secrets, err := util.GetAllEnvironmentVariables(request, projectConfigDir)
if err != nil {
return models.InjectableEnvironmentResult{}, err
}
if secretOverriding {
secrets = util.OverrideSecrets(secrets, util.SECRET_TYPE_PERSONAL)
} else {
secrets = util.OverrideSecrets(secrets, util.SECRET_TYPE_SHARED)
}
if shouldExpandSecrets {
authParams := models.ExpandSecretsAuthentication{}
if token != nil && token.Type == util.SERVICE_TOKEN_IDENTIFIER {
authParams.InfisicalToken = token.Token
} else if token != nil && token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER {
authParams.UniversalAuthAccessToken = token.Token
}
secrets = util.ExpandSecrets(secrets, authParams, projectConfigDir)
}
secretsByKey := getSecretsByKeys(secrets)
environmentVariables := make(map[string]string)
// add all existing environment vars
for _, s := range os.Environ() {
kv := strings.SplitN(s, "=", 2)
key := kv[0]
value := kv[1]
environmentVariables[key] = value
}
// check to see if there are any reserved key words in secrets to inject
filterReservedEnvVars(secretsByKey)
// now add infisical secrets
for k, v := range secretsByKey {
environmentVariables[k] = v.Value
}
env := make([]string, 0, len(environmentVariables))
for key, value := range environmentVariables {
env = append(env, key+"="+value)
}
return models.InjectableEnvironmentResult{
Variables: env,
ETag: util.GenerateETagFromSecrets(secrets),
SecretsCount: len(secretsByKey),
}, nil
}

View File

@@ -104,6 +104,12 @@ type GetAllSecretsParameters struct {
Recursive bool
}
type InjectableEnvironmentResult struct {
Variables []string
ETag string
SecretsCount int
}
type GetAllFoldersParameters struct {
WorkspaceId string
Environment string

cli/packages/util/exec.go
View File

@@ -0,0 +1,92 @@
package util
import (
"fmt"
"os"
"os/exec"
"os/signal"
"runtime"
"syscall"
)
func RunCommand(singleCommand string, args []string, env []string, waitForExit bool) (*exec.Cmd, error) {
var c *exec.Cmd
var err error
if singleCommand != "" {
c, err = RunCommandFromString(singleCommand, env, waitForExit)
} else {
c, err = RunCommandFromArgs(args, env, waitForExit)
}
return c, err
}
func IsProcessRunning(p *os.Process) bool {
err := p.Signal(syscall.Signal(0))
return err == nil
}
// For "infisical run -- COMMAND"
func RunCommandFromArgs(args []string, env []string, waitForExit bool) (*exec.Cmd, error) {
cmd := exec.Command(args[0], args[1:]...)
cmd.Stdin = os.Stdin
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
cmd.Env = env
err := execCommand(cmd, waitForExit)
return cmd, err
}
func execCommand(cmd *exec.Cmd, waitForExit bool) error {
sigChannel := make(chan os.Signal, 1)
signal.Notify(sigChannel)
if err := cmd.Start(); err != nil {
return err
}
go func() {
for {
sig := <-sigChannel
_ = cmd.Process.Signal(sig) // process all sigs
}
}()
if !waitForExit {
return nil
}
if err := cmd.Wait(); err != nil {
_ = cmd.Process.Signal(os.Kill)
return fmt.Errorf("failed to wait for command termination: %v", err)
}
waitStatus := cmd.ProcessState.Sys().(syscall.WaitStatus)
os.Exit(waitStatus.ExitStatus())
return nil
}
// For "infisical run --command=COMMAND"
func RunCommandFromString(command string, env []string, waitForExit bool) (*exec.Cmd, error) {
shell := [2]string{"sh", "-c"}
if runtime.GOOS == "windows" {
shell = [2]string{"cmd", "/C"}
} else {
currentShell := os.Getenv("SHELL")
if currentShell != "" {
shell[0] = currentShell
}
}
cmd := exec.Command(shell[0], shell[1], command) // #nosec G204 nosemgrep: semgrep_configs.prohibit-exec-command
cmd.Env = env
cmd.Stdin = os.Stdin
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
err := execCommand(cmd, waitForExit)
return cmd, err
}

View File

@@ -4,6 +4,7 @@ import (
"bytes"
"crypto/sha256"
"encoding/base64"
"encoding/hex"
"fmt"
"math/rand"
"os"
@@ -298,3 +299,16 @@ func GenerateRandomString(length int) string {
}
return string(b)
}
func GenerateETagFromSecrets(secrets []models.SingleEnvironmentVariable) string {
sortedSecrets := SortSecretsByKeys(secrets)
content := []byte{}
for _, secret := range sortedSecrets {
content = append(content, []byte(secret.Key)...)
content = append(content, []byte(secret.Value)...)
}
hash := sha256.Sum256(content)
return fmt.Sprintf(`"%s"`, hex.EncodeToString(hash[:]))
}

View File

@@ -47,20 +47,20 @@ $ infisical run -- npm run dev
Used to fetch secrets via a [machine identity](/documentation/platform/identities/machine-identities) as opposed to logged-in credentials. Simply export this variable in the terminal before running this command.
```bash
# Example
export INFISICAL_TOKEN=$(infisical login --method=universal-auth --client-id=<identity-client-id> --client-secret=<identity-client-secret> --silent --plain) # --plain flag will output only the token, so it can be fed to an environment variable. --silent will disable any update messages.
# Example
export INFISICAL_TOKEN=$(infisical login --method=universal-auth --client-id=<identity-client-id> --client-secret=<identity-client-secret> --silent --plain) # --plain flag will output only the token, so it can be fed to an environment variable. --silent will disable any update messages.
```
<Info>
Alternatively, you may use service tokens.
Please note, however, that service tokens are being deprecated in favor of [machine identities](/documentation/platform/identities/machine-identities). They will be removed in the future in accordance with the deprecation notice and timeline stated [here](https://infisical.com/blog/deprecating-api-keys).
```bash
# Example
export INFISICAL_TOKEN=<service-token>
# Example
export INFISICAL_TOKEN=<service-token>
```
</Info>
</Info>
</Accordion>
<Accordion title="INFISICAL_DISABLE_UPDATE_CHECK">
@@ -69,22 +69,30 @@ $ infisical run -- npm run dev
To use, simply export this variable in the terminal before running this command.
```bash
# Example
export INFISICAL_DISABLE_UPDATE_CHECK=true
# Example
export INFISICAL_DISABLE_UPDATE_CHECK=true
```
</Accordion>
### Flags
<Accordion title="--project-config-dir">
<Accordion title="--watch">
By passing the `--watch` flag, you are telling the CLI to watch for changes in your Infisical project.
If any secrets change, the command you provided is automatically restarted with the new environment variables attached.
```bash
# Example
infisical run --watch -- printenv
```
</Accordion>
<Accordion title="--project-config-dir">
Explicitly set the directory where the .infisical.json resides. This is useful for some monorepo setups.
```bash
# Example
infisical run --project-config-dir=/some-dir -- printenv
# Example
infisical run --project-config-dir=/some-dir -- printenv
```
</Accordion>
<Accordion title="--command">
@@ -172,3 +180,19 @@ $ infisical run -- npm run dev
</Accordion>
</Accordion>
## Automatically reload command when secrets change
To automatically reload your command when secrets change, use the `--watch` flag.
```bash
infisical run --watch -- npm run dev
```
This will watch for changes in your secrets and automatically restart your command with the new secrets.
When your command restarts, it will have the new environment variables injected into it.
<Note>
Please note that this feature is intended for development purposes. It is not recommended for production environments, where automatically reloading your application on remote changes is generally discouraged.
</Note>

View File

@@ -58,7 +58,7 @@ The Infisical AWS ElastiCache dynamic secret allows you to generate AWS ElastiCa
Open the Secret Overview dashboard and select the environment in which you would like to add a dynamic secret.
</Step>
<Step title="Click on the 'Add Dynamic Secret' button">
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button-redis.png)
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button.png)
</Step>
<Step title="Select 'AWS ElastiCache'">
![Dynamic Secret Modal](../../../images/platform/dynamic-secrets/dynamic-secret-modal-aws-elasti-cache.png)
@@ -116,7 +116,7 @@ The Infisical AWS ElastiCache dynamic secret allows you to generate AWS ElastiCa
When generating these secrets, it's important to specify a Time-to-Live (TTL) duration. This will dictate how long the credentials are valid for.
![Provision Lease](/images/platform/dynamic-secrets/provision-lease-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/provision-lease.png)
<Tip>
Ensure that the TTL for the lease falls within the maximum TTL defined when configuring the dynamic secret.
@@ -125,7 +125,7 @@ The Infisical AWS ElastiCache dynamic secret allows you to generate AWS ElastiCa
Once you click the `Submit` button, a new secret lease will be generated and the credentials from it will be shown to you.
![Provision Lease](/images/platform/dynamic-secrets/lease-values-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/lease-values.png)
</Step>
</Steps>
@@ -133,11 +133,11 @@ The Infisical AWS ElastiCache dynamic secret allows you to generate AWS ElastiCa
Once you have created one or more leases, you will be able to access them by clicking on the respective dynamic secret item on the dashboard.
This will allow you to see the expiration time of the lease or delete a lease before its set time to live.
![Provision Lease](/images/platform/dynamic-secrets/lease-data-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/lease-data.png)
## Renew Leases
To extend the life of a generated dynamic secret lease past its initial time to live, simply click **Renew** as illustrated below.
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew.png)
<Warning>
Lease renewals cannot exceed the maximum TTL set when configuring the dynamic secret

View File

@@ -23,7 +23,7 @@ The Infisical Elasticsearch dynamic secret allows you to generate Elasticsearch
Open the Secret Overview dashboard and select the environment in which you would like to add a dynamic secret.
</Step>
<Step title="Click on the 'Add Dynamic Secret' button">
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button-redis.png)
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button.png)
</Step>
<Step title="Select 'Elasticsearch'">
![Dynamic Secret Modal](../../../images/platform/dynamic-secrets/dynamic-secret-modal-elastic-search.png)
@@ -99,7 +99,7 @@ The Infisical Elasticsearch dynamic secret allows you to generate Elasticsearch
When generating these secrets, it's important to specify a Time-to-Live (TTL) duration. This will dictate how long the credentials are valid for.
![Provision Lease](/images/platform/dynamic-secrets/provision-lease-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/provision-lease.png)
<Tip>
Ensure that the TTL for the lease falls within the maximum TTL defined when configuring the dynamic secret.
@@ -108,7 +108,7 @@ The Infisical Elasticsearch dynamic secret allows you to generate Elasticsearch
Once you click the `Submit` button, a new secret lease will be generated and the credentials from it will be shown to you.
![Provision Lease](/images/platform/dynamic-secrets/lease-values-elastic-search.png)
![Provision Lease](/images/platform/dynamic-secrets/lease-values.png)
</Step>
</Steps>
@@ -116,11 +116,11 @@ The Infisical Elasticsearch dynamic secret allows you to generate Elasticsearch
Once you have created one or more leases, you will be able to access them by clicking on the respective dynamic secret item on the dashboard.
This will allow you to see the expiration time of the lease or delete a lease before its set time to live.
![Provision Lease](/images/platform/dynamic-secrets/lease-data-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/lease-data.png)
## Renew Leases
To extend the life of a generated dynamic secret lease past its initial time to live, simply click **Renew** as illustrated below.
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew.png)
<Warning>
Lease renewals cannot exceed the maximum TTL set when configuring the dynamic secret

View File

@@ -0,0 +1,116 @@
---
title: "RabbitMQ"
description: "Learn how to dynamically generate RabbitMQ user credentials."
---
The Infisical RabbitMQ dynamic secret allows you to generate RabbitMQ credentials on demand based on a configured role.
## Prerequisites
1. Ensure that the `management` plugin is enabled on your RabbitMQ instance. This is required for the dynamic secret to work.
## Set up Dynamic Secrets with RabbitMQ
<Steps>
<Step title="Open Secret Overview Dashboard">
Open the Secret Overview dashboard and select the environment in which you would like to add a dynamic secret.
</Step>
<Step title="Click on the 'Add Dynamic Secret' button">
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button.png)
</Step>
<Step title="Select 'RabbitMQ'">
![Dynamic Secret Modal](../../../images/platform/dynamic-secrets/dynamic-secret-modal-rabbit-mq.png)
</Step>
<Step title="Provide the inputs for dynamic secret parameters">
<ParamField path="Secret Name" type="string" required>
Name by which you want the secret to be referenced
</ParamField>
<ParamField path="Default TTL" type="string" required>
Default time-to-live for a generated secret (it is possible to modify this value when a secret is generated)
</ParamField>
<ParamField path="Max TTL" type="string" required>
Maximum time-to-live for a generated secret.
</ParamField>
<ParamField path="Host" type="string" required>
Your RabbitMQ host. This must be in HTTP format. _(Example: http://your-cluster-ip)_
</ParamField>
<ParamField path="Port" type="string" required>
The port that the RabbitMQ management plugin is listening on. This is `15672` by default.
</ParamField>
<ParamField path="Virtual host name" type="string" required>
The name of the virtual host that the user will be assigned to. This defaults to `/`.
</ParamField>
<ParamField path="Virtual host permissions (Read/Write/Configure)" type="string" required>
The permissions that the user will have on the virtual host. This defaults to `.*`.
The three permission fields each take a regular expression _(regex)_ that should match the resource names for which the user is granted read / write / configure permissions.
</ParamField>
<ParamField path="Username" type="string" required>
The username of the user that will be used to provision new dynamic secret leases.
</ParamField>
<ParamField path="Password" type="string" required>
The password of the user that will be used to provision new dynamic secret leases.
</ParamField>
<ParamField path="CA(SSL)" type="string">
A CA certificate may be required if your RabbitMQ instance requires it for incoming connections. This is often the case when connecting to a managed service.
</ParamField>
![Dynamic Secret Setup Modal](../../../images/platform/dynamic-secrets/dynamic-secret-input-modal-rabbit-mq.png)
</Step>
<Step title="Click `Submit`">
After submitting the form, you will see a dynamic secret created in the dashboard.
<Note>
If this step fails, you may have to add the CA certificate.
</Note>
</Step>
<Step title="Generate dynamic secrets">
Once you've successfully configured the dynamic secret, you're ready to generate on-demand credentials.
To do this, simply click on the 'Generate' button which appears when hovering over the dynamic secret item.
Alternatively, you can initiate the creation of a new lease by selecting 'New Lease' from the dynamic secret lease list section.
![Dynamic Secret](/images/platform/dynamic-secrets/dynamic-secret-generate-redis.png)
![Dynamic Secret](/images/platform/dynamic-secrets/dynamic-secret-lease-empty-redis.png)
When generating these secrets, it's important to specify a Time-to-Live (TTL) duration. This will dictate how long the credentials are valid for.
![Provision Lease](/images/platform/dynamic-secrets/provision-lease.png)
<Tip>
Ensure that the TTL for the lease falls within the maximum TTL defined when configuring the dynamic secret.
</Tip>
Once you click the `Submit` button, a new secret lease will be generated and the credentials from it will be shown to you.
![Provision Lease](/images/platform/dynamic-secrets/lease-values.png)
</Step>
</Steps>
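To sanity-check a freshly generated lease, you can exercise the leased credentials against the RabbitMQ management API. This is a hedged sketch; the host, port, and leased username/password are placeholders:
```bash
# Confirm the leased user can authenticate against the management API
curl -u '<leased-username>:<leased-password>' http://your-cluster-ip:15672/api/whoami
```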
## Audit or Revoke Leases
Once you have created one or more leases, you will be able to access them by clicking on the respective dynamic secret item on the dashboard.
This will allow you to see the expiration time of the lease or delete a lease before its set time to live.
![Provision Lease](/images/platform/dynamic-secrets/lease-data.png)
## Renew Leases
To extend the life of a generated dynamic secret lease past its initial time to live, simply click **Renew** as illustrated below.
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew.png)
<Warning>
Lease renewals cannot exceed the maximum TTL set when configuring the dynamic secret.
</Warning>

View File

@@ -16,7 +16,7 @@ Create a user with the required permission in your Redis instance. This user wil
Open the Secret Overview dashboard and select the environment in which you would like to add a dynamic secret.
</Step>
<Step title="Click on the 'Add Dynamic Secret' button">
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button-redis.png)
![Add Dynamic Secret Button](../../../images/platform/dynamic-secrets/add-dynamic-secret-button.png)
</Step>
<Step title="Select 'Redis'">
![Dynamic Secret Modal](../../../images/platform/dynamic-secrets/dynamic-secret-modal-redis.png)
@@ -78,7 +78,7 @@ Create a user with the required permission in your Redis instance. This user wil
When generating these secrets, it's important to specify a Time-to-Live (TTL) duration. This will dictate how long the credentials are valid for.
![Provision Lease](/images/platform/dynamic-secrets/provision-lease-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/provision-lease.png)
<Tip>
Ensure that the TTL for the lease falls within the maximum TTL defined when configuring the dynamic secret.
@@ -87,7 +87,7 @@ Create a user with the required permission in your Redis instance. This user wil
Once you click the `Submit` button, a new secret lease will be generated and the credentials from it will be shown to you.
![Provision Lease](/images/platform/dynamic-secrets/lease-values-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/lease-values.png)
</Step>
</Steps>
@@ -95,11 +95,11 @@ Create a user with the required permission in your Redis instance. This user wil
Once you have created one or more leases, you will be able to access them by clicking on the respective dynamic secret item on the dashboard.
This will allow you to see the expiration time of the lease or delete a lease before its set time to live.
![Provision Lease](/images/platform/dynamic-secrets/lease-data-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/lease-data.png)
## Renew Leases
To extend the life of a generated dynamic secret lease past its initial time to live, simply click **Renew** as illustrated below.
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew-redis.png)
![Provision Lease](/images/platform/dynamic-secrets/dynamic-secret-lease-renew.png)
<Warning>
Lease renewals cannot exceed the maximum TTL set when configuring the dynamic secret.

View File

@@ -93,7 +93,12 @@ In the following steps, we explore how to create and use identities to access th
- Access Token Max TTL (default is `2592000`, equivalent to 30 days): The maximum lifetime for an access token in seconds. This value will be referenced at renewal time.
- Access Token Max Number of Uses (default is `0`): The maximum number of times that an access token can be used; a value of `0` implies an infinite number of uses.
- Access Token Trusted IPs: The IPs or CIDR ranges that access tokens can be used from. By default, each token is given the `0.0.0.0/0` entry, allowing usage from any network address.
<Info>
The `subject`, `audiences`, and `claims` fields support glob pattern matching; however, we highly recommend using hardcoded values whenever possible.
</Info>
</Step>
<Step title="Adding an identity to a project">
To enable the identity to access project-level resources such as secrets within a specific project, you should add it to that project.

View File

@@ -92,8 +92,8 @@ In the following steps, we explore how to create and use identities to access th
- Access Token Max TTL (default is `2592000`, equivalent to 30 days): The maximum lifetime for an access token in seconds. This value will be referenced at renewal time.
- Access Token Max Number of Uses (default is `0`): The maximum number of times that an access token can be used; a value of `0` implies an infinite number of uses.
- Access Token Trusted IPs: The IPs or CIDR ranges that access tokens can be used from. By default, each token is given the `0.0.0.0/0` entry, allowing usage from any network address.
<Tip>If you are unsure about what to configure for the subject, audience, and claims fields, you can use [github/actions-oidc-debugger](https://github.com/github/actions-oidc-debugger) to get the appropriate values. Alternatively, you can fetch the JWT from the workflow and inspect the fields manually, as sketched below.</Tip>
<Info>The `subject`, `audiences`, and `claims` fields support glob pattern matching; however, we highly recommend using hardcoded values whenever possible.</Info>
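For example, a minimal sketch of fetching and decoding the workflow's OIDC token from a job step. It assumes the job has the `id-token: write` permission and that `jq` is available; the audience value here is an assumption and should match what you configure in Infisical:
```bash
# Request the OIDC token GitHub injects into the workflow run
JWT=$(curl -s -H "Authorization: bearer $ACTIONS_ID_TOKEN_REQUEST_TOKEN" \
  "$ACTIONS_ID_TOKEN_REQUEST_URL&audience=https://app.infisical.com" | jq -r '.value')

# Decode the payload (second JWT segment) to inspect the subject, audience, and claims
PAYLOAD=$(echo "$JWT" | cut -d '.' -f2 | tr '_-' '/+')
while [ $(( ${#PAYLOAD} % 4 )) -ne 0 ]; do PAYLOAD="${PAYLOAD}="; done
echo "$PAYLOAD" | base64 -d | jq .
```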
</Step>
<Step title="Adding an identity to a project">
To enable the identity to access project-level resources such as secrets within a specific project, you should add it to that project.

View File

@@ -1,111 +0,0 @@
---
title: "Certificate Templates"
sidebarTitle: "Certificate Templates"
description: "Learn how to use certificate templates to enforce policies."
---
## Concept
In order to ensure your certificates follow certain policies, you can use certificate templates during the issuance and signing flows.
A certificate template is linked to a certificate authority. It contains custom policies for certificate fields, allowing you to define rules based on your security policies.
## Workflow
The typical workflow for using certificate templates consists of the following steps:
1. Creating a certificate template attached to an existing CA along with defining custom rules for certificate fields.
2. Selecting the certificate template during the creation of new certificates.
<Note>
Note that this workflow can be executed via the Infisical UI or manually, such as via the API.
</Note>
## Guide to using Certificate Templates
In the following steps, we explore how to issue an X.509 certificate using a certificate template.
<Tabs>
<Tab title="Infisical UI">
<Steps>
<Step title="Creating the certificate template">
To create a certificate template, head to your Project > Internal PKI > Certificate Templates and press **Create Certificate Template**.
![certificate-template create template dashboard](/images/platform/pki/certificate-template/create-template-dashboard.png)
Here, set the **Issuing CA** to the CA you want to issue certificates under when the certificate template is used.
![certificate-template create template modal](/images/platform/pki/certificate-template/create-template-form.png)
Here's some guidance on each field:
- Template Name: A descriptive name for the certificate template.
- Issuing CA: The Certificate Authority (CA) that will issue certificates based on this template.
- Certificate Collection: The collection where certificates issued with this template will be added.
- Common Name (CN): The regular expression used to validate the common name in certificate requests.
- Alternative Names (SANs): The regular expression used to validate subject alternative names in certificate requests.
- TTL: The maximum Time-to-Live (TTL) for certificates issued using this template.
</Step>
<Step title="Using the certificate template">
Once you have created the certificate template from step 1, you can select it when issuing certificates.
![certificate-template select template](/images/platform/pki/certificate-template/select-template.png)
</Step>
</Steps>
</Tab>
<Tab title="API">
<Steps>
<Step title="Creating the certificate template">
To create a certificate template, make an API request to the [Create Certificate Template](/api-reference/endpoints/certificate-templates/create) API endpoint.
### Sample request
```bash Request
curl --request POST \
--url https://app.infisical.com/api/v1/pki/certificate-templates \
--header 'Content-Type: application/json' \
--data '{
"caId": "<string>",
"pkiCollectionId": "<string>",
"name": "<string>",
"commonName": "<string>",
"subjectAlternativeName": "<string>",
"ttl": "<string>"
}'
```
### Sample response
```bash Response
{
"id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
"caId": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
"name": "certificate-template-1",
"commonName": "<string>",
...
}
```
</Step>
<Step title="Using the certificate template">
To use the certificate template, attach the certificate template ID when invoking the API endpoint for [issuing](/api-reference/endpoints/certificates/issue-certificate) or [signing](/api-reference/endpoints/certificates/sign-certificate) new certificates.
### Sample request
```bash Request
curl --request POST \
--url https://app.infisical.com/api/v1/pki/certificates/issue-certificate \
--header 'Content-Type: application/json' \
--data '{
"certificateTemplateId": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
"friendlyName": "my-new-certificate",
"commonName": "CERT",
...
}'
```
</Step>
</Steps>
</Tab>
</Tabs>

View File

@@ -25,7 +25,7 @@ graph TD
The typical workflow for managing certificates consists of the following steps:
1. Issuing a certificate under an intermediate CA with details like name and validity period.
1. Issuing a certificate under an intermediate CA with details like name and validity period. As part of certificate issuance, you can either issue a certificate directly from a CA or do it via a certificate template.
2. Managing certificate lifecycle events such as certificate renewal and revocation. As part of the certificate revocation flow,
you can also query for a Certificate Revocation List [CRL](https://en.wikipedia.org/wiki/Certificate_revocation_list), a time-stamped, signed
data structure issued by a CA containing a list of revoked certificates to check if a certificate has been revoked.
@@ -43,28 +43,51 @@ In the following steps, we explore how to issue a X.509 certificate under a CA.
<Tab title="Infisical UI">
<Steps>
<Step title="Creating a certificate template">
A certificate template is a set of policies for certificates issued under that template; each template is bound to a specific CA and can also be bound to a certificate collection for alerting such that any certificate issued under the template is automatically added to the collection.
With certificate templates, you can specify, for example, that issued certificates must have a common name (CN) adhering to a specific format like `.*.acme.com` or perhaps that the max TTL cannot be more than 1 year.
Head to your Project > Certificate Authorities > Your Issuing CA and create a certificate template.
![pki certificate template modal](/images/platform/pki/certificate/cert-template-modal.png)
Here's some guidance on each field:
- Template Name: A name for the certificate template.
- Issuing CA: The Certificate Authority (CA) that will issue certificates based on this template.
- Certificate Collection (Optional): The certificate collection that certificates should be added to when issued under the template.
- Common Name (CN): A regular expression used to validate the common name in certificate requests.
- Alternative Names (SANs): A regular expression used to validate subject alternative names in certificate requests.
- TTL: The maximum Time-to-Live (TTL) for certificates issued using this template.
</Step>
<Step title="Creating a certificate">
To create a certificate, head to your Project > Internal PKI > Certificates and press **Create Certificate**.
To create a certificate, head to your Project > Internal PKI > Certificates and press **Issue** under the Certificates section.
![pki issue certificate](/images/platform/pki/cert-issue.png)
![pki issue certificate](/images/platform/pki/certificate/cert-issue.png)
Here, set the **CA** to the CA you want to issue the certificate under and fill out details for the certificate.
Here, set the **Certificate Template** to the template from step 1 and fill out the rest of the details for the certificate to be issued.
![pki issue certificate modal](/images/platform/pki/cert-issue-modal.png)
![pki issue certificate modal](/images/platform/pki/certificate/cert-issue-modal.png)
Here's some guidance on each field:
- Issuing CA: The CA under which to issue the certificate.
- Friendly Name: A friendly name for the certificate; this is only for display and defaults to the common name of the certificate if left empty.
- Common Name (CN): The (common) name for the certificate like `service.acme.com`.
- Alternative Names (SANs): A comma-delimited list of Subject Alternative Names (SANs) for the certificate; these can be host names or email addresses like `app1.acme.com, app2.acme.com`.
- TTL: The lifetime of the certificate in seconds.
<Note>
Note that Infisical PKI supports issuing certificates without certificate templates as well. If this is desired, then you can set the **Certificate Template** field to **None**
and specify the **Issuing CA** and optional **Certificate Collection** fields; the rest of the fields for the issued certificate remain the same.
That said, we recommend using certificate templates to enforce policies and attach expiration monitoring on issued certificates.
</Note>
</Step>
<Step title="Copying the certificate details">
Once you have created the certificate from step 2, you'll be presented with the certificate details including the **Certificate Body**, **Certificate Chain**, and **Private Key**.
![pki certificate body](/images/platform/pki/cert-body.png)
![pki certificate body](/images/platform/pki/certificate/cert-body.png)
<Note>
Make sure to download and store the **Private Key** in a secure location as it will only be displayed once at the time of certificate issuance.
@@ -74,16 +97,54 @@ In the following steps, we explore how to issue a X.509 certificate under a CA.
</Steps>
</Tab>
<Tab title="API">
To create a certificate, make an API request to the [Issue Certificate](/api-reference/endpoints/certificates/issue-cert) API endpoint,
<Steps>
<Step title="Creating a certificate template">
A certificate template is a set of policies for certificates issued under that template; each template is bound to a specific CA and can also be bound to a certificate collection for alerting such that any certificate issued under the template is automatically added to the collection.
With certificate templates, you can specify, for example, that issued certificates must have a common name (CN) adhering to a specific format like `.*.acme.com` or perhaps that the max TTL cannot be more than 1 year.
To create a certificate template, make an API request to the [Create Certificate Template](/api-reference/endpoints/certificate-templates/create) API endpoint, specifying the issuing CA.
### Sample request
```bash Request
curl --location --request POST 'https://app.infisical.com/api/v1/pki/certificate-templates' \
--header 'Content-Type: application/json' \
--data-raw '{
"caId": "<ca-id>",
"name": "My Certificate Template",
"commonName": ".*.acme.com",
"subjectAlternativeName": ".*.acme.com",
"ttl": "1y",
}'
```
### Sample response
```bash Response
{
"id": "...",
"caId": "...",
"name": "...",
"commonName": "...",
"subjectAlternativeName": "...",
"ttl": "..."
}
```
</Step>
<Step title="Creating a certificate">
To create a certificate under the certificate template, make an API request to the [Issue Certificate](/api-reference/endpoints/certificates/issue-cert) API endpoint,
specifying the issuing CA.
### Sample request
```bash Request
curl --location --request POST 'https://app.infisical.com/api/v1/pki/ca/<ca-id>/issue-certificate' \
curl --location --request POST 'https://app.infisical.com/api/v1/pki/certificates/issue-certificate' \
--header 'Content-Type: application/json' \
--data-raw '{
"commonName": "My Certificate",
"certificateTemplateId": "<certificate-template-id>",
"commonName": "service.acme.com",
"ttl": "1y",
}'
```
@@ -100,18 +161,26 @@ In the following steps, we explore how to issue a X.509 certificate under a CA.
}
```
<Note>
Note that Infisical PKI supports issuing certificates without certificate templates as well. If this is desired, then you can set the **Certificate Template** field to **None**
and specify the **Issuing CA** and optional **Certificate Collection** fields; the rest of the fields for the issued certificate remain the same.
That said, we recommend using certificate templates to enforce policies and attach expiration monitoring on issued certificates.
</Note>
<Note>
Make sure to store the `privateKey` as it is only returned once, at the time of certificate issuance. The `certificate` and `certificateChain` will remain accessible and can be retrieved at any time.
</Note>
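For instance, a hedged sketch of persisting the returned materials, assuming the JSON body from the issue-certificate call above is stored in a `RESPONSE` shell variable and `jq` is installed:
```bash
# Extract the certificate materials returned at issuance time into PEM files
echo "$RESPONSE" | jq -r '.certificate'      > cert.pem
echo "$RESPONSE" | jq -r '.certificateChain' > chain.pem
echo "$RESPONSE" | jq -r '.privateKey'       > key.pem

# Quick sanity check on the issued certificate
openssl x509 -in cert.pem -noout -subject -dates
```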
If you have an external private key, you can also create a certificate by making an API request containing a pem-encoded CSR (Certificate Signing Request) to the [Sign Certificate](/api-reference/endpoints/certificates/sign-cert) API endpoint, specifying the issuing CA.
If you have an external private key, you can also create a certificate by making an API request containing a pem-encoded CSR (Certificate Signing Request) to the [Sign Certificate](/api-reference/endpoints/certificates/sign-certificate) API endpoint, specifying the issuing CA.
### Sample request
```bash Request
curl --location --request POST 'https://app.infisical.com/api/v1/pki/ca/<ca-id>/sign-certificate' \
curl --location --request POST 'https://app.infisical.com/api/v1/pki/certificates/sign-certificate' \
--header 'Content-Type: application/json' \
--data-raw '{
"certificateTemplateId": "<certificate-template-id>",
"csr": "...",
"ttl": "1y",
}'
@@ -128,7 +197,8 @@ In the following steps, we explore how to issue a X.509 certificate under a CA.
serialNumber: "..."
}
```
</Step>
</Steps>
</Tab>
</Tabs>

View File

@@ -26,7 +26,7 @@ These endpoints are exposed on port 8443 under the .well-known/est path e.g.
## Guide to configuring EST
1. Set up a certificate template with your selected issuing CA. This template will define the policies and parameters for certificates issued through EST. For detailed instructions on configuring a certificate template, refer to the certificate templates [documentation](/documentation/platform/pki/certificate-templates).
1. Set up a certificate template with your selected issuing CA. This template will define the policies and parameters for certificates issued through EST. For detailed instructions on configuring a certificate template, refer to the certificate templates [documentation](/documentation/platform/pki/certificates#guide-to-issuing-certificates).
2. Proceed to the certificate template's enrollment settings.
![est enrollment dashboard](/images/platform/pki/est/template-enroll-hover.png)
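Once enrollment is configured, a minimal connectivity check might look like the sketch below. This assumes the standard RFC 7030 `cacerts` path and a placeholder hostname; your Infisical deployment may scope EST paths per certificate template, so adjust accordingly:
```bash
# Fetch the CA certificates advertised by the EST service (RFC 7030 cacerts)
curl --cacert est-ca.pem https://infisical.example.com:8443/.well-known/est/cacerts
```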

View File

@@ -214,10 +214,13 @@ In the following steps, we explore how to install the Infisical PKI Issuer using
Data
====
ca.crt: 1306 bytes
tls.crt: 2380 bytes
tls.key: 227 bytes
tls.crt: 912 bytes
```
Here, `ca.crt` is the Root CA certificate, `tls.crt` is the requested certificate followed by the certificate chain, and `tls.key` is the private key for the certificate.
We can decode the certificate and print it out using `openssl`:
```bash
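# A hedged sketch (the secret name and namespace below are placeholders):
# pull the issued certificate out of the Kubernetes secret and decode it
kubectl get secret <certificate-secret-name> -n <namespace> \
  -o jsonpath='{.data.tls\.crt}' | base64 -d | openssl x509 -noout -text
```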

View File

@@ -66,6 +66,7 @@ consisting of an (optional) root CA and an intermediate CA.
- State or Province Name: The state or province.
- Locality Name: The city or locality.
- Common Name: The name of the CA.
- Require Template for Certificate Issuance: Whether or not certificates for this CA can only be issued through certificate templates (recommended).
<Note>
The Organization, Country, State or Province Name, Locality Name, and Common Name make up the **Distinguished Name (DN)** or **subject** of the CA.

View File

@@ -49,8 +49,8 @@ description: "Learn how to configure Microsoft Entra ID for Infisical SSO."
Back in the **Set up Single Sign-On with SAML** screen, select **Edit** in the **Attributes & Claims** section and configure the following map:
- `email -> user.userprincipalname`
- `firstName -> user.firstName`
- `lastName -> user.lastName`
- `firstName -> user.givenname`
- `lastName -> user.surname`
![Azure SAML edit attributes and claims](../../../images/sso/azure/edit-attributes-claims.png)
@@ -62,7 +62,7 @@ description: "Learn how to configure Microsoft Entra ID for Infisical SSO."
![Azure SAML edit certificate signing option](../../../images/sso/azure/edit-saml-certificate-2.png)
</Step>
<Step title="Retrieve Identity Provider (IdP) Information from Okta">
<Step title="Retrieve Identity Provider (IdP) Information from Azure">
In the **Set up Single Sign-On with SAML** screen, copy the **Login URL** and **SAML Certificate** to use when finishing configuring Azure SAML in Infisical.
![Azure SAML identity provider values 1](../../../images/sso/azure/idp-values.png)
@@ -115,4 +115,4 @@ description: "Learn how to configure Microsoft Entra ID for Infisical SSO."
- `AUTH_SECRET`: A secret key used for signing and verifying JWT. This can be a random 32-byte base64 string generated with `openssl rand -base64 32`.
- `SITE_URL`: The URL of your self-hosted instance of Infisical - should be an absolute URL including the protocol (e.g. https://app.infisical.com)
</Note>
</Note>

View File

@@ -0,0 +1,142 @@
---
title: "Slack integration"
description: "Learn how to set up the Slack integration"
---
This guide provides step-by-step instructions on how to configure the Slack integration for your Infisical projects.
<Tabs>
<Tab title="Infisical Cloud">
## Create Slack workflow integration
<Steps>
<Step title="Navigate to the Workflow Integrations tab in your organization settings">
In order to use Slack integration in your projects, you will first have to
configure a Slack workflow integration in your organization.
![org-slack-overview](/images/platform/workflow-integrations/slack-integration/org-slack-integration-overview.png)
</Step>
<Step title="Install Slack app to workspace">
Press "Add" and select "Slack" as the platform.
![org-slack-initial-add](/images/platform/workflow-integrations/slack-integration/org-slack-integration-initial-add.png)
Give your Slack integration a descriptive alias. You will use this to select the Slack integration for your project.
![org-slack-add-form](/images/platform/workflow-integrations/slack-integration/org-slack-integration-add-form.png)
Press **Connect Slack**. This opens up the Slack app installation flow. Select the Slack workspace you want to install the custom Slack app to and press **Allow**.
![org-slack-authenticate](/images/platform/workflow-integrations/slack-integration/cloud-org-slack-integration-authenticate.png)
This completes the workflow integration creation flow. The projects in your organization can now use this Slack integration to send real-time updates to your Slack workspace.
![org-slack-workspace](/images/platform/workflow-integrations/slack-integration/cloud-org-slack-integration-workspace.png)
![org-slack-created](/images/platform/workflow-integrations/slack-integration/org-slack-integration-created.png)
</Step>
</Steps>
## Configure project to use Slack workflow integration
<Steps>
<Step title="Navigate to the Workflow Integrations tab in the project settings">
![project-slack-overview](/images/platform/workflow-integrations/slack-integration/project-slack-integration-overview.png)
</Step>
<Step title="Select the Slack integration to use for the project">
Your project will send notifications to the connected Slack workspace of the
selected Slack integration when the configured events are triggered.
![project-slack-select](/images/platform/workflow-integrations/slack-integration/project-slack-integration-select.png)
</Step>
<Step title="Configure the Slack notification settings for the project and click Save.">
![project-slack-select](/images/platform/workflow-integrations/slack-integration/project-slack-integration-config.png)
<Info>
To enable notifications in private Slack channels, you need to invite the Infisical Slack bot to join those channels.
</Info>
You now have a working native integration with Slack!
</Step>
</Steps>
</Tab>
<Tab title="Self-hosted setup">
## Configure admin settings
Note that this step only has to be done once for the entire instance.
<Steps>
<Step title="Navigate to the Integrations tab in the Admin settings">
Before anything else, you need to set up the Slack app to be used by
your Infisical instance. Because you're self-hosting, you will need to
create this Slack application as demonstrated in the following step.
![admin-settings-slack-overview](/images/platform/workflow-integrations/slack-integration/admin-slack-integration-overview.png)
</Step>
<Step title="Create Slack app">
Click the "Create Slack app" button. This will open up a new window with the
custom app creation flow on Slack.
![admin-slack-create-app](/images/platform/workflow-integrations/slack-integration/admin-slack-integration-create-app.png)
Select the Slack workspace you want to integrate with Infisical.
![admin-slack-app-workspace-select](/images/platform/workflow-integrations/slack-integration/admin-slack-integration-app-workspace-select.png)
The configuration values of your custom Slack app will be pre-filled for you. You can view or edit the app manifest by clicking **Edit Configurations**.
![admin-slack-app-summary](/images/platform/workflow-integrations/slack-integration/admin-slack-integration-app-summary.png)
Once everything's confirmed, press Create.
</Step>
<Step title="Input app credentials from Slack">
Copy the Client ID and Client Secret values from your newly created custom Slack app and add them to Infisical.
![admin-slack-app-credentials](/images/platform/workflow-integrations/slack-integration/admin-slack-integration-app-credentials.png)
![admin-slack-app-credentials-form](/images/platform/workflow-integrations/slack-integration/admin-slack-integration-app-credential-form.png)
Complete the admin setup by pressing Save.
</Step>
</Steps>
## Create Slack workflow integration
<Steps>
<Step title="Navigate to the Workflow Integrations tab in your organization settings">
In order to use Slack integration in your projects, you will first have to
configure a Slack workflow integration in your organization.
![org-slack-overview](/images/platform/workflow-integrations/slack-integration/org-slack-integration-overview.png)
</Step>
<Step title="Install Slack app to workspace">
Press "Add" and select "Slack" as the platform.
![org-slack-initial-add](/images/platform/workflow-integrations/slack-integration/org-slack-integration-initial-add.png)
Give your Slack integration a descriptive alias. You will use this to select the Slack integration for your project.
![org-slack-add-form](/images/platform/workflow-integrations/slack-integration/org-slack-integration-add-form.png)
Press **Connect Slack**. This opens up the Slack app installation flow. Select the Slack workspace you want to install the custom Slack app to and press **Allow**.
![org-slack-authenticate](/images/platform/workflow-integrations/slack-integration/org-slack-integration-authenticate.png)
Your Slack bot will then be added to your selected Slack workspace. This completes the workflow integration creation flow. Your projects in the organization can now use this Slack integration to send real-time updates to your Slack workspace.
![org-slack-workspace](/images/platform/workflow-integrations/slack-integration/org-slack-integration-workspace.png)
![org-slack-created](/images/platform/workflow-integrations/slack-integration/org-slack-integration-created.png)
</Step>
</Steps>
## Configure project to use Slack workflow integration
<Steps>
<Step title="Navigate to the Workflow Integrations tab in the project settings">
![project-slack-overview](/images/platform/workflow-integrations/slack-integration/project-slack-integration-overview.png)
</Step>
<Step title="Select the Slack integration to use for the project">
Your project will send notifications to the connected Slack workspace of the
selected Slack integration when the configured events are triggered.
![project-slack-select](/images/platform/workflow-integrations/slack-integration/project-slack-integration-select.png)
</Step>
<Step title="Configure the Slack notification settings for the project and click Save.">
![project-slack-select](/images/platform/workflow-integrations/slack-integration/project-slack-integration-config.png)
<Info>
To enable notifications in private Slack channels, you need to invite your Slack bot to join those channels.
</Info>
You now have a working native integration with Slack!
</Step>
</Steps>
</Tab>
</Tabs>

Some files were not shown because too many files have changed in this diff.