Compare commits

...

197 Commits

Author SHA1 Message Date
fbfe694fc0 improvement: add overflow handling to integration filter dropdown 2024-12-05 09:13:39 -08:00
a078cb6059 improvement: add search to cloud integrations 2024-12-04 21:00:33 -08:00
132de1479d improvement: only sort by status if 1 or more integrations is failing to sync; otherwise sort by integration 2024-12-04 17:25:48 -08:00
d4a76b3621 improvement: add support for ordering by destination 2024-12-04 17:11:47 -08:00
331dcd4d79 improvement: support search by integration destination 2024-12-04 17:06:43 -08:00
025f64f068 improvement: hide secret suffix if not set 2024-12-04 17:02:01 -08:00
05d7f94518 improvement: add margin to integrations table view 2024-12-04 17:00:09 -08:00
b58e32c754 fix: actually implement env filter for integrations 2024-12-04 16:55:05 -08:00
fb6a085bf9 chore: remove comment and unused component 2024-12-03 15:01:35 -08:00
6c533f89d3 feature: high-level integrations refactor 2024-12-03 14:53:33 -08:00
1049f95952 Merge pull request #2816 from Infisical/create-secret-form-env-multi-select
Improvement: Multi-select for Environment Selection on Create Secret
2024-12-02 11:02:51 -08:00
e618d5ca5f Merge pull request #2821 from Infisical/secret-approval-filterable-selects
Improvement: Secret Approval Form Filterable Selects
2024-12-02 10:37:16 -08:00
d659250ce8 improvement: change selected project icon from eye to chevron 2024-12-02 10:20:57 -08:00
87363eabfe chore: remove comments 2024-12-02 09:46:53 -08:00
d1b9c316d8 improvement: use multi-select for environment selection on create secret 2024-12-02 09:45:43 -08:00
b9867c0d06 Merge branch 'main' into secret-approval-filterable-selects 2024-12-02 09:44:04 -08:00
afa2f383c5 improvement: address feedback 2024-12-02 09:35:03 -08:00
39f7354fec Merge pull request #2814 from Infisical/add-group-to-project-filterable-selects
Improvement: User/Group/Identity Modals Dropdown to Filterable Select Refactor + User Groups and Secret Tags Table Pagination
2024-12-02 08:20:01 -08:00
c46c0cb1e8 Merge pull request #2824 from Infisical/environment-select-refactor
Improvement: Copy Secrets Modal & Environment Selects Improvements
2024-12-02 08:05:12 -08:00
6905ffba4e improvement: handle overflow and improve ui 2024-11-29 13:43:06 -08:00
64fd423c61 improvement: update import secret env select 2024-11-29 13:34:36 -08:00
da1a7466d1 improvement: change label 2024-11-29 13:28:53 -08:00
d3f3f34129 improvement: update copy secrets from env select and secret selection 2024-11-29 13:27:24 -08:00
c8fba7ce4c improvement: align pagination left on grid view project overview 2024-11-29 11:17:54 -08:00
82c3e943eb Merge pull request #2822 from Infisical/daniel/fix-cli-tests-2
fix(cli): tests failing
2024-11-29 14:02:46 -05:00
dc3903ff15 fix(cli): disabled test 2024-11-29 23:02:18 +04:00
a9c01dcf1f Merge pull request #2810 from Infisical/project-sidebar-dropdown-filter
Improvement: Sidebar Project Selection Filter Support
2024-11-29 10:59:51 -08:00
ae51fbb8f2 chore: revert license 2024-11-29 10:53:22 -08:00
62910e93ca fix: remove labels for options(outdated) 2024-11-29 10:52:49 -08:00
586b9d9a56 fix(cli): tests failing 2024-11-29 22:48:39 +04:00
9e3c632a1f chore: revert license 2024-11-29 10:44:26 -08:00
bb094f60c1 improvement: update secret approval policy form to use filterable selects w/ UI revisions 2024-11-29 10:44:05 -08:00
6d709fba62 Merge pull request #2820 from Infisical/daniel/fix-cli-tests
fix(cli): CLI tests failing due to dynamic request ID
2024-11-29 13:43:13 -05:00
27beca7099 fix(cli): request filter bug 2024-11-29 22:38:28 +04:00
28e7e4c52d Merge pull request #2818 from Infisical/doc/k8-infisical-csi-provider
doc: added docs for infisical csi provider
2024-11-29 13:37:37 -05:00
cfc0ca1f03 fix(cli): filter out dynamically generated request ID 2024-11-29 22:31:31 +04:00
b96593d0ab fix(cli): re-enabled disabled test 2024-11-29 22:30:52 +04:00
2de5896ba4 fix(cli): update snapshots 2024-11-29 22:23:42 +04:00
3455ad3898 misc: correct faq 1 2024-11-30 02:17:11 +08:00
c7a32a3b05 misc: updated docs 2024-11-30 02:13:43 +08:00
1ebfed8c11 Merge pull request #2808 from Infisical/daniel/copy-secret-path
improvement: copy full secret path
2024-11-29 20:33:29 +04:00
a18f3c2919 progress 2024-11-29 08:19:02 -08:00
16d215b588 add todo(author) to previous existing comment 2024-11-29 08:17:34 -08:00
a852b15a1e improvement: move environment filters beneath static filters 2024-11-29 08:11:04 -08:00
cacd9041b0 Merge pull request #2790 from Infisical/daniel/paths-tip
improvement(ui): approval policy modal
2024-11-29 20:10:13 +04:00
cfeffebd46 Merge pull request #2819 from akhilmhdh/fix/cli-broken
fix: dynamic secret broken due to merge of another issue
2024-11-29 11:05:34 -05:00
1dceedcdb4 fix: dynamic secret broken due to merge of another issue 2024-11-29 21:27:11 +05:30
14f03c38c3 Merge pull request #2709 from akhilmhdh/feat/recursive-secret-test
Added testing for secret recursive operation
2024-11-29 21:09:17 +05:30
be9f096e75 Merge pull request #2817 from akhilmhdh/feat/org-permission-issue
feat: removed unused permission from org admin
2024-11-29 10:28:52 -05:00
49133a044f feat: resolved an issue without recursive matching 2024-11-29 19:59:34 +05:30
b7fe3743db feat: resolved recursive testcase change failing test 2024-11-29 19:45:10 +05:30
c5fded361c feat: added e2ee test for recursive secret operation 2024-11-29 19:45:10 +05:30
e676acbadf feat: added e2ee test for recursive secret operation 2024-11-29 19:45:10 +05:30
9b31a7bbb1 misc: added important note 2024-11-29 22:13:47 +08:00
345be85825 misc: finalized flag desc 2024-11-29 21:39:37 +08:00
f82b11851a misc: made snippet into info 2024-11-29 21:10:55 +08:00
b466b3073b misc: updated snippet to be copy+paste friendly 2024-11-29 21:09:37 +08:00
46105fc315 doc: added docs for infisical csi provider 2024-11-29 20:33:03 +08:00
3cf8fd2ff8 feat: removed unused permission from org admin 2024-11-29 15:12:20 +05:30
5277a50b3e Update NavHeader.tsx 2024-11-29 05:48:40 +04:00
dab8f0b261 improvement: secret tags table pagination 2024-11-28 14:29:41 -08:00
4293665130 improvement: user groups table pagination 2024-11-28 14:10:45 -08:00
8afa65c272 improvements: minor refactoring 2024-11-28 13:47:09 -08:00
4c739fd57f chore: revert license 2024-11-28 13:36:42 -08:00
bcc2840020 improvement: filterable role selection on create/edit group 2024-11-28 13:13:23 -08:00
8b3af92d23 improvement: edit user role filterable select 2024-11-28 12:58:03 -08:00
9ca58894f0 improvement: filter select for create identity role 2024-11-28 12:58:03 -08:00
d131314de0 improvement: filter select for invite users to org 2024-11-28 12:58:03 -08:00
9c03144f19 improvement: use filterable multi-select for add users to project role select 2024-11-28 12:58:03 -08:00
5495ffd78e improvement: update add group to project modal to use filterable selects 2024-11-28 12:58:03 -08:00
a200469c72 Merge pull request #2811 from Infisical/fix/address-custom-audience-kube-native-auth
fix: address custom audience issue
2024-11-29 03:55:50 +08:00
85c3074216 Merge pull request #2776 from Infisical/misc/unbinded-scim-from-saml
misc: unbinded scim from saml
2024-11-28 14:15:31 -05:00
cfc55ff283 Merge pull request #2804 from Infisical/users-projects-table-pagination
Improvement: Users, Groups and Projects Table Pagination
2024-11-28 09:40:19 -08:00
7179b7a540 Merge pull request #2798 from Infisical/project-overview-pagination
Improvement: Add Pagination to the Project Overview Page
2024-11-28 08:59:04 -08:00
6c4cb5e084 improvements: address feedback 2024-11-28 08:54:27 -08:00
9cfb044178 misc: removed outdated faq section 2024-11-28 20:41:30 +08:00
105eb70fd9 fix: address custom audience issue 2024-11-28 13:32:40 +08:00
18a2547b24 improvement: move user groups to own tab and add pagination/search/sort to groups tables 2024-11-27 20:35:15 -08:00
588b3c77f9 improvement: add pagination/sort to org members table 2024-11-27 19:23:54 -08:00
a04834c7c9 improvement: add pagination to project members table 2024-11-27 18:41:20 -08:00
9df9f4a5da improvement: adjust add project button margins 2024-11-27 17:54:00 -08:00
afdc704423 improvement: improve select styling 2024-11-27 17:50:42 -08:00
57261cf0c8 improvement: adjust contrast for selected project 2024-11-27 15:42:07 -08:00
06f6004993 improvement: refactor sidebar project select to support filtering with UI adjustments 2024-11-27 15:37:17 -08:00
f3bfb9cc5a Merge pull request #2802 from Infisical/audit-logs-project-select-filter
Improvement:  Filterable Project Select for Audit Logs
2024-11-27 13:10:53 -08:00
48fb77be49 improvement: typed date format 2024-11-27 12:17:30 -08:00
c3956c60e9 improvement: add pagination, sort and filtering to identity projects table with minor UI adjustments 2024-11-27 12:05:46 -08:00
f55bcb93ba improve: Folder name input validation text (#2809) 2024-11-27 19:55:46 +01:00
d3fb2a6a74 Merge pull request #2803 from Infisical/invite-users-project-multi-select-filter
Improvement: Filterable Multi-Select Project Input on Invite Users
2024-11-27 10:55:20 -08:00
6a23b74481 Merge pull request #2806 from Infisical/add-identity-to-project-modals-filter-selects
Improvement: Filter Selects on Add Identity to Project Modals
2024-11-27 10:48:10 -08:00
602cf4b3c4 improvement: disable tab select by default on filterable select 2024-11-27 08:40:00 -08:00
84ff71fef2 Merge pull request #2740 from akhilmhdh/feat/dynamic-secret-cli
Dynamic secret commands in CLI
2024-11-27 14:07:40 +05:30
add5742b8c improvement: change create to add 2024-11-26 18:48:59 -08:00
68f3964206 fix: incorrect plurals 2024-11-26 18:46:16 -08:00
90374971ae improvement: add filter select to add identity to project modals 2024-11-26 18:40:45 -08:00
3a1eadba8c Merge pull request #2805 from Infisical/vmatsiiako-leave-patch-1
Update time-off.mdx
2024-11-26 21:05:26 -05:00
5305017ce2 Update time-off.mdx 2024-11-26 18:01:55 -08:00
cf5f49d14e chore: use toggle order 2024-11-26 17:26:49 -08:00
4f4b5be8ea fix: lowercase name compare for sort 2024-11-26 17:25:13 -08:00
ecea79f040 fix: hide pagination when no search match 2024-11-26 17:20:49 -08:00
586b901318 improvement: add pagination, filtering and sort to users projects table with minor UI improvements 2024-11-26 17:17:18 -08:00
ad8d247cdc Merge pull request #2801 from Infisical/omar/eng-1952-address-key-vault-integration-failing-due-to-disabled-secret
Fix(Azure Key Vault): Ignore disabled secrets
2024-11-26 18:54:28 -05:00
3b47d7698b improvement: start align components 2024-11-26 15:51:06 -08:00
aa9a86df71 improvement: use filter multi-select for adding users to projects on invite 2024-11-26 15:47:23 -08:00
33411335ed avoid syncing disabled azure keys 2024-11-27 00:15:10 +01:00
ca55f19926 improvement: add placeholder 2024-11-26 15:10:05 -08:00
3794521c56 improvement: project select filterable on audit logs with minor UI revisions 2024-11-26 15:07:20 -08:00
728f023263 remove superfluous trycatch 2024-11-26 23:46:23 +01:00
229706f57f improve filtering 2024-11-26 23:35:32 +01:00
6cf2488326 Fix(Azure Key Vault): Ignore disabled secrets 2024-11-26 23:22:07 +01:00
2c402fbbb6 Update NavHeader.tsx 2024-11-27 01:31:11 +04:00
92ce05283b feat: Add new tag when creating secret (#2791)
* feat: Add new tag when creating secret
2024-11-26 21:10:14 +01:00
39d92ce6ff Merge pull request #2799 from Infisical/misc/finalize-env-default
misc: finalized env schema handling
2024-11-26 15:10:06 -05:00
44a026446e misc: finalized env schema handling of bool 2024-11-27 04:06:05 +08:00
bbf52c9a48 improvement: add pagination to the project overview page with minor UI adjustments 2024-11-26 11:47:59 -08:00
539e5b1907 Merge pull request #2782 from Infisical/fix-remove-payment-method
Fix: Resolve Remove Payment Method Error
2024-11-26 10:54:55 -08:00
44b02d5324 Merge pull request #2780 from Infisical/octopus-deploy-integration
Feature: Octopus Deploy Integration
2024-11-26 08:46:25 -08:00
71fb6f1d11 Merge branch 'main' into octopus-deploy-integration 2024-11-26 08:36:09 -08:00
e64100fab1 Merge pull request #2796 from akhilmhdh/feat/empty-env-stuck
Random patches
2024-11-26 10:54:16 -05:00
5bcf07b32b feat: resolved loading screen frozen on no environment and project switch causes forEach undefined error 2024-11-26 20:27:30 +05:30
3b0c48052b fix: frontend failing to give token back in cli login 2024-11-26 20:26:14 +05:30
df50e3b0f9 Merge pull request #2793 from akhilmhdh/fix/signup-allow-saml
feat: resolved saml failing when signup is disabled
2024-11-26 09:39:52 -05:00
bdf2ae40b6 Merge pull request #2751 from Infisical/daniel/sap-ase-db
feat(dynamic-secrets): SAP ASE
2024-11-26 15:13:02 +04:00
b6c05a2f25 improvements: address feedback/requests 2024-11-25 16:35:56 -08:00
960efb9cf9 docs(dynamic-secrets): SAP ASE 2024-11-26 01:54:16 +04:00
aa8d58abad feat: TDS driver docker support 2024-11-26 01:54:16 +04:00
cfb0cc4fea Update types.ts 2024-11-26 01:54:16 +04:00
7712df296c feat: SAP ASE Dynamic Secrets 2024-11-26 01:54:16 +04:00
7c38932121 fix: minor types improvement 2024-11-26 01:54:16 +04:00
69ad9845e1 improvement: added $ pattern to existing dynamic providers 2024-11-26 01:54:16 +04:00
7321c237d7 Merge pull request #2792 from Infisical/daniel/dynamic-secret-renewals
fix(dynamic-secrets): renewal 500 error
2024-11-26 01:52:28 +04:00
32430a6a16 feat: Add Project Descriptions (#2774)
* feat:  initial backend project description
2024-11-25 21:59:14 +01:00
3d6ea3251e feat: renamed dynamic_secrets to match with the command 2024-11-25 23:36:32 +05:30
f034adba76 feat: resolved saml failing when signup is disabled 2024-11-25 22:22:54 +05:30
463eb0014e fix(dynamic-secrets): renewal 500 error 2024-11-25 20:17:50 +04:00
be39e63832 feat: updated pr based on review 2024-11-25 20:28:43 +05:30
21403f6fe5 Merge pull request #2761 from Infisical/daniel/cli-login-domains-fix
fix: allow preset domains for `infisical login`
2024-11-25 16:16:08 +04:00
2f9e542b31 Merge pull request #2760 from Infisical/daniel/request-ids
feat: request ID support
2024-11-25 16:13:19 +04:00
089d6812fd Update ldap-fns.ts 2024-11-25 16:00:20 +04:00
464a3ccd53 Update AccessPolicyModal.tsx 2024-11-25 15:55:44 +04:00
71c9c0fa1e Merge pull request #2781 from Infisical/daniel/project-slug-500-error
fix: improve project DAL error handling
2024-11-24 19:43:26 -05:00
46ad1d47a9 fix: correct payment ID to remove payment method and add confirmation/notification for removal 2024-11-22 19:47:52 -08:00
2b977eeb33 fix: improve project error handling 2024-11-23 03:42:54 +04:00
a692148597 feat(integrations): Add AWS Secrets Manager IAM Role + Region (#2778) 2024-11-23 00:04:33 +01:00
b762816e66 chore: remove unused lib 2024-11-22 14:58:47 -08:00
cf275979ba feature: octopus deploy integration 2024-11-22 14:47:15 -08:00
64bfa4f334 Merge pull request #2779 from Infisical/fix-delete-project-role
Fix: Prevent Updating Identity/User Project Role to reserved "Custom" Slug
2024-11-22 16:23:22 -05:00
e3eb14bfd9 fix: add custom slug check to user 2024-11-22 13:09:47 -08:00
24b50651c9 fix: correct update role mapping for identity/user and prevent updating role slug to "custom" 2024-11-22 13:02:00 -08:00
1cd459fda7 Merge branch 'heads/main' into daniel/request-ids 2024-11-23 00:14:50 +04:00
38917327d9 feat: request lifecycle request ID 2024-11-22 23:19:07 +04:00
63fac39fff doc: added tip for SCIM 2024-11-23 03:11:06 +08:00
7c62a776fb misc: unbinded scim from saml 2024-11-23 01:59:07 +08:00
d7b494c6f8 Merge pull request #2775 from akhilmhdh/fix/patches-3
fix: db error on token auth and permission issue
2024-11-22 12:43:20 -05:00
93208afb36 fix: db error on token auth and permission issue 2024-11-22 22:41:53 +05:30
1a084d8fcf add direct link td provider 2024-11-21 21:26:46 -05:00
dd4f133c6c Merge pull request #2769 from Infisical/misc/made-identity-metadata-value-not-nullable-again
misc: made identity metadata value not nullable
2024-11-22 01:59:01 +08:00
c41d27e1ae misc: made identity metadata value not nullable 2024-11-21 21:27:56 +08:00
1866ed8d23 Merge pull request #2742 from Infisical/feat/totp-dynamic-secret
feat: TOTP dynamic secret provider
2024-11-21 12:00:12 +08:00
7b3b232dde replace loader with spinner 2024-11-20 14:09:26 -08:00
9d618b4ae9 minor text revisions/additions and add colors/icons to totp token expiry countdown 2024-11-20 14:01:40 -08:00
5330ab2171 Merge pull request #2768 from BnjmnZmmrmn/k8s_integration_docs_typo
fixing small typo in docs/integrations/platforms/kubernetes
2024-11-20 15:49:35 -05:00
662e588c22 misc: add handling for lease regen 2024-11-21 04:43:21 +08:00
90057d80ff Merge pull request #2767 from akhilmhdh/feat/permission-error
Detail error when permission validation error occurs
2024-11-21 02:00:51 +05:30
1eda7aaaac reverse license 2024-11-20 12:14:14 -08:00
00dcadbc08 misc: added timer 2024-11-21 04:09:19 +08:00
7a7289ebd0 fixing typo in docs/integrations/platforms/kubernetes 2024-11-20 11:50:13 -08:00
e5d4677fd6 improvements: minor UI/labeling adjustments, only show tags loading if can read, and remove rounded bottom on overview table 2024-11-20 11:50:10 -08:00
bce3f3d676 misc: addressed review comments 2024-11-21 02:37:56 +08:00
300372fa98 feat: resolve dependency cycle error 2024-11-20 23:59:49 +05:30
47a4f8bae9 Merge pull request #2766 from Infisical/omar/eng-1886-make-terraform-integration-secrets-marked-as-sensitive
Improvement(Terraform Cloud Integration): Synced secrets are hidden from Terraform UI
2024-11-20 13:16:45 -05:00
863719f296 feat: added action button for notification toast and one action each for forbidden error and validation error details 2024-11-20 22:55:14 +05:30
7317dc1cf5 feat: modified error handler to return possible rules for a validation failed rules 2024-11-20 22:50:21 +05:30
75df898e78 Merge pull request #2762 from Infisical/daniel/cli-installer-readme
chore(cli-installer): readme improvements
2024-11-20 20:29:23 +04:00
0de6add3f7 set all new and existing secrets to be sensitive: true 2024-11-20 17:28:09 +01:00
0c008b6393 Update README.md 2024-11-20 20:26:00 +04:00
0c3894496c feat: added support for configuring totp with secret key 2024-11-20 23:40:36 +08:00
35fbd5d49d Merge pull request #2764 from Infisical/daniel/pre-commit-cli-check
chore: check for CLI installation before pre-commit
2024-11-20 19:01:54 +04:00
d03b453e3d Merge pull request #2765 from Infisical/daniel/actor-id-mismatch
fix(audit-logs): actor / actor ID mismatch
2024-11-20 18:58:33 +04:00
96e331b678 fix(audit-logs): actor / actor ID mismatch 2024-11-20 18:50:29 +04:00
d4d468660d chore: check for CLI installation before pre-commit 2024-11-20 17:29:36 +04:00
75a4965928 requested changes 2024-11-20 16:23:59 +04:00
660c09ded4 Merge branch 'feat/totp-dynamic-secret' of https://github.com/Infisical/infisical into feat/totp-dynamic-secret 2024-11-20 18:56:56 +08:00
b5287d91c0 misc: addressed comments 2024-11-20 18:56:16 +08:00
6a17763237 docs: dynamic secret doc typos addressed 2024-11-19 19:58:01 -08:00
f2bd3daea2 Update README.md 2024-11-20 03:24:05 +04:00
7f70f96936 fix: allow preset domains for infisical login 2024-11-20 01:06:18 +04:00
73e0a54518 feat: request ID support 2024-11-20 00:01:25 +04:00
0d295a2824 fix: application crash on zod api error 2024-11-20 00:00:30 +04:00
9a62efea4f Merge pull request #2759 from Infisical/docs-update-note
update docs note
2024-11-19 23:51:44 +04:00
3be3d807d2 misc: added URL string validation 2024-11-18 19:32:57 +08:00
9f7ea3c4e5 doc: added docs for totp dynamic secret 2024-11-18 19:27:45 +08:00
e67218f170 misc: finalized option setting logic 2024-11-18 18:34:27 +08:00
269c40c67c Merge remote-tracking branch 'origin/main' into feat/totp-dynamic-secret 2024-11-18 17:31:19 +08:00
ba1fd8a3f7 feat: totp dynamic secret 2024-11-16 02:48:28 +08:00
ed7fc0e5cd docs: updated dynamic secret command cli docs 2024-11-15 20:33:06 +05:30
1ae6213387 feat: completed dynamic secret support in cli 2024-11-15 20:32:37 +05:30
266 changed files with 10615 additions and 3849 deletions

View File

@ -74,8 +74,8 @@ CAPTCHA_SECRET=
NEXT_PUBLIC_CAPTCHA_SITE_KEY=
OTEL_TELEMETRY_COLLECTION_ENABLED=
OTEL_EXPORT_TYPE=
OTEL_TELEMETRY_COLLECTION_ENABLED=false
OTEL_EXPORT_TYPE=prometheus
OTEL_EXPORT_OTLP_ENDPOINT=
OTEL_OTLP_PUSH_INTERVAL=

View File

@ -1,6 +1,12 @@
#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"
# Check if infisical is installed
if ! command -v infisical >/dev/null 2>&1; then
echo "\nError: Infisical CLI is not installed. Please install the Infisical CLI before committing.\n You can refer to the documentation at https://infisical.com/docs/cli/overview\n\n"
exit 1
fi
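# Run staged-file linters, then scan the staged changes for leaked secrets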
npx lint-staged
infisical scan git-changes --staged -v

View File

@ -69,13 +69,21 @@ RUN groupadd -r -g 1001 nodejs && useradd -r -u 1001 -g nodejs non-root-user
WORKDIR /app
# Required for pkcs11js
# Required for pkcs11js and ODBC
RUN apt-get update && apt-get install -y \
python3 \
make \
g++ \
unixodbc \
unixodbc-dev \
freetds-dev \
freetds-bin \
tdsodbc \
&& rm -rf /var/lib/apt/lists/*
# Configure ODBC
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\nSetup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so\nFileUsage = 1\n" > /etc/odbcinst.ini
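# For reference, the printf above renders /etc/odbcinst.ini as:
#   [FreeTDS]
#   Description = FreeTDS Driver
#   Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so
#   Setup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so
#   FileUsage = 1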
COPY backend/package*.json ./
RUN npm ci --only-production
@ -91,13 +99,21 @@ ENV ChrystokiConfigurationPath=/usr/safenet/lunaclient/
WORKDIR /app
# Required for pkcs11js
# Required for pkcs11js and ODBC
RUN apt-get update && apt-get install -y \
python3 \
make \
g++ \
unixodbc \
unixodbc-dev \
freetds-dev \
freetds-bin \
tdsodbc \
&& rm -rf /var/lib/apt/lists/*
# Configure ODBC
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\nSetup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so\nFileUsage = 1\n" > /etc/odbcinst.ini
COPY backend/package*.json ./
RUN npm ci --only-production
@ -108,13 +124,24 @@ RUN mkdir frontend-build
# Production stage
FROM base AS production
# Install necessary packages
# Install necessary packages including ODBC
RUN apt-get update && apt-get install -y \
ca-certificates \
curl \
git \
python3 \
make \
g++ \
unixodbc \
unixodbc-dev \
freetds-dev \
freetds-bin \
tdsodbc \
&& rm -rf /var/lib/apt/lists/*
# Configure ODBC in production
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\nSetup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so\nFileUsage = 1\n" > /etc/odbcinst.ini
# Install Infisical CLI
RUN curl -1sLf 'https://dl.cloudsmith.io/public/infisical/infisical-cli/setup.deb.sh' | bash \
&& apt-get update && apt-get install -y infisical=0.31.1 \

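The unixODBC/FreeTDS packages and the odbcinst.ini entry above are what let the backend's new odbc dependency reach SAP ASE by driver name. A minimal sketch of such a connection, assuming a hypothetical host and credentials (none of these values come from this diff):

import odbc from "odbc";

async function checkAseConnection() {
  // "FreeTDS" must match the section name registered in /etc/odbcinst.ini
  const connection = await odbc.connect(
    "Driver={FreeTDS};Server=ase.example.com;Port=5000;Database=master;UID=sa;PWD=example;TDS_Version=5.0"
  );
  const rows = await connection.query("SELECT 1 AS ok");
  await connection.close();
  return rows;
}
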
View File

@ -72,8 +72,16 @@ RUN addgroup --system --gid 1001 nodejs \
WORKDIR /app
# Required for pkcs11js
RUN apk add --no-cache python3 make g++
# Install all required dependencies for build
RUN apk --update add \
python3 \
make \
g++ \
unixodbc \
freetds \
unixodbc-dev \
libc-dev \
freetds-dev
COPY backend/package*.json ./
RUN npm ci --only-production
@ -88,8 +96,19 @@ FROM base AS backend-runner
WORKDIR /app
# Required for pkcs11js
RUN apk add --no-cache python3 make g++
# Install all required dependencies for runtime
RUN apk --update add \
python3 \
make \
g++ \
unixodbc \
freetds \
unixodbc-dev \
libc-dev \
freetds-dev
# Configure ODBC
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
COPY backend/package*.json ./
RUN npm ci --only-production
@ -100,11 +119,32 @@ RUN mkdir frontend-build
# Production stage
FROM base AS production
RUN apk add --upgrade --no-cache ca-certificates
RUN apk add --no-cache bash curl && curl -1sLf \
'https://dl.cloudsmith.io/public/infisical/infisical-cli/setup.alpine.sh' | bash \
&& apk add infisical=0.31.1 && apk add --no-cache git
WORKDIR /
# Install all required runtime dependencies
RUN apk --update add \
python3 \
make \
g++ \
unixodbc \
freetds \
unixodbc-dev \
libc-dev \
freetds-dev \
bash \
curl \
git
# Configure ODBC in production
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
# Setup user permissions
RUN addgroup --system --gid 1001 nodejs \
&& adduser --system --uid 1001 non-root-user
@ -127,7 +167,6 @@ ARG CAPTCHA_SITE_KEY
ENV NEXT_PUBLIC_CAPTCHA_SITE_KEY=$CAPTCHA_SITE_KEY \
BAKED_NEXT_PUBLIC_CAPTCHA_SITE_KEY=$CAPTCHA_SITE_KEY
WORKDIR /
COPY --from=backend-runner /app /backend
@ -149,4 +188,4 @@ EXPOSE 443
USER non-root-user
CMD ["./standalone-entrypoint.sh"]
CMD ["./standalone-entrypoint.sh"]

View File

@ -9,6 +9,15 @@ RUN apk --update add \
make \
g++
# install dependencies for TDS driver (required for SAP ASE dynamic secrets)
RUN apk add --no-cache \
unixodbc \
freetds \
unixodbc-dev \
libc-dev \
freetds-dev
COPY package*.json ./
RUN npm ci --only-production
@ -28,6 +37,17 @@ RUN apk --update add \
make \
g++
# install dependencies for TDS driver (required for SAP ASE dynamic secrets)
RUN apk add --no-cache \
unixodbc \
freetds \
unixodbc-dev \
libc-dev \
freetds-dev
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
RUN npm ci --only-production && npm cache clean --force
COPY --from=build /app .

View File

@ -7,7 +7,7 @@ ARG SOFTHSM2_VERSION=2.5.0
ENV SOFTHSM2_VERSION=${SOFTHSM2_VERSION} \
SOFTHSM2_SOURCES=/tmp/softhsm2
# install build dependencies including python3
# install build dependencies including python3 (required for pkcs11js and partially TDS driver)
RUN apk --update add \
alpine-sdk \
autoconf \
@ -19,7 +19,19 @@ RUN apk --update add \
make \
g++
# install dependencies for TDS driver (required for SAP ASE dynamic secrets)
RUN apk add --no-cache \
unixodbc \
freetds \
unixodbc-dev \
libc-dev \
freetds-dev
RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
# build and install SoftHSM2
RUN git clone https://github.com/opendnssec/SoftHSMv2.git ${SOFTHSM2_SOURCES}
WORKDIR ${SOFTHSM2_SOURCES}

View File

@ -0,0 +1,86 @@
import { createFolder, deleteFolder } from "e2e-test/testUtils/folders";
import { createSecretV2, deleteSecretV2, getSecretsV2 } from "e2e-test/testUtils/secrets";
import { seedData1 } from "@app/db/seed-data";
describe("Secret Recursive Testing", async () => {
const projectId = seedData1.projectV3.id;
const folderAndSecretNames = [
{ name: "deep1", path: "/", expectedSecretCount: 4 },
{ name: "deep21", path: "/deep1", expectedSecretCount: 2 },
{ name: "deep3", path: "/deep1/deep2", expectedSecretCount: 1 },
{ name: "deep22", path: "/deep2", expectedSecretCount: 1 }
];
beforeAll(async () => {
const rootFolderIds: string[] = [];
for (const folder of folderAndSecretNames) {
// eslint-disable-next-line no-await-in-loop
const createdFolder = await createFolder({
authToken: jwtAuthToken,
environmentSlug: "prod",
workspaceId: projectId,
secretPath: folder.path,
name: folder.name
});
if (folder.path === "/") {
rootFolderIds.push(createdFolder.id);
}
// eslint-disable-next-line no-await-in-loop
await createSecretV2({
secretPath: folder.path,
authToken: jwtAuthToken,
environmentSlug: "prod",
workspaceId: projectId,
key: folder.name,
value: folder.name
});
}
return async () => {
await Promise.all(
rootFolderIds.map((id) =>
deleteFolder({
authToken: jwtAuthToken,
secretPath: "/",
id,
workspaceId: projectId,
environmentSlug: "prod"
})
)
);
await deleteSecretV2({
authToken: jwtAuthToken,
secretPath: "/",
workspaceId: projectId,
environmentSlug: "prod",
key: folderAndSecretNames[0].name
});
};
});
test.each(folderAndSecretNames)("$path recursive secret fetching", async ({ path, expectedSecretCount }) => {
const secrets = await getSecretsV2({
authToken: jwtAuthToken,
secretPath: path,
workspaceId: projectId,
environmentSlug: "prod",
recursive: true
});
expect(secrets.secrets.length).toEqual(expectedSecretCount);
expect(secrets.secrets.sort((a, b) => a.secretKey.localeCompare(b.secretKey))).toEqual(
folderAndSecretNames
.filter((el) => el.path.startsWith(path))
.sort((a, b) => a.name.localeCompare(b.name))
.map((el) =>
expect.objectContaining({
secretKey: el.name,
secretValue: el.name
})
)
);
});
});

View File

@ -97,6 +97,7 @@ export const getSecretsV2 = async (dto: {
environmentSlug: string;
secretPath: string;
authToken: string;
recursive?: boolean;
}) => {
const getSecretsResponse = await testServer.inject({
method: "GET",
@ -109,7 +110,8 @@ export const getSecretsV2 = async (dto: {
environment: dto.environmentSlug,
secretPath: dto.secretPath,
expandSecretReferences: "true",
include_imports: "true"
include_imports: "true",
recursive: String(dto.recursive || false)
}
});
expect(getSecretsResponse.statusCode).toBe(200);

View File

@ -24,6 +24,7 @@
"@fastify/multipart": "8.3.0",
"@fastify/passport": "^2.4.0",
"@fastify/rate-limit": "^9.0.0",
"@fastify/request-context": "^5.1.0",
"@fastify/session": "^10.7.0",
"@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^2.1.0",
@ -32,6 +33,7 @@
"@octokit/plugin-retry": "^5.0.5",
"@octokit/rest": "^20.0.2",
"@octokit/webhooks-types": "^7.3.1",
"@octopusdeploy/api-client": "^3.4.1",
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/auto-instrumentations-node": "^0.53.0",
"@opentelemetry/exporter-metrics-otlp-proto": "^0.55.0",
@ -80,6 +82,7 @@
"mysql2": "^3.9.8",
"nanoid": "^3.3.4",
"nodemailer": "^6.9.9",
"odbc": "^2.4.9",
"openid-client": "^5.6.5",
"ora": "^7.0.1",
"oracledb": "^6.4.0",
@ -5528,6 +5531,15 @@
"toad-cache": "^3.3.0"
}
},
"node_modules/@fastify/request-context": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/@fastify/request-context/-/request-context-5.1.0.tgz",
"integrity": "sha512-PM7wrLJOEylVDpxabOFLaYsdAiaa0lpDUcP2HMFJ1JzgiWuC6k4r3duf6Pm9YLnzlGmT+Yp4tkQjqsu7V/pSOA==",
"license": "MIT",
"dependencies": {
"fastify-plugin": "^4.0.0"
}
},
"node_modules/@fastify/send": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/@fastify/send/-/send-2.1.0.tgz",
@ -6944,6 +6956,21 @@
"resolved": "https://registry.npmjs.org/@octokit/webhooks-types/-/webhooks-types-7.1.0.tgz",
"integrity": "sha512-y92CpG4kFFtBBjni8LHoV12IegJ+KFxLgKRengrVjKmGE5XMeCuGvlfRe75lTRrgXaG6XIWJlFpIDTlkoJsU8w=="
},
"node_modules/@octopusdeploy/api-client": {
"version": "3.4.1",
"resolved": "https://registry.npmjs.org/@octopusdeploy/api-client/-/api-client-3.4.1.tgz",
"integrity": "sha512-j6FRgDNzc6AQoT3CAguYLWxoMR4W5TKCT1BCPpqjEN9mknmdMSKfYORs3djn/Yj/BhqtITTydDpBoREbzKY5+g==",
"license": "Apache-2.0",
"dependencies": {
"adm-zip": "^0.5.9",
"axios": "^1.2.1",
"form-data": "^4.0.0",
"glob": "^8.0.3",
"lodash": "^4.17.21",
"semver": "^7.3.8",
"urijs": "^1.19.11"
}
},
"node_modules/@opentelemetry/api": {
"version": "1.9.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.9.0.tgz",
@ -17862,6 +17889,27 @@
"jsonwebtoken": "^9.0.2"
}
},
"node_modules/odbc": {
"version": "2.4.9",
"resolved": "https://registry.npmjs.org/odbc/-/odbc-2.4.9.tgz",
"integrity": "sha512-sHFWOKfyj4oFYds7YBlN+fq9ZjC2J6CsCN5CNMABpKLp+NZdb8bnanb57OaoDy1VFXEOTE91S+F900J/aIPu6w==",
"hasInstallScript": true,
"license": "MIT",
"dependencies": {
"@mapbox/node-pre-gyp": "^1.0.5",
"async": "^3.0.1",
"node-addon-api": "^3.0.2"
},
"engines": {
"node": ">=18.0.0"
}
},
"node_modules/odbc/node_modules/node-addon-api": {
"version": "3.2.1",
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-3.2.1.tgz",
"integrity": "sha512-mmcei9JghVNDYydghQmeDX8KoAm0FAiYyIcUt/N4nhyAipB17pllZQDOJD2fotxABnt4Mdz+dKTO7eftLg4d0A==",
"license": "MIT"
},
"node_modules/oidc-token-hash": {
"version": "5.0.3",
"resolved": "https://registry.npmjs.org/oidc-token-hash/-/oidc-token-hash-5.0.3.tgz",
@ -22365,6 +22413,12 @@
"punycode": "^2.1.0"
}
},
"node_modules/urijs": {
"version": "1.19.11",
"resolved": "https://registry.npmjs.org/urijs/-/urijs-1.19.11.tgz",
"integrity": "sha512-HXgFDgDommxn5/bIv0cnQZsPhHDA90NPHD6+c/v21U5+Sx5hoP8+dP9IZXBU1gIfvdRfhG8cel9QNPeionfcCQ==",
"license": "MIT"
},
"node_modules/url": {
"version": "0.10.3",
"resolved": "https://registry.npmjs.org/url/-/url-0.10.3.tgz",

View File

@ -132,6 +132,7 @@
"@fastify/multipart": "8.3.0",
"@fastify/passport": "^2.4.0",
"@fastify/rate-limit": "^9.0.0",
"@fastify/request-context": "^5.1.0",
"@fastify/session": "^10.7.0",
"@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^2.1.0",
@ -140,6 +141,7 @@
"@octokit/plugin-retry": "^5.0.5",
"@octokit/rest": "^20.0.2",
"@octokit/webhooks-types": "^7.3.1",
"@octopusdeploy/api-client": "^3.4.1",
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/auto-instrumentations-node": "^0.53.0",
"@opentelemetry/exporter-metrics-otlp-proto": "^0.55.0",
@ -188,6 +190,7 @@
"mysql2": "^3.9.8",
"nanoid": "^3.3.4",
"nodemailer": "^6.9.9",
"odbc": "^2.4.9",
"openid-client": "^5.6.5",
"ora": "^7.0.1",
"oracledb": "^6.4.0",

View File

@ -0,0 +1,7 @@
import "@fastify/request-context";
declare module "@fastify/request-context" {
interface RequestContextData {
requestId: string;
}
}
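
With this module augmentation, reads and writes through @fastify/request-context are typed. A rough sketch of the intended flow, assuming the plugin is registered on the server elsewhere (the hook and call sites here are illustrative, not from this diff):

import { randomUUID } from "node:crypto";
import { requestContext } from "@fastify/request-context";

// e.g. in an onRequest hook: seed the context once per request
requestContext.set("requestId", randomUUID());

// e.g. inside a logger or error handler: typed via the RequestContextData declaration above
const requestId = requestContext.get("requestId");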

View File

@ -1,6 +1,6 @@
import { FastifyInstance, RawReplyDefaultExpression, RawRequestDefaultExpression, RawServerDefault } from "fastify";
import { Logger } from "pino";
import { CustomLogger } from "@app/lib/logger/logger";
import { ZodTypeProvider } from "@app/server/plugins/fastify-zod";
declare global {
@ -8,7 +8,7 @@ declare global {
RawServerDefault,
RawRequestDefaultExpression<RawServerDefault>,
RawReplyDefaultExpression<RawServerDefault>,
Readonly<Logger>,
Readonly<CustomLogger>,
ZodTypeProvider
>;

View File

@ -0,0 +1,23 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
export async function up(knex: Knex): Promise<void> {
const hasProjectDescription = await knex.schema.hasColumn(TableName.Project, "description");
if (!hasProjectDescription) {
await knex.schema.alterTable(TableName.Project, (t) => {
t.string("description");
});
}
}
export async function down(knex: Knex): Promise<void> {
const hasProjectDescription = await knex.schema.hasColumn(TableName.Project, "description");
if (hasProjectDescription) {
await knex.schema.alterTable(TableName.Project, (t) => {
t.dropColumn("description");
});
}
}

View File

@ -0,0 +1,20 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
export async function up(knex: Knex): Promise<void> {
if (await knex.schema.hasColumn(TableName.IdentityMetadata, "value")) {
await knex(TableName.IdentityMetadata).whereNull("value").delete();
await knex.schema.alterTable(TableName.IdentityMetadata, (t) => {
t.string("value", 1020).notNullable().alter();
});
}
}
export async function down(knex: Knex): Promise<void> {
if (await knex.schema.hasColumn(TableName.IdentityMetadata, "value")) {
await knex.schema.alterTable(TableName.IdentityMetadata, (t) => {
t.string("value", 1020).alter();
});
}
}

View File

@ -12,7 +12,7 @@ import { TImmutableDBKeys } from "./models";
export const KmsRootConfigSchema = z.object({
id: z.string().uuid(),
encryptedRootKey: zodBuffer,
encryptionStrategy: z.string(),
encryptionStrategy: z.string().default("SOFTWARE").nullable().optional(),
createdAt: z.date(),
updatedAt: z.date()
});

View File

@ -23,7 +23,8 @@ export const ProjectsSchema = z.object({
kmsCertificateKeyId: z.string().uuid().nullable().optional(),
auditLogsRetentionDays: z.number().nullable().optional(),
kmsSecretManagerKeyId: z.string().uuid().nullable().optional(),
kmsSecretManagerEncryptedDataKey: zodBuffer.nullable().optional()
kmsSecretManagerEncryptedDataKey: zodBuffer.nullable().optional(),
description: z.string().nullable().optional()
});
export type TProjects = z.infer<typeof ProjectsSchema>;

View File

@ -112,7 +112,7 @@ export const dynamicSecretLeaseServiceFactory = ({
})
) as object;
const selectedTTL = ttl ?? dynamicSecretCfg.defaultTTL;
const selectedTTL = ttl || dynamicSecretCfg.defaultTTL;
const { maxTTL } = dynamicSecretCfg;
const expireAt = new Date(new Date().getTime() + ms(selectedTTL));
if (maxTTL) {
@ -187,7 +187,7 @@ export const dynamicSecretLeaseServiceFactory = ({
})
) as object;
const selectedTTL = ttl ?? dynamicSecretCfg.defaultTTL;
const selectedTTL = ttl || dynamicSecretCfg.defaultTTL;
const { maxTTL } = dynamicSecretCfg;
const expireAt = new Date(dynamicSecretLease.expireAt.getTime() + ms(selectedTTL));
if (maxTTL) {
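
The switch from ?? to || matters when ttl arrives as an empty string: nullish coalescing keeps "", which ms("") cannot parse into a duration, while logical OR falls through to the configured default. A two-line illustration of the difference (presumably the motivation for this change):

const defaultTTL = "1h";
console.log("" ?? defaultTTL); // ""   -- ?? only falls through on null/undefined
console.log("" || defaultTTL); // "1h" -- || falls through on any falsy value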

View File

@ -80,7 +80,7 @@ const ElastiCacheUserManager = (credentials: TBasicAWSCredentials, region: strin
}
};
const addUserToInfisicalGroup = async (userId: string) => {
const $addUserToInfisicalGroup = async (userId: string) => {
// figure out if the default user is already in the group, if it is, then we shouldn't add it again
const addUserToGroupCommand = new ModifyUserGroupCommand({
@ -96,7 +96,7 @@ const ElastiCacheUserManager = (credentials: TBasicAWSCredentials, region: strin
await ensureInfisicalGroupExists(clusterName);
await elastiCache.send(new CreateUserCommand(creationInput)); // First create the user
await addUserToInfisicalGroup(creationInput.UserId); // Then add the user to the group. We know the group is already a part of the cluster because of ensureInfisicalGroupExists()
await $addUserToInfisicalGroup(creationInput.UserId); // Then add the user to the group. We know the group is already a part of the cluster because of ensureInfisicalGroupExists()
return {
userId: creationInput.UserId,
@ -212,7 +212,7 @@ export const AwsElastiCacheDatabaseProvider = (): TDynamicProviderFns => {
};
const renew = async (inputs: unknown, entityId: string) => {
// Do nothing
// No renewal necessary
return { entityId };
};

View File

@ -33,7 +33,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretAwsIamSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretAwsIamSchema>) => {
const client = new IAMClient({
region: providerInputs.region,
credentials: {
@ -47,7 +47,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const isConnected = await client.send(new GetUserCommand({})).then(() => true);
return isConnected;
@ -55,7 +55,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = generateUsername();
const { policyArns, userGroups, policyDocument, awsPath, permissionBoundaryPolicyArn } = providerInputs;
@ -118,7 +118,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = entityId;
@ -179,9 +179,8 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
};
const renew = async (_inputs: unknown, entityId: string) => {
// do nothing
const username = entityId;
return { entityId: username };
// No renewal necessary
return { entityId };
};
return {

View File

@ -23,7 +23,7 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
return providerInputs;
};
const getToken = async (
const $getToken = async (
tenantId: string,
applicationId: string,
clientSecret: string
@ -51,18 +51,13 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const data = await getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
const data = await $getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
return data.success;
};
const renew = async (inputs: unknown, entityId: string) => {
// Do nothing
return { entityId };
};
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const data = await getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
const data = await $getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
if (!data.success) {
throw new BadRequestError({ message: "Failed to authorize to Microsoft Entra ID" });
}
@ -98,7 +93,7 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
};
const fetchAzureEntraIdUsers = async (tenantId: string, applicationId: string, clientSecret: string) => {
const data = await getToken(tenantId, applicationId, clientSecret);
const data = await $getToken(tenantId, applicationId, clientSecret);
if (!data.success) {
throw new BadRequestError({ message: "Failed to authorize to Microsoft Entra ID" });
}
@ -127,6 +122,11 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
return users;
};
const renew = async (inputs: unknown, entityId: string) => {
// No renewal necessary
return { entityId };
};
return {
validateProviderInputs,
validateConnection,

View File

@ -27,7 +27,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretCassandraSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretCassandraSchema>) => {
const sslOptions = providerInputs.ca ? { rejectUnauthorized: false, ca: providerInputs.ca } : undefined;
const client = new cassandra.Client({
sslOptions,
@ -47,7 +47,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const isConnected = await client.execute("SELECT * FROM system_schema.keyspaces").then(() => true);
await client.shutdown();
@ -56,7 +56,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -82,7 +82,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = entityId;
const { keyspace } = providerInputs;
@ -99,20 +99,24 @@ export const CassandraProvider = (): TDynamicProviderFns => {
const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
if (!providerInputs.renewStatement) return { entityId };
const client = await $getClient(providerInputs);
const username = entityId;
const expiration = new Date(expireAt).toISOString();
const { keyspace } = providerInputs;
const renewStatement = handlebars.compile(providerInputs.revocationStatement)({ username, keyspace, expiration });
const renewStatement = handlebars.compile(providerInputs.renewStatement)({
username: entityId,
keyspace,
expiration
});
const queries = renewStatement.toString().split(";").filter(Boolean);
for (const query of queries) {
// eslint-disable-next-line
for await (const query of queries) {
await client.execute(query);
}
await client.shutdown();
return { entityId: username };
return { entityId };
};
return {

View File

@ -24,7 +24,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretElasticSearchSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretElasticSearchSchema>) => {
const connection = new ElasticSearchClient({
node: {
url: new URL(`${providerInputs.host}:${providerInputs.port}`),
@ -55,7 +55,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const infoResponse = await connection
.info()
@ -67,7 +67,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -85,7 +85,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
await connection.security.deleteUser({
username: entityId
@ -96,7 +96,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
};
const renew = async (inputs: unknown, entityId: string) => {
// Do nothing
// No renewal necessary
return { entityId };
};

View File

@ -6,15 +6,17 @@ import { AzureEntraIDProvider } from "./azure-entra-id";
import { CassandraProvider } from "./cassandra";
import { ElasticSearchProvider } from "./elastic-search";
import { LdapProvider } from "./ldap";
import { DynamicSecretProviders } from "./models";
import { DynamicSecretProviders, TDynamicProviderFns } from "./models";
import { MongoAtlasProvider } from "./mongo-atlas";
import { MongoDBProvider } from "./mongo-db";
import { RabbitMqProvider } from "./rabbit-mq";
import { RedisDatabaseProvider } from "./redis";
import { SapAseProvider } from "./sap-ase";
import { SapHanaProvider } from "./sap-hana";
import { SqlDatabaseProvider } from "./sql-database";
import { TotpProvider } from "./totp";
export const buildDynamicSecretProviders = () => ({
export const buildDynamicSecretProviders = (): Record<DynamicSecretProviders, TDynamicProviderFns> => ({
[DynamicSecretProviders.SqlDatabase]: SqlDatabaseProvider(),
[DynamicSecretProviders.Cassandra]: CassandraProvider(),
[DynamicSecretProviders.AwsIam]: AwsIamProvider(),
@ -27,5 +29,7 @@ export const buildDynamicSecretProviders = () => ({
[DynamicSecretProviders.AzureEntraID]: AzureEntraIDProvider(),
[DynamicSecretProviders.Ldap]: LdapProvider(),
[DynamicSecretProviders.SapHana]: SapHanaProvider(),
[DynamicSecretProviders.Snowflake]: SnowflakeProvider()
[DynamicSecretProviders.Snowflake]: SnowflakeProvider(),
[DynamicSecretProviders.Totp]: TotpProvider(),
[DynamicSecretProviders.SapAse]: SapAseProvider()
});
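
Typing the factory's return as Record<DynamicSecretProviders, TDynamicProviderFns> makes the compiler enforce that every enum member is wired to a provider. A sketch of the failure mode this catches (illustrative, not code from this diff):

// Does not compile: the remaining DynamicSecretProviders members (Cassandra, Totp, SapAse, ...) are missing.
const incomplete: Record<DynamicSecretProviders, TDynamicProviderFns> = {
  [DynamicSecretProviders.SqlDatabase]: SqlDatabaseProvider()
};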

View File

@ -52,7 +52,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof LdapSchema>): Promise<ldapjs.Client> => {
const $getClient = async (providerInputs: z.infer<typeof LdapSchema>): Promise<ldapjs.Client> => {
return new Promise((resolve, reject) => {
const client = ldapjs.createClient({
url: providerInputs.url,
@ -83,7 +83,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
return client.connected;
};
@ -191,7 +191,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
if (providerInputs.credentialType === LdapCredentialType.Static) {
const dnMatch = providerInputs.rotationLdif.match(/^dn:\s*(.+)/m);
@ -235,7 +235,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
if (providerInputs.credentialType === LdapCredentialType.Static) {
const dnMatch = providerInputs.rotationLdif.match(/^dn:\s*(.+)/m);
@ -268,7 +268,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
};
const renew = async (inputs: unknown, entityId: string) => {
// Do nothing
// No renewal necessary
return { entityId };
};

View File

@ -4,7 +4,8 @@ export enum SqlProviders {
Postgres = "postgres",
MySQL = "mysql2",
Oracle = "oracledb",
MsSQL = "mssql"
MsSQL = "mssql",
SapAse = "sap-ase"
}
export enum ElasticSearchAuthTypes {
@ -17,6 +18,17 @@ export enum LdapCredentialType {
Static = "static"
}
export enum TotpConfigType {
URL = "url",
MANUAL = "manual"
}
export enum TotpAlgorithm {
SHA1 = "sha1",
SHA256 = "sha256",
SHA512 = "sha512"
}
export const DynamicSecretRedisDBSchema = z.object({
host: z.string().trim().toLowerCase(),
port: z.number(),
@ -107,6 +119,16 @@ export const DynamicSecretCassandraSchema = z.object({
ca: z.string().optional()
});
export const DynamicSecretSapAseSchema = z.object({
host: z.string().trim().toLowerCase(),
port: z.number(),
database: z.string().trim(),
username: z.string().trim(),
password: z.string().trim(),
creationStatement: z.string().trim(),
revocationStatement: z.string().trim()
});
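// Illustrative inputs satisfying DynamicSecretSapAseSchema; the values and the
// handlebars statement templates are assumptions for this sketch, not taken from the diff:
//   {
//     host: "ase.example.com",
//     port: 5000,
//     database: "master",
//     username: "sa",
//     password: "<password>",
//     creationStatement: "sp_addlogin '{{username}}', '{{password}}'",
//     revocationStatement: "sp_droplogin '{{username}}'"
//   }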
export const DynamicSecretAwsIamSchema = z.object({
accessKey: z.string().trim().min(1),
secretAccessKey: z.string().trim().min(1),
@ -221,6 +243,34 @@ export const LdapSchema = z.union([
})
]);
export const DynamicSecretTotpSchema = z.discriminatedUnion("configType", [
z.object({
configType: z.literal(TotpConfigType.URL),
url: z
.string()
.url()
.trim()
.min(1)
.refine((val) => {
const urlObj = new URL(val);
const secret = urlObj.searchParams.get("secret");
return Boolean(secret);
}, "OTP URL must contain secret field")
}),
z.object({
configType: z.literal(TotpConfigType.MANUAL),
secret: z
.string()
.trim()
.min(1)
.transform((val) => val.replace(/\s+/g, "")),
period: z.number().optional(),
algorithm: z.nativeEnum(TotpAlgorithm).optional(),
digits: z.number().optional()
})
]);
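// Illustration of a URL-type config accepted by the refine above (the secret value is a placeholder):
//   {
//     configType: TotpConfigType.URL,
//     url: "otpauth://totp/Infisical:user@example.com?secret=JBSWY3DPEHPK3PXP&issuer=Infisical"
//   }
// The refine only asserts that the otpauth URL carries a non-empty ?secret= query parameter:
//   new URL(url).searchParams.get("secret") // "JBSWY3DPEHPK3PXP"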
export enum DynamicSecretProviders {
SqlDatabase = "sql-database",
Cassandra = "cassandra",
@ -234,12 +284,15 @@ export enum DynamicSecretProviders {
AzureEntraID = "azure-entra-id",
Ldap = "ldap",
SapHana = "sap-hana",
Snowflake = "snowflake"
Snowflake = "snowflake",
Totp = "totp",
SapAse = "sap-ase"
}
export const DynamicSecretProviderSchema = z.discriminatedUnion("type", [
z.object({ type: z.literal(DynamicSecretProviders.SqlDatabase), inputs: DynamicSecretSqlDBSchema }),
z.object({ type: z.literal(DynamicSecretProviders.Cassandra), inputs: DynamicSecretCassandraSchema }),
z.object({ type: z.literal(DynamicSecretProviders.SapAse), inputs: DynamicSecretSapAseSchema }),
z.object({ type: z.literal(DynamicSecretProviders.AwsIam), inputs: DynamicSecretAwsIamSchema }),
z.object({ type: z.literal(DynamicSecretProviders.Redis), inputs: DynamicSecretRedisDBSchema }),
z.object({ type: z.literal(DynamicSecretProviders.SapHana), inputs: DynamicSecretSapHanaSchema }),
@ -250,7 +303,8 @@ export const DynamicSecretProviderSchema = z.discriminatedUnion("type", [
z.object({ type: z.literal(DynamicSecretProviders.RabbitMq), inputs: DynamicSecretRabbitMqSchema }),
z.object({ type: z.literal(DynamicSecretProviders.AzureEntraID), inputs: AzureEntraIDSchema }),
z.object({ type: z.literal(DynamicSecretProviders.Ldap), inputs: LdapSchema }),
z.object({ type: z.literal(DynamicSecretProviders.Snowflake), inputs: DynamicSecretSnowflakeSchema })
z.object({ type: z.literal(DynamicSecretProviders.Snowflake), inputs: DynamicSecretSnowflakeSchema }),
z.object({ type: z.literal(DynamicSecretProviders.Totp), inputs: DynamicSecretTotpSchema })
]);
export type TDynamicProviderFns = {

View File

@ -22,7 +22,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoAtlasSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoAtlasSchema>) => {
const client = axios.create({
baseURL: "https://cloud.mongodb.com/api/atlas",
headers: {
@ -40,7 +40,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const isConnected = await client({
method: "GET",
@ -59,7 +59,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -87,7 +87,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = entityId;
const isExisting = await client({
@ -114,7 +114,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = entityId;
const expiration = new Date(expireAt).toISOString();

View File

@ -23,7 +23,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoDBSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoDBSchema>) => {
const isSrv = !providerInputs.port;
const uri = isSrv
? `mongodb+srv://${providerInputs.host}`
@ -42,7 +42,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const isConnected = await client
.db(providerInputs.database)
@ -55,7 +55,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -74,7 +74,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = entityId;
@ -88,6 +88,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
};
const renew = async (_inputs: unknown, entityId: string) => {
// No renewal necessary
return { entityId };
};

View File

@ -84,7 +84,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretRabbitMqSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretRabbitMqSchema>) => {
const axiosInstance = axios.create({
baseURL: `${removeTrailingSlash(providerInputs.host)}:${providerInputs.port}/api`,
auth: {
@ -105,7 +105,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const infoResponse = await connection.get("/whoami").then(() => true);
@ -114,7 +114,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -134,7 +134,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
await deleteRabbitMqUser({ axiosInstance: connection, usernameToDelete: entityId });
@ -142,7 +142,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
};
const renew = async (inputs: unknown, entityId: string) => {
// Do nothing
// No renewal necessary
return { entityId };
};

View File

@ -55,7 +55,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretRedisDBSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretRedisDBSchema>) => {
let connection: Redis | null = null;
try {
connection = new Redis({
@ -92,7 +92,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const pingResponse = await connection
.ping()
@ -104,7 +104,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -126,7 +126,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
const connection = await $getClient(providerInputs);
const username = entityId;
@ -141,7 +141,9 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const connection = await getClient(providerInputs);
if (!providerInputs.renewStatement) return { entityId };
const connection = await $getClient(providerInputs);
const username = entityId;
const expiration = new Date(expireAt).toISOString();

View File

@ -0,0 +1,145 @@
import handlebars from "handlebars";
import { customAlphabet } from "nanoid";
import odbc from "odbc";
import { z } from "zod";
import { BadRequestError } from "@app/lib/errors";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { verifyHostInputValidity } from "../dynamic-secret-fns";
import { DynamicSecretSapAseSchema, TDynamicProviderFns } from "./models";
const generatePassword = (size = 48) => {
const charset = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
return customAlphabet(charset, 48)(size);
};
const generateUsername = () => {
return alphaNumericNanoId(25);
};
enum SapCommands {
CreateLogin = "sp_addlogin",
DropLogin = "sp_droplogin"
}
export const SapAseProvider = (): TDynamicProviderFns => {
const validateProviderInputs = async (inputs: unknown) => {
const providerInputs = await DynamicSecretSapAseSchema.parseAsync(inputs);
verifyHostInputValidity(providerInputs.host);
return providerInputs;
};
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSapAseSchema>, useMaster?: boolean) => {
const connectionString =
`DRIVER={FreeTDS};` +
`SERVER=${providerInputs.host};` +
`PORT=${providerInputs.port};` +
`DATABASE=${useMaster ? "master" : providerInputs.database};` +
`UID=${providerInputs.username};` +
`PWD=${providerInputs.password};` +
`TDS_VERSION=5.0`;
const client = await odbc.connect(connectionString);
return client;
};
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const masterClient = await $getClient(providerInputs, true);
const client = await $getClient(providerInputs);
const [resultFromMasterDatabase] = await masterClient.query<{ version: string }>("SELECT @@VERSION AS version");
const [resultFromSelectedDatabase] = await client.query<{ version: string }>("SELECT @@VERSION AS version");
if (!resultFromSelectedDatabase.version) {
throw new BadRequestError({
message: "Failed to validate SAP ASE connection, version query failed"
});
}
if (resultFromMasterDatabase.version !== resultFromSelectedDatabase.version) {
throw new BadRequestError({
message: "Failed to validate SAP ASE connection (master), version mismatch"
});
}
return true;
};
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const username = `inf_${generateUsername()}`;
const password = `${generatePassword()}`;
const client = await $getClient(providerInputs);
const masterClient = await $getClient(providerInputs, true);
const creationStatement = handlebars.compile(providerInputs.creationStatement, { noEscape: true })({
username,
password
});
const queries = creationStatement.trim().replace(/\n/g, "").split(";").filter(Boolean);
for await (const query of queries) {
// If it's a login-creation (sp_addlogin) query, it must be run against the MASTER database first.
// Otherwise the newly created user won't be able to authenticate.
await (query.startsWith(SapCommands.CreateLogin) ? masterClient : client).query(query);
}
await masterClient.close();
await client.close();
return { entityId: username, data: { DB_USERNAME: username, DB_PASSWORD: password } };
};
const revoke = async (inputs: unknown, username: string) => {
const providerInputs = await validateProviderInputs(inputs);
const revokeStatement = handlebars.compile(providerInputs.revocationStatement, { noEscape: true })({
username
});
const queries = revokeStatement.trim().replace(/\n/g, "").split(";").filter(Boolean);
const client = await $getClient(providerInputs);
const masterClient = await $getClient(providerInputs, true);
// Get all processes for this login and kill them; if connections are still active when sp_droplogin runs, it throws an error.
const result = await masterClient.query<{ spid?: string }>(`sp_who '${username}'`);
if (result && result.length > 0) {
for await (const row of result) {
if (row.spid) {
await masterClient.query(`KILL ${row.spid.trim()}`);
}
}
}
for await (const query of queries) {
await (query.startsWith(SapCommands.DropLogin) ? masterClient : client).query(query);
}
await masterClient.close();
await client.close();
return { entityId: username };
};
const renew = async (_: unknown, username: string) => {
// No need for renewal
return { entityId: username };
};
return {
validateProviderInputs,
validateConnection,
create,
revoke,
renew
};
};
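
A hedged usage sketch of the new SAP ASE provider's lifecycle; the connection values are placeholders and the statements are illustrative Handlebars templates, not shipped defaults:

const provider = SapAseProvider();

// Hypothetical inputs; DynamicSecretSapAseSchema defines the real shape.
const inputs = {
  host: "sap-ase.internal",
  port: 5000,
  database: "app_db",
  username: "sa",
  password: "<admin-password>",
  creationStatement: "sp_addlogin '{{username}}', '{{password}}';sp_adduser '{{username}}';",
  revocationStatement: "sp_dropuser '{{username}}';sp_droplogin '{{username}}';"
};

await provider.validateConnection(inputs); // compares @@VERSION on master vs. the selected DB
const lease = await provider.create(inputs); // sp_addlogin runs on master, sp_adduser on the DB
console.log(lease.data); // { DB_USERNAME: "inf_...", DB_PASSWORD: "..." }
await provider.revoke(inputs, lease.entityId); // kills active spids, then drops user + login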

View File

@ -32,7 +32,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretSapHanaSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSapHanaSchema>) => {
const client = hdb.createClient({
host: providerInputs.host,
port: providerInputs.port,
@ -64,9 +64,9 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const testResult: boolean = await new Promise((resolve, reject) => {
const testResult = await new Promise<boolean>((resolve, reject) => {
client.exec("SELECT 1 FROM DUMMY;", (err: any) => {
if (err) {
reject();
@ -86,7 +86,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
const password = generatePassword();
const expiration = new Date(expireAt).toISOString();
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const creationStatement = handlebars.compile(providerInputs.creationStatement, { noEscape: true })({
username,
password,
@ -114,7 +114,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, username: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const revokeStatement = handlebars.compile(providerInputs.revocationStatement)({ username });
const queries = revokeStatement.toString().split(";").filter(Boolean);
for await (const query of queries) {
@ -135,13 +135,15 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
return { entityId: username };
};
const renew = async (inputs: unknown, username: string, expireAt: number) => {
const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
if (!providerInputs.renewStatement) return { entityId };
const client = await $getClient(providerInputs);
try {
const expiration = new Date(expireAt).toISOString();
const renewStatement = handlebars.compile(providerInputs.renewStatement)({ username, expiration });
const renewStatement = handlebars.compile(providerInputs.renewStatement)({ username: entityId, expiration });
const queries = renewStatement.toString().split(";").filter(Boolean);
for await (const query of queries) {
await new Promise((resolve, reject) => {
@ -161,7 +163,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
client.disconnect();
}
return { entityId: username };
return { entityId };
};
return {

View File

@ -34,7 +34,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretSnowflakeSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSnowflakeSchema>) => {
const client = snowflake.createConnection({
account: `${providerInputs.orgId}-${providerInputs.accountId}`,
username: providerInputs.username,
@ -49,7 +49,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
let isValidConnection: boolean;
@ -72,7 +72,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
const username = generateUsername();
const password = generatePassword();
@ -107,7 +107,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, username: string) => {
const providerInputs = await validateProviderInputs(inputs);
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
try {
const revokeStatement = handlebars.compile(providerInputs.revocationStatement)({ username });
@ -131,17 +131,16 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
return { entityId: username };
};
const renew = async (inputs: unknown, username: string, expireAt: number) => {
const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
if (!providerInputs.renewStatement) return { entityId };
if (!providerInputs.renewStatement) return { entityId: username };
const client = await getClient(providerInputs);
const client = await $getClient(providerInputs);
try {
const expiration = getDaysToExpiry(new Date(expireAt));
const renewStatement = handlebars.compile(providerInputs.renewStatement)({
username,
username: entityId,
expiration
});
@ -161,7 +160,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
client.destroy(noop);
}
return { entityId: username };
return { entityId };
};
return {

View File

@ -32,7 +32,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
return providerInputs;
};
const getClient = async (providerInputs: z.infer<typeof DynamicSecretSqlDBSchema>) => {
const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSqlDBSchema>) => {
const ssl = providerInputs.ca ? { rejectUnauthorized: false, ca: providerInputs.ca } : undefined;
const db = knex({
client: providerInputs.client,
@ -52,7 +52,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
const validateConnection = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const db = await getClient(providerInputs);
const db = await $getClient(providerInputs);
// oracle needs from keyword
const testStatement = providerInputs.client === SqlProviders.Oracle ? "SELECT 1 FROM DUAL" : "SELECT 1";
@ -63,7 +63,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
const create = async (inputs: unknown, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const db = await getClient(providerInputs);
const db = await $getClient(providerInputs);
const username = generateUsername(providerInputs.client);
const password = generatePassword(providerInputs.client);
@ -90,7 +90,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
const revoke = async (inputs: unknown, entityId: string) => {
const providerInputs = await validateProviderInputs(inputs);
const db = await getClient(providerInputs);
const db = await $getClient(providerInputs);
const username = entityId;
const { database } = providerInputs;
@ -110,13 +110,19 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
const providerInputs = await validateProviderInputs(inputs);
const db = await getClient(providerInputs);
if (!providerInputs.renewStatement) return { entityId };
const db = await $getClient(providerInputs);
const username = entityId;
const expiration = new Date(expireAt).toISOString();
const { database } = providerInputs;
const renewStatement = handlebars.compile(providerInputs.renewStatement)({ username, expiration, database });
const renewStatement = handlebars.compile(providerInputs.renewStatement)({
username: entityId,
expiration,
database
});
if (renewStatement) {
const queries = renewStatement.toString().split(";").filter(Boolean);
await db.transaction(async (tx) => {
@ -128,7 +134,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
}
await db.destroy();
return { entityId: username };
return { entityId };
};
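// The early return added above mirrors the guard introduced in the Redis, SAP HANA,
// and Snowflake providers in this change set; distilled:
//   const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
//     const providerInputs = await validateProviderInputs(inputs);
//     if (!providerInputs.renewStatement) return { entityId }; // no-op when renewal is unconfigured
//     /* open client, compile + run the renew statement, clean up */
//     return { entityId };
//   };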
return {

View File

@ -0,0 +1,90 @@
import { authenticator } from "otplib";
import { HashAlgorithms } from "otplib/core";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { DynamicSecretTotpSchema, TDynamicProviderFns, TotpConfigType } from "./models";
export const TotpProvider = (): TDynamicProviderFns => {
const validateProviderInputs = async (inputs: unknown) => {
const providerInputs = await DynamicSecretTotpSchema.parseAsync(inputs);
return providerInputs;
};
const validateConnection = async () => {
return true;
};
const create = async (inputs: unknown) => {
const providerInputs = await validateProviderInputs(inputs);
const entityId = alphaNumericNanoId(32);
const authenticatorInstance = authenticator.clone();
let secret: string;
let period: number | null | undefined;
let digits: number | null | undefined;
let algorithm: HashAlgorithms | null | undefined;
if (providerInputs.configType === TotpConfigType.URL) {
const urlObj = new URL(providerInputs.url);
secret = urlObj.searchParams.get("secret") as string;
const periodFromUrl = urlObj.searchParams.get("period");
const digitsFromUrl = urlObj.searchParams.get("digits");
const algorithmFromUrl = urlObj.searchParams.get("algorithm");
if (periodFromUrl) {
period = +periodFromUrl;
}
if (digitsFromUrl) {
digits = +digitsFromUrl;
}
if (algorithmFromUrl) {
algorithm = algorithmFromUrl.toLowerCase() as HashAlgorithms;
}
} else {
secret = providerInputs.secret;
period = providerInputs.period;
digits = providerInputs.digits;
algorithm = providerInputs.algorithm as unknown as HashAlgorithms;
}
if (digits) {
authenticatorInstance.options = { digits };
}
if (algorithm) {
authenticatorInstance.options = { algorithm };
}
if (period) {
authenticatorInstance.options = { step: period };
}
return {
entityId,
data: { TOTP: authenticatorInstance.generate(secret), TIME_REMAINING: authenticatorInstance.timeRemaining() }
};
};
const revoke = async (_inputs: unknown, entityId: string) => {
return { entityId };
};
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const renew = async (_inputs: unknown, entityId: string) => {
// No renewal necessary
return { entityId };
};
return {
validateProviderInputs,
validateConnection,
create,
revoke,
renew
};
};
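
For context on the URL branch above, a small standalone sketch of the same otplib calls; the otpauth URI is a made-up example and the secret is a dummy base32 value:

import { authenticator } from "otplib";

const url = new URL("otpauth://totp/Example:alice?secret=JBSWY3DPEHPK3PXP&period=30&digits=6");
const instance = authenticator.clone();
instance.options = {
  digits: Number(url.searchParams.get("digits") ?? 6),
  step: Number(url.searchParams.get("period") ?? 30)
};
const secret = url.searchParams.get("secret") as string;
// One-time code plus the seconds left in the current step, mirroring the
// { TOTP, TIME_REMAINING } payload returned by create() above.
console.log(instance.generate(secret), instance.timeRemaining());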

View File

@ -27,7 +27,7 @@ export const initializeHsmModule = () => {
logger.info("PKCS#11 module initialized");
} catch (err) {
logger.error("Failed to initialize PKCS#11 module:", err);
logger.error(err, "Failed to initialize PKCS#11 module");
throw err;
}
};
@ -39,7 +39,7 @@ export const initializeHsmModule = () => {
isInitialized = false;
logger.info("PKCS#11 module finalized");
} catch (err) {
logger.error("Failed to finalize PKCS#11 module:", err);
logger.error(err, "Failed to finalize PKCS#11 module");
throw err;
}
}

View File

@ -36,8 +36,7 @@ export const testLDAPConfig = async (ldapConfig: TLDAPConfig): Promise<boolean>
});
ldapClient.on("error", (err) => {
logger.error("LDAP client error:", err);
logger.error(err);
logger.error(err, "LDAP client error");
resolve(false);
});

View File

@ -161,8 +161,8 @@ export const licenseServiceFactory = ({
}
} catch (error) {
logger.error(
`getPlan: encountered an error when fetching plan [orgId=${orgId}] [projectId=${projectId}] [error]`,
error
error,
`getPlan: encountered an error when fetching plan [orgId=${orgId}] [projectId=${projectId}] [error]`
);
await keyStore.setItemWithExpiry(
FEATURE_CACHE_KEY(orgId),

View File

@ -31,7 +31,6 @@ export enum OrgPermissionSubjects {
}
export type OrgPermissionSet =
| [OrgPermissionActions.Read, OrgPermissionSubjects.Workspace]
| [OrgPermissionActions.Create, OrgPermissionSubjects.Workspace]
| [OrgPermissionActions, OrgPermissionSubjects.Role]
| [OrgPermissionActions, OrgPermissionSubjects.Member]
@ -52,7 +51,6 @@ export type OrgPermissionSet =
const buildAdminPermission = () => {
const { can, rules } = new AbilityBuilder<MongoAbility<OrgPermissionSet>>(createMongoAbility);
// ws permissions
can(OrgPermissionActions.Read, OrgPermissionSubjects.Workspace);
can(OrgPermissionActions.Create, OrgPermissionSubjects.Workspace);
// role permission
can(OrgPermissionActions.Read, OrgPermissionSubjects.Role);
@ -135,7 +133,6 @@ export const orgAdminPermissions = buildAdminPermission();
const buildMemberPermission = () => {
const { can, rules } = new AbilityBuilder<MongoAbility<OrgPermissionSet>>(createMongoAbility);
can(OrgPermissionActions.Read, OrgPermissionSubjects.Workspace);
can(OrgPermissionActions.Create, OrgPermissionSubjects.Workspace);
can(OrgPermissionActions.Read, OrgPermissionSubjects.Member);
can(OrgPermissionActions.Read, OrgPermissionSubjects.Groups);

View File

@ -127,14 +127,15 @@ export const permissionDALFactory = (db: TDbClient) => {
const getProjectPermission = async (userId: string, projectId: string) => {
try {
const subQueryUserGroups = db(TableName.UserGroupMembership).where("userId", userId).select("groupId");
const docs = await db
.replicaNode()(TableName.Users)
.where(`${TableName.Users}.id`, userId)
.leftJoin(TableName.UserGroupMembership, `${TableName.UserGroupMembership}.userId`, `${TableName.Users}.id`)
.leftJoin(TableName.GroupProjectMembership, (queryBuilder) => {
void queryBuilder
.on(`${TableName.GroupProjectMembership}.projectId`, db.raw("?", [projectId]))
.andOn(`${TableName.GroupProjectMembership}.groupId`, `${TableName.UserGroupMembership}.groupId`);
// @ts-expect-error akhilmhdh: this is valid knexjs query. Its just ts type argument is missing it
.andOnIn(`${TableName.GroupProjectMembership}.groupId`, subQueryUserGroups);
})
.leftJoin(
TableName.GroupProjectMembershipRole,

View File

@ -1,14 +1,7 @@
import picomatch from "picomatch";
import { z } from "zod";
export enum PermissionConditionOperators {
$IN = "$in",
$ALL = "$all",
$REGEX = "$regex",
$EQ = "$eq",
$NEQ = "$ne",
$GLOB = "$glob"
}
import { PermissionConditionOperators } from "@app/lib/casl";
export const PermissionConditionSchema = {
[PermissionConditionOperators.$IN]: z.string().trim().min(1).array(),

View File

@ -1,10 +1,10 @@
import { AbilityBuilder, createMongoAbility, ForcedSubject, MongoAbility } from "@casl/ability";
import { z } from "zod";
import { conditionsMatcher } from "@app/lib/casl";
import { conditionsMatcher, PermissionConditionOperators } from "@app/lib/casl";
import { UnpackedPermissionSchema } from "@app/server/routes/santizedSchemas/permission";
import { PermissionConditionOperators, PermissionConditionSchema } from "./permission-types";
import { PermissionConditionSchema } from "./permission-types";
export enum ProjectPermissionActions {
Read = "read",

View File

@ -46,7 +46,7 @@ export const rateLimitServiceFactory = ({ rateLimitDAL, licenseService }: TRateL
}
return rateLimit;
} catch (err) {
logger.error("Error fetching rate limits %o", err);
logger.error(err, "Error fetching rate limits");
return undefined;
}
};
@ -69,12 +69,12 @@ export const rateLimitServiceFactory = ({ rateLimitDAL, licenseService }: TRateL
mfaRateLimit: rateLimit.mfaRateLimit
};
logger.info(`syncRateLimitConfiguration: rate limit configuration: %o`, newRateLimitMaxConfiguration);
logger.info(newRateLimitMaxConfiguration, "syncRateLimitConfiguration: rate limit configuration");
Object.freeze(newRateLimitMaxConfiguration);
rateLimitMaxConfiguration = newRateLimitMaxConfiguration;
}
} catch (error) {
logger.error(`Error syncing rate limit configurations: %o`, error);
logger.error(error, "Error syncing rate limit configurations");
}
};

View File

@ -18,6 +18,7 @@ import { TGroupProjectDALFactory } from "@app/services/group-project/group-proje
import { TOrgDALFactory } from "@app/services/org/org-dal";
import { deleteOrgMembershipFn } from "@app/services/org/org-fns";
import { getDefaultOrgMembershipRole } from "@app/services/org/org-role-fns";
import { OrgAuthMethod } from "@app/services/org/org-types";
import { TOrgMembershipDALFactory } from "@app/services/org-membership/org-membership-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
@ -71,6 +72,7 @@ type TScimServiceFactoryDep = {
| "deleteMembershipById"
| "transaction"
| "updateMembershipById"
| "findOrgById"
>;
orgMembershipDAL: Pick<
TOrgMembershipDALFactory,
@ -288,8 +290,7 @@ export const scimServiceFactory = ({
const createScimUser = async ({ externalId, email, firstName, lastName, orgId }: TCreateScimUserDTO) => {
if (!email) throw new ScimRequestError({ detail: "Invalid request. Missing email.", status: 400 });
const org = await orgDAL.findById(orgId);
const org = await orgDAL.findOrgById(orgId);
if (!org)
throw new ScimRequestError({
detail: "Organization not found",
@ -302,13 +303,24 @@ export const scimServiceFactory = ({
status: 403
});
if (!org.orgAuthMethod) {
throw new ScimRequestError({
detail: "Neither SAML or OIDC SSO is configured",
status: 400
});
}
const appCfg = getConfig();
const serverCfg = await getServerCfg();
const aliasType = org.orgAuthMethod === OrgAuthMethod.OIDC ? UserAliasType.OIDC : UserAliasType.SAML;
const trustScimEmails =
org.orgAuthMethod === OrgAuthMethod.OIDC ? serverCfg.trustOidcEmails : serverCfg.trustSamlEmails;
const userAlias = await userAliasDAL.findOne({
externalId,
orgId,
aliasType: UserAliasType.SAML
aliasType
});
const { user: createdUser, orgMembership: createdOrgMembership } = await userDAL.transaction(async (tx) => {
@ -349,7 +361,7 @@ export const scimServiceFactory = ({
);
}
} else {
if (serverCfg.trustSamlEmails) {
if (trustScimEmails) {
user = await userDAL.findOne(
{
email,
@ -367,9 +379,9 @@ export const scimServiceFactory = ({
);
user = await userDAL.create(
{
username: serverCfg.trustSamlEmails ? email : uniqueUsername,
username: trustScimEmails ? email : uniqueUsername,
email,
isEmailVerified: serverCfg.trustSamlEmails,
isEmailVerified: trustScimEmails,
firstName,
lastName,
authMethods: [],
@ -382,7 +394,7 @@ export const scimServiceFactory = ({
await userAliasDAL.create(
{
userId: user.id,
aliasType: UserAliasType.SAML,
aliasType,
externalId,
emails: email ? [email] : [],
orgId
@ -437,7 +449,7 @@ export const scimServiceFactory = ({
recipients: [email],
substitutions: {
organizationName: org.name,
callback_url: `${appCfg.SITE_URL}/api/v1/sso/redirect/saml2/organizations/${org.slug}`
callback_url: `${appCfg.SITE_URL}/api/v1/sso/redirect/organizations/${org.slug}`
}
});
}
@ -456,6 +468,14 @@ export const scimServiceFactory = ({
// partial
const updateScimUser = async ({ orgMembershipId, orgId, operations }: TUpdateScimUserDTO) => {
const org = await orgDAL.findOrgById(orgId);
if (!org.orgAuthMethod) {
throw new ScimRequestError({
detail: "Neither SAML or OIDC SSO is configured",
status: 400
});
}
const [membership] = await orgDAL
.findMembership({
[`${TableName.OrgMembership}.id` as "id"]: orgMembershipId,
@ -493,6 +513,9 @@ export const scimServiceFactory = ({
scimPatch(scimUser, operations);
const serverCfg = await getServerCfg();
const trustScimEmails =
org.orgAuthMethod === OrgAuthMethod.OIDC ? serverCfg.trustOidcEmails : serverCfg.trustSamlEmails;
await userDAL.transaction(async (tx) => {
await orgMembershipDAL.updateById(
membership.id,
@ -508,7 +531,7 @@ export const scimServiceFactory = ({
firstName: scimUser.name.givenName,
email: scimUser.emails[0].value,
lastName: scimUser.name.familyName,
isEmailVerified: hasEmailChanged ? serverCfg.trustSamlEmails : true
isEmailVerified: hasEmailChanged ? trustScimEmails : true
},
tx
);
@ -526,6 +549,14 @@ export const scimServiceFactory = ({
email,
externalId
}: TReplaceScimUserDTO) => {
const org = await orgDAL.findOrgById(orgId);
if (!org.orgAuthMethod) {
throw new ScimRequestError({
detail: "Neither SAML or OIDC SSO is configured",
status: 400
});
}
const [membership] = await orgDAL
.findMembership({
[`${TableName.OrgMembership}.id` as "id"]: orgMembershipId,
@ -555,7 +586,7 @@ export const scimServiceFactory = ({
await userAliasDAL.update(
{
orgId,
aliasType: UserAliasType.SAML,
aliasType: org.orgAuthMethod === OrgAuthMethod.OIDC ? UserAliasType.OIDC : UserAliasType.SAML,
userId: membership.userId
},
{
@ -576,7 +607,8 @@ export const scimServiceFactory = ({
firstName,
email,
lastName,
isEmailVerified: serverCfg.trustSamlEmails
isEmailVerified:
org.orgAuthMethod === OrgAuthMethod.OIDC ? serverCfg.trustOidcEmails : serverCfg.trustSamlEmails
},
tx
);
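
The recurring pattern across createScimUser, updateScimUser, and replaceScimUser reduces to two derived values keyed off the organization's SSO method, guarded by the new 400 when neither is configured; condensed:

if (!org.orgAuthMethod) {
  throw new ScimRequestError({ detail: "Neither SAML nor OIDC SSO is configured", status: 400 });
}
const aliasType = org.orgAuthMethod === OrgAuthMethod.OIDC ? UserAliasType.OIDC : UserAliasType.SAML;
const trustScimEmails =
  org.orgAuthMethod === OrgAuthMethod.OIDC ? serverCfg.trustOidcEmails : serverCfg.trustSamlEmails;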

View File

@ -238,11 +238,11 @@ export const secretScanningQueueFactory = ({
});
queueService.listen(QueueName.SecretPushEventScan, "failed", (job, err) => {
logger.error("Failed to secret scan on push", job?.data, err);
logger.error(err, "Failed to secret scan on push", job?.data);
});
queueService.listen(QueueName.SecretFullRepoScan, "failed", (job, err) => {
logger.error("Failed to do full repo secret scan", job?.data, err);
logger.error(err, "Failed to do full repo secret scan", job?.data);
});
return { startFullRepoScan, startPushEventScan };

View File

@ -391,6 +391,7 @@ export const PROJECTS = {
CREATE: {
organizationSlug: "The slug of the organization to create the project in.",
projectName: "The name of the project to create.",
projectDescription: "An optional description label for the project.",
slug: "An optional slug for the project.",
template: "The name of the project template, if specified, to apply to this project."
},
@ -403,6 +404,7 @@ export const PROJECTS = {
UPDATE: {
workspaceId: "The ID of the project to update.",
name: "The new name of the project.",
projectDescription: "An optional description label for the project.",
autoCapitalization: "Disable or enable auto-capitalization for the project."
},
GET_KEY: {
@ -1080,7 +1082,8 @@ export const INTEGRATION = {
shouldDisableDelete: "The flag to disable deletion of secrets in AWS Parameter Store.",
shouldMaskSecrets: "Specifies if the secrets synced from Infisical to Gitlab should be marked as 'Masked'.",
shouldProtectSecrets: "Specifies if the secrets synced from Infisical to Gitlab should be marked as 'Protected'.",
shouldEnableDelete: "The flag to enable deletion of secrets."
shouldEnableDelete: "The flag to enable deletion of secrets.",
octopusDeployScopeValues: "Specifies the scope values to set on synced secrets to Octopus Deploy."
}
},
UPDATE: {

View File

@ -54,3 +54,12 @@ export const isAtLeastAsPrivileged = (permissions1: MongoAbility, permissions2:
return set1.size >= set2.size;
};
export enum PermissionConditionOperators {
$IN = "$in",
$ALL = "$all",
$REGEX = "$regex",
$EQ = "$eq",
$NEQ = "$ne",
$GLOB = "$glob"
}
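
With the enum now exported from @app/lib/casl, both the permission-types schema and the project permission module import it from here rather than from each other; an illustrative consumer:

import { z } from "zod";
import { PermissionConditionOperators } from "@app/lib/casl";

// Illustrative only; the real schema lives in permission-types.ts (see the hunk above).
const InConditionSchema = z.object({
  [PermissionConditionOperators.$IN]: z.string().trim().min(1).array()
});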

View File

@ -1,7 +1,7 @@
import { Logger } from "pino";
import { z } from "zod";
import { removeTrailingSlash } from "../fn";
import { CustomLogger } from "../logger/logger";
import { zpStr } from "../zod";
export const GITLAB_URL = "https://gitlab.com";
@ -10,7 +10,7 @@ export const GITLAB_URL = "https://gitlab.com";
export const IS_PACKAGED = (process as any)?.pkg !== undefined;
const zodStrBool = z
.enum(["true", "false"])
.string()
.optional()
.transform((val) => val === "true");
@ -212,7 +212,7 @@ let envCfg: Readonly<z.infer<typeof envSchema>>;
export const getConfig = () => envCfg;
// cannot import singleton logger directly as it needs config to load various transport
export const initEnvConfig = (logger?: Logger) => {
export const initEnvConfig = (logger?: CustomLogger) => {
const parsedEnv = envSchema.safeParse(process.env);
if (!parsedEnv.success) {
(logger ?? console).error("Invalid environment variables. Check the error below");

View File

@ -1,6 +1,8 @@
/* eslint-disable @typescript-eslint/no-unsafe-argument */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
// logger follows a singleton pattern
// it's simply easier to use this way.
import { requestContext } from "@fastify/request-context";
import pino, { Logger } from "pino";
import { z } from "zod";
@ -13,14 +15,37 @@ const logLevelToSeverityLookup: Record<string, string> = {
"60": "CRITICAL"
};
// eslint-disable-next-line import/no-mutable-exports
export let logger: Readonly<Logger>;
// akhilmhdh:
// The logger is not placed in the main app config to avoid a circular dependency.
// The config requires the logger to display errors when an invalid environment is supplied.
// On the other hand, the logger needs the config to obtain credentials for AWS or other transports.
// By keeping the logger separate, it becomes an independent package.
// We define our own custom logger interface to enforce structure to the logging methods.
export interface CustomLogger extends Omit<Logger, "info" | "error" | "warn" | "debug"> {
info: {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(obj: unknown, msg?: string, ...args: any[]): void;
};
error: {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(obj: unknown, msg?: string, ...args: any[]): void;
};
warn: {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(obj: unknown, msg?: string, ...args: any[]): void;
};
debug: {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(obj: unknown, msg?: string, ...args: any[]): void;
};
}
// eslint-disable-next-line import/no-mutable-exports
export let logger: Readonly<CustomLogger>;
const loggerConfig = z.object({
AWS_CLOUDWATCH_LOG_GROUP_NAME: z.string().default("infisical-log-stream"),
AWS_CLOUDWATCH_LOG_REGION: z.string().default("us-east-1"),
@ -62,6 +87,17 @@ const redactedKeys = [
"config"
];
const UNKNOWN_REQUEST_ID = "UNKNOWN_REQUEST_ID";
const extractRequestId = () => {
try {
return requestContext.get("requestId") || UNKNOWN_REQUEST_ID;
} catch (err) {
console.log("failed to get request context", err);
return UNKNOWN_REQUEST_ID;
}
};
export const initLogger = async () => {
const cfg = loggerConfig.parse(process.env);
const targets: pino.TransportMultiOptions["targets"][number][] = [
@ -94,6 +130,30 @@ export const initLogger = async () => {
targets
});
const wrapLogger = (originalLogger: Logger): CustomLogger => {
// eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
originalLogger.info = (obj: unknown, msg?: string, ...args: any[]) => {
return originalLogger.child({ requestId: extractRequestId() }).info(obj, msg, ...args);
};
// eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
originalLogger.error = (obj: unknown, msg?: string, ...args: any[]) => {
return originalLogger.child({ requestId: extractRequestId() }).error(obj, msg, ...args);
};
// eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
originalLogger.warn = (obj: unknown, msg?: string, ...args: any[]) => {
return originalLogger.child({ requestId: extractRequestId() }).warn(obj, msg, ...args);
};
// eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
originalLogger.debug = (obj: unknown, msg?: string, ...args: any[]) => {
return originalLogger.child({ requestId: extractRequestId() }).debug(obj, msg, ...args);
};
return originalLogger;
};
logger = pino(
{
mixin(_context, level) {
@ -113,5 +173,6 @@ export const initLogger = async () => {
// eslint-disable-next-line @typescript-eslint/no-unsafe-argument
transport
);
return logger;
return wrapLogger(logger);
};
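
The net effect: every existing call site keeps its signature, but each line is emitted through a child logger tagged with the current request's ID. A hedged usage sketch (module path assumed):

import { initLogger, logger } from "@app/lib/logger";

await initLogger();
// Inside a request, extractRequestId() reads requestId from @fastify/request-context;
// outside of one, the line is tagged with UNKNOWN_REQUEST_ID instead.
logger.info({ userId: "u_123" }, "fetched user");
// => { "requestId": "req-...", "userId": "u_123", "msg": "fetched user", ... }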

View File

@ -10,13 +10,15 @@ import fastifyFormBody from "@fastify/formbody";
import helmet from "@fastify/helmet";
import type { FastifyRateLimitOptions } from "@fastify/rate-limit";
import ratelimiter from "@fastify/rate-limit";
import { fastifyRequestContext } from "@fastify/request-context";
import fastify from "fastify";
import { Knex } from "knex";
import { Logger } from "pino";
import { HsmModule } from "@app/ee/services/hsm/hsm-types";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig, IS_PACKAGED } from "@app/lib/config/env";
import { CustomLogger } from "@app/lib/logger/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TQueueServiceFactory } from "@app/queue";
import { TSmtpService } from "@app/services/smtp/smtp-service";
@ -35,7 +37,7 @@ type TMain = {
auditLogDb?: Knex;
db: Knex;
smtp: TSmtpService;
logger?: Logger;
logger?: CustomLogger;
queue: TQueueServiceFactory;
keyStore: TKeyStoreFactory;
hsmModule: HsmModule;
@ -47,7 +49,9 @@ export const main = async ({ db, hsmModule, auditLogDb, smtp, logger, queue, key
const server = fastify({
logger: appCfg.NODE_ENV === "test" ? false : logger,
genReqId: () => `req-${alphaNumericNanoId(14)}`,
trustProxy: true,
connectionTimeout: appCfg.isHsmConfigured ? 90_000 : 30_000,
ignoreTrailingSlash: true,
pluginTimeout: 40_000
@ -104,6 +108,13 @@ export const main = async ({ db, hsmModule, auditLogDb, smtp, logger, queue, key
await server.register(maintenanceMode);
await server.register(fastifyRequestContext, {
defaultStoreValues: (request) => ({
requestId: request.id,
log: request.log.child({ requestId: request.id })
})
});
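// With the plugin registered, any code on the request path can read the store
// (sketch; "requestId" and "log" are the keys seeded by defaultStoreValues above):
//   const requestId = requestContext.get("requestId");
//   requestContext.get("log")?.info("handling request");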
await server.register(registerRoutes, { smtp, queue, db, auditLogDb, keyStore, hsmModule });
if (appCfg.isProductionMode) {

View File

@ -1,4 +1,4 @@
import { ForbiddenError } from "@casl/ability";
import { ForbiddenError, PureAbility } from "@casl/ability";
import fastifyPlugin from "fastify-plugin";
import jwt from "jsonwebtoken";
import { ZodError } from "zod";
@ -39,77 +39,102 @@ export const fastifyErrHandler = fastifyPlugin(async (server: FastifyZodProvider
if (error instanceof BadRequestError) {
void res
.status(HttpStatusCodes.BadRequest)
.send({ statusCode: HttpStatusCodes.BadRequest, message: error.message, error: error.name });
.send({ requestId: req.id, statusCode: HttpStatusCodes.BadRequest, message: error.message, error: error.name });
} else if (error instanceof NotFoundError) {
void res
.status(HttpStatusCodes.NotFound)
.send({ statusCode: HttpStatusCodes.NotFound, message: error.message, error: error.name });
.send({ requestId: req.id, statusCode: HttpStatusCodes.NotFound, message: error.message, error: error.name });
} else if (error instanceof UnauthorizedError) {
void res
.status(HttpStatusCodes.Unauthorized)
.send({ statusCode: HttpStatusCodes.Unauthorized, message: error.message, error: error.name });
void res.status(HttpStatusCodes.Unauthorized).send({
requestId: req.id,
statusCode: HttpStatusCodes.Unauthorized,
message: error.message,
error: error.name
});
} else if (error instanceof DatabaseError || error instanceof InternalServerError) {
void res
.status(HttpStatusCodes.InternalServerError)
.send({ statusCode: HttpStatusCodes.InternalServerError, message: "Something went wrong", error: error.name });
void res.status(HttpStatusCodes.InternalServerError).send({
requestId: req.id,
statusCode: HttpStatusCodes.InternalServerError,
message: "Something went wrong",
error: error.name
});
} else if (error instanceof GatewayTimeoutError) {
void res
.status(HttpStatusCodes.GatewayTimeout)
.send({ statusCode: HttpStatusCodes.GatewayTimeout, message: error.message, error: error.name });
void res.status(HttpStatusCodes.GatewayTimeout).send({
requestId: req.id,
statusCode: HttpStatusCodes.GatewayTimeout,
message: error.message,
error: error.name
});
} else if (error instanceof ZodError) {
void res
.status(HttpStatusCodes.Unauthorized)
.send({ statusCode: HttpStatusCodes.Unauthorized, error: "ValidationFailure", message: error.issues });
void res.status(HttpStatusCodes.Unauthorized).send({
requestId: req.id,
statusCode: HttpStatusCodes.Unauthorized,
error: "ValidationFailure",
message: error.issues
});
} else if (error instanceof ForbiddenError) {
void res.status(HttpStatusCodes.Forbidden).send({
requestId: req.id,
statusCode: HttpStatusCodes.Forbidden,
error: "PermissionDenied",
message: `You are not allowed to ${error.action} on ${error.subjectType} - ${JSON.stringify(error.subject)}`
message: `You are not allowed to ${error.action} on ${error.subjectType}`,
details: (error.ability as PureAbility).rulesFor(error.action as string, error.subjectType).map((el) => ({
action: el.action,
inverted: el.inverted,
subject: el.subject,
conditions: el.conditions
}))
});
} else if (error instanceof ForbiddenRequestError) {
void res.status(HttpStatusCodes.Forbidden).send({
requestId: req.id,
statusCode: HttpStatusCodes.Forbidden,
message: error.message,
error: error.name
});
} else if (error instanceof RateLimitError) {
void res.status(HttpStatusCodes.TooManyRequests).send({
requestId: req.id,
statusCode: HttpStatusCodes.TooManyRequests,
message: error.message,
error: error.name
});
} else if (error instanceof ScimRequestError) {
void res.status(error.status).send({
requestId: req.id,
schemas: error.schemas,
status: error.status,
detail: error.detail
});
} else if (error instanceof OidcAuthError) {
void res
.status(HttpStatusCodes.InternalServerError)
.send({ statusCode: HttpStatusCodes.InternalServerError, message: error.message, error: error.name });
void res.status(HttpStatusCodes.InternalServerError).send({
requestId: req.id,
statusCode: HttpStatusCodes.InternalServerError,
message: error.message,
error: error.name
});
} else if (error instanceof jwt.JsonWebTokenError) {
const message = (() => {
if (error.message === JWTErrors.JwtExpired) {
return "Your token has expired. Please re-authenticate.";
}
if (error.message === JWTErrors.JwtMalformed) {
return "The provided access token is malformed. Please use a valid token or generate a new one and try again.";
}
if (error.message === JWTErrors.InvalidAlgorithm) {
return "The access token is signed with an invalid algorithm. Please provide a valid token and try again.";
}
let errorMessage = error.message;
return error.message;
})();
if (error.message === JWTErrors.JwtExpired) {
errorMessage = "Your token has expired. Please re-authenticate.";
} else if (error.message === JWTErrors.JwtMalformed) {
errorMessage =
"The provided access token is malformed. Please use a valid token or generate a new one and try again.";
} else if (error.message === JWTErrors.InvalidAlgorithm) {
errorMessage =
"The access token is signed with an invalid algorithm. Please provide a valid token and try again.";
}
void res.status(HttpStatusCodes.Forbidden).send({
requestId: req.id,
statusCode: HttpStatusCodes.Forbidden,
error: "TokenError",
message
message: errorMessage
});
} else {
void res.status(HttpStatusCodes.InternalServerError).send({
requestId: req.id,
statusCode: HttpStatusCodes.InternalServerError,
error: "InternalServerError",
message: "Something went wrong"

View File

@ -19,7 +19,7 @@ export const registerSecretScannerGhApp = async (server: FastifyZodProvider) =>
app.on("installation", async (context) => {
const { payload } = context;
logger.info("Installed secret scanner to:", { repositories: payload.repositories });
logger.info({ repositories: payload.repositories }, "Installed secret scanner to");
});
app.on("push", async (context) => {

View File

@ -30,26 +30,32 @@ export const integrationAuthPubSchema = IntegrationAuthsSchema.pick({
export const DefaultResponseErrorsSchema = {
400: z.object({
requestId: z.string(),
statusCode: z.literal(400),
message: z.string(),
error: z.string()
}),
404: z.object({
requestId: z.string(),
statusCode: z.literal(404),
message: z.string(),
error: z.string()
}),
401: z.object({
requestId: z.string(),
statusCode: z.literal(401),
message: z.any(),
error: z.string()
}),
403: z.object({
requestId: z.string(),
statusCode: z.literal(403),
message: z.string(),
details: z.any().optional(),
error: z.string()
}),
500: z.object({
requestId: z.string(),
statusCode: z.literal(500),
message: z.string(),
error: z.string()
@ -206,6 +212,7 @@ export const SanitizedAuditLogStreamSchema = z.object({
export const SanitizedProjectSchema = ProjectsSchema.pick({
id: true,
name: true,
description: true,
slug: true,
autoCapitalization: true,
orgId: true,

View File

@ -5,6 +5,7 @@ import { INTEGRATION_AUTH } from "@app/lib/api-docs";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { OctopusDeployScope } from "@app/services/integration-auth/integration-auth-types";
import { integrationAuthPubSchema } from "../sanitizedSchemas";
@ -1008,4 +1009,118 @@ export const registerIntegrationAuthRouter = async (server: FastifyZodProvider)
return { buildConfigs };
}
});
server.route({
method: "GET",
url: "/:integrationAuthId/octopus-deploy/scope-values",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
params: z.object({
integrationAuthId: z.string().trim()
}),
querystring: z.object({
scope: z.nativeEnum(OctopusDeployScope),
spaceId: z.string().trim(),
resourceId: z.string().trim()
}),
response: {
200: z.object({
Environments: z
.object({
Name: z.string(),
Id: z.string()
})
.array(),
Machines: z
.object({
Name: z.string(),
Id: z.string()
})
.array(),
Actions: z
.object({
Name: z.string(),
Id: z.string()
})
.array(),
Roles: z
.object({
Name: z.string(),
Id: z.string()
})
.array(),
Channels: z
.object({
Name: z.string(),
Id: z.string()
})
.array(),
TenantTags: z
.object({
Name: z.string(),
Id: z.string()
})
.array(),
Processes: z
.object({
ProcessType: z.string(),
Name: z.string(),
Id: z.string()
})
.array()
})
}
},
handler: async (req) => {
const scopeValues = await server.services.integrationAuth.getOctopusDeployScopeValues({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.integrationAuthId,
scope: req.query.scope,
spaceId: req.query.spaceId,
resourceId: req.query.resourceId
});
return scopeValues;
}
});
server.route({
method: "GET",
url: "/:integrationAuthId/octopus-deploy/spaces",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
params: z.object({
integrationAuthId: z.string().trim()
}),
response: {
200: z.object({
spaces: z
.object({
Name: z.string(),
Id: z.string(),
IsDefault: z.boolean()
})
.array()
})
}
},
handler: async (req) => {
const spaces = await server.services.integrationAuth.getOctopusDeploySpaces({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.integrationAuthId
});
return { spaces };
}
});
};
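
A hedged example of calling the new spaces endpoint; the base URL and the route prefix this router is mounted under are assumptions, and the token is a placeholder:

const res = await fetch(
  "https://app.infisical.com/api/v1/integration-auth/<integrationAuthId>/octopus-deploy/spaces",
  { headers: { Authorization: "Bearer <jwt>" } }
);
const { spaces } = (await res.json()) as { spaces: { Name: string; Id: string; IsDefault: boolean }[] };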

View File

@ -9,6 +9,7 @@ import { getTelemetryDistinctId } from "@app/server/lib/telemetry";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { IntegrationMetadataSchema } from "@app/services/integration/integration-schema";
import { Integrations } from "@app/services/integration-auth/integration-list";
import { PostHogEventTypes, TIntegrationCreatedEvent } from "@app/services/telemetry/telemetry-types";
import {} from "../sanitizedSchemas";
@ -206,6 +207,33 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
id: req.params.integrationId
});
if (integration.region) {
integration.metadata = {
...(integration.metadata || {}),
region: integration.region
};
}
if (
integration.integration === Integrations.AWS_SECRET_MANAGER ||
integration.integration === Integrations.AWS_PARAMETER_STORE
) {
const awsRoleDetails = await server.services.integration.getIntegrationAWSIamRole({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.integrationId
});
if (awsRoleDetails) {
integration.metadata = {
...(integration.metadata || {}),
awsIamRole: awsRoleDetails.role
};
}
}
return { integration };
}
});

View File

@ -296,6 +296,12 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
.max(64, { message: "Name must be 64 or fewer characters" })
.optional()
.describe(PROJECTS.UPDATE.name),
description: z
.string()
.trim()
.max(256, { message: "Description must be 256 or fewer characters" })
.optional()
.describe(PROJECTS.UPDATE.projectDescription),
autoCapitalization: z.boolean().optional().describe(PROJECTS.UPDATE.autoCapitalization)
}),
response: {
@ -313,6 +319,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
},
update: {
name: req.body.name,
description: req.body.description,
autoCapitalization: req.body.autoCapitalization
},
actorAuthMethod: req.permission.authMethod,

View File

@ -14,10 +14,12 @@ import { Strategy as GoogleStrategy } from "passport-google-oauth20";
import { z } from "zod";
import { getConfig } from "@app/lib/config/env";
import { NotFoundError } from "@app/lib/errors";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { logger } from "@app/lib/logger";
import { fetchGithubEmails } from "@app/lib/requests/github";
import { authRateLimit } from "@app/server/config/rateLimiter";
import { AuthMethod } from "@app/services/auth/auth-type";
import { OrgAuthMethod } from "@app/services/org/org-types";
export const registerSsoRouter = async (server: FastifyZodProvider) => {
const appCfg = getConfig();
@ -196,6 +198,44 @@ export const registerSsoRouter = async (server: FastifyZodProvider) => {
handler: () => {}
});
server.route({
url: "/redirect/organizations/:orgSlug",
method: "GET",
config: {
rateLimit: authRateLimit
},
schema: {
params: z.object({
orgSlug: z.string().trim()
}),
querystring: z.object({
callback_port: z.string().optional()
})
},
handler: async (req, res) => {
const org = await server.services.org.findOrgBySlug(req.params.orgSlug);
if (org.orgAuthMethod === OrgAuthMethod.SAML) {
return res.redirect(
`${appCfg.SITE_URL}/api/v1/sso/redirect/saml2/organizations/${org.slug}?${
req.query.callback_port ? `callback_port=${req.query.callback_port}` : ""
}`
);
}
if (org.orgAuthMethod === OrgAuthMethod.OIDC) {
return res.redirect(
`${appCfg.SITE_URL}/api/v1/sso/oidc/login?orgSlug=${org.slug}${
req.query.callback_port ? `&callbackPort=${req.query.callback_port}` : ""
}`
);
}
throw new BadRequestError({
message: "The organization does not have any SSO configured."
});
}
});
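// Resolution of the new generic redirect (illustrative org slug and port):
//   GET /api/v1/sso/redirect/organizations/acme?callback_port=8080
//     SAML org  -> 302 {SITE_URL}/api/v1/sso/redirect/saml2/organizations/acme?callback_port=8080
//     OIDC org  -> 302 {SITE_URL}/api/v1/sso/oidc/login?orgSlug=acme&callbackPort=8080
//     otherwise -> 400 "The organization does not have any SSO configured."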
server.route({
url: "/github",
method: "GET",

View File

@ -161,6 +161,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
],
body: z.object({
projectName: z.string().trim().describe(PROJECTS.CREATE.projectName),
projectDescription: z.string().trim().optional().describe(PROJECTS.CREATE.projectDescription),
slug: z
.string()
.min(5)
@ -194,6 +195,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
workspaceName: req.body.projectName,
workspaceDescription: req.body.projectDescription,
slug: req.body.slug,
kmsKeyId: req.body.kmsKeyId,
template: req.body.template
@ -312,8 +314,9 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
slug: slugSchema.describe("The slug of the project to update.")
}),
body: z.object({
name: z.string().trim().optional().describe("The new name of the project."),
autoCapitalization: z.boolean().optional().describe("The new auto-capitalization setting.")
name: z.string().trim().optional().describe(PROJECTS.UPDATE.name),
description: z.string().trim().optional().describe(PROJECTS.UPDATE.projectDescription),
autoCapitalization: z.boolean().optional().describe(PROJECTS.UPDATE.autoCapitalization)
}),
response: {
200: SanitizedProjectSchema
@ -330,6 +333,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
},
update: {
name: req.body.name,
description: req.body.description,
autoCapitalization: req.body.autoCapitalization
},
actorId: req.permission.id,

View File

@ -119,13 +119,6 @@ export const registerSignupRouter = async (server: FastifyZodProvider) => {
if (!userAgent) throw new Error("user agent header is required");
const appCfg = getConfig();
const serverCfg = await getServerCfg();
if (!serverCfg.allowSignUp) {
throw new ForbiddenRequestError({
message: "Signup's are disabled"
});
}
const { user, accessToken, refreshToken, organizationId } =
await server.services.signup.completeEmailAccountSignup({
...req.body,

View File

@ -9,7 +9,7 @@ import { isAuthMethodSaml } from "@app/ee/services/permission/permission-fns";
import { getConfig } from "@app/lib/config/env";
import { infisicalSymmetricDecrypt, infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
import { generateUserSrpKeys, getUserPrivateKey } from "@app/lib/crypto/srp";
import { NotFoundError } from "@app/lib/errors";
import { ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
import { isDisposableEmail } from "@app/lib/validator";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
@ -23,6 +23,7 @@ import { TOrgServiceFactory } from "../org/org-service";
import { TProjectMembershipDALFactory } from "../project-membership/project-membership-dal";
import { TProjectUserMembershipRoleDALFactory } from "../project-membership/project-user-membership-role-dal";
import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
import { getServerCfg } from "../super-admin/super-admin-service";
import { TUserDALFactory } from "../user/user-dal";
import { UserEncryption } from "../user/user-types";
import { TAuthDALFactory } from "./auth-dal";
@ -151,6 +152,8 @@ export const authSignupServiceFactory = ({
authorization
}: TCompleteAccountSignupDTO) => {
const appCfg = getConfig();
const serverCfg = await getServerCfg();
const user = await userDAL.findOne({ username: email });
if (!user || (user && user.isAccepted)) {
throw new Error("Failed to complete account for complete user");
@ -163,6 +166,12 @@ export const authSignupServiceFactory = ({
authMethod = userAuthMethod;
organizationId = orgId;
} else {
// Disallow signup if it is disabled. This check is skipped for providerAuthToken because signups via SAML or SSO are always allowed.
if (!serverCfg.allowSignUp) {
throw new ForbiddenRequestError({
message: "Signup's are disabled"
});
}
validateSignUpAuthorization(authorization, user.id);
}

View File

@ -120,7 +120,8 @@ export const identityKubernetesAuthServiceFactory = ({
apiVersion: "authentication.k8s.io/v1",
kind: "TokenReview",
spec: {
token: serviceAccountJwt
token: serviceAccountJwt,
...(identityKubernetesAuth.allowedAudience ? { audiences: [identityKubernetesAuth.allowedAudience] } : {})
}
},
{
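// With allowedAudience configured, the TokenReview body POSTed to the Kubernetes
// API now pins the accepted audience; roughly (values illustrative):
//   { apiVersion: "authentication.k8s.io/v1", kind: "TokenReview",
//     spec: { token: "<service-account-jwt>", audiences: ["<allowed-audience>"] } }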

View File

@ -182,7 +182,12 @@ export const identityProjectServiceFactory = ({
// validate custom roles input
const customInputRoles = roles.filter(
({ role }) => !Object.values(ProjectMembershipRole).includes(role as ProjectMembershipRole)
({ role }) =>
!Object.values(ProjectMembershipRole)
// exclude the reserved "custom" slug from this check;
// otherwise a role slug of "custom" would be treated as predefined and slip through
.filter((r) => r !== ProjectMembershipRole.Custom)
.includes(role as ProjectMembershipRole)
);
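// Why the extra filter matters (sketch, assuming ProjectMembershipRole.Custom === "custom"):
//   before: { role: "custom" } matched the enum, skipped custom-role validation,
//           and the reserved slug slipped through as if it were predefined
//   after:  { role: "custom" } falls into customInputRoles and is validated
//           (and rejected) like any other custom role slug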
const hasCustomRole = Boolean(customInputRoles.length);
const customRoles = hasCustomRole

View File

@ -385,8 +385,8 @@ export const identityTokenAuthServiceFactory = ({
actorOrgId
}: TUpdateTokenAuthTokenDTO) => {
const foundToken = await identityAccessTokenDAL.findOne({
id: tokenId,
authMethod: IdentityAuthMethod.TOKEN_AUTH
[`${TableName.IdentityAccessToken}.id` as "id"]: tokenId,
[`${TableName.IdentityAccessToken}.authMethod` as "authMethod"]: IdentityAuthMethod.TOKEN_AUTH
});
if (!foundToken) throw new NotFoundError({ message: `Token with ID ${tokenId} not found` });
@ -444,8 +444,8 @@ export const identityTokenAuthServiceFactory = ({
}: TRevokeTokenAuthTokenDTO) => {
const identityAccessToken = await identityAccessTokenDAL.findOne({
[`${TableName.IdentityAccessToken}.id` as "id"]: tokenId,
isAccessTokenRevoked: false,
authMethod: IdentityAuthMethod.TOKEN_AUTH
[`${TableName.IdentityAccessToken}.isAccessTokenRevoked` as "isAccessTokenRevoked"]: false,
[`${TableName.IdentityAccessToken}.authMethod` as "authMethod"]: IdentityAuthMethod.TOKEN_AUTH
});
if (!identityAccessToken)
throw new NotFoundError({

View File

@ -1,6 +1,7 @@
/* eslint-disable no-await-in-loop */
import { createAppAuth } from "@octokit/auth-app";
import { Octokit } from "@octokit/rest";
import { Client as OctopusDeployClient, ProjectRepository as OctopusDeployRepository } from "@octopusdeploy/api-client";
import { TIntegrationAuths } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env";
@ -1087,6 +1088,33 @@ const getAppsAzureDevOps = async ({ accessToken, orgName }: { accessToken: strin
return apps;
};
const getAppsOctopusDeploy = async ({
apiKey,
instanceURL,
spaceName = "Default"
}: {
apiKey: string;
instanceURL: string;
spaceName?: string;
}) => {
const client = await OctopusDeployClient.create({
instanceURL,
apiKey,
userAgentApp: "Infisical Integration"
});
const repository = new OctopusDeployRepository(client, spaceName);
const projects = await repository.list({
take: 1000
});
return projects.Items.map((project) => ({
name: project.Name,
appId: project.Id
}));
};
export const getApps = async ({
integration,
integrationAuth,
@ -1260,6 +1288,13 @@ export const getApps = async ({
orgName: azureDevOpsOrgName as string
});
case Integrations.OCTOPUS_DEPLOY:
return getAppsOctopusDeploy({
apiKey: accessToken,
instanceURL: url!,
spaceName: workspaceSlug
});
default:
throw new NotFoundError({ message: `Integration '${integration}' not found` });
}

View File

@ -1,6 +1,7 @@
import { ForbiddenError } from "@casl/ability";
import { createAppAuth } from "@octokit/auth-app";
import { Octokit } from "@octokit/rest";
import { Client as OctopusClient, SpaceRepository as OctopusSpaceRepository } from "@octopusdeploy/api-client";
import AWS from "aws-sdk";
import { SecretEncryptionAlgo, SecretKeyEncoding, TIntegrationAuths, TIntegrationAuthsInsert } from "@app/db/schemas";
@ -9,7 +10,7 @@ import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services
import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { decryptSymmetric128BitHexKeyUTF8, encryptSymmetric128BitHexKeyUTF8 } from "@app/lib/crypto";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import { BadRequestError, InternalServerError, NotFoundError } from "@app/lib/errors";
import { TGenericPermission, TProjectPermission } from "@app/lib/types";
import { TIntegrationDALFactory } from "../integration/integration-dal";
@ -20,6 +21,7 @@ import { getApps } from "./integration-app-list";
import { TIntegrationAuthDALFactory } from "./integration-auth-dal";
import { IntegrationAuthMetadataSchema, TIntegrationAuthMetadata } from "./integration-auth-schema";
import {
OctopusDeployScope,
TBitbucketEnvironment,
TBitbucketWorkspace,
TChecklyGroups,
@ -38,6 +40,8 @@ import {
TIntegrationAuthGithubOrgsDTO,
TIntegrationAuthHerokuPipelinesDTO,
TIntegrationAuthNorthflankSecretGroupDTO,
TIntegrationAuthOctopusDeployProjectScopeValuesDTO,
TIntegrationAuthOctopusDeploySpacesDTO,
TIntegrationAuthQoveryEnvironmentsDTO,
TIntegrationAuthQoveryOrgsDTO,
TIntegrationAuthQoveryProjectDTO,
@ -48,6 +52,7 @@ import {
TIntegrationAuthVercelBranchesDTO,
TNorthflankSecretGroup,
TOauthExchangeDTO,
TOctopusDeployVariableSet,
TSaveIntegrationAccessTokenDTO,
TTeamCityBuildConfig,
TVercelBranches
@ -1521,6 +1526,88 @@ export const integrationAuthServiceFactory = ({
return integrationAuthDAL.create(newIntegrationAuth);
};
const getOctopusDeploySpaces = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
id
}: TIntegrationAuthOctopusDeploySpacesDTO) => {
const integrationAuth = await integrationAuthDAL.findById(id);
if (!integrationAuth) throw new NotFoundError({ message: `Integration auth with ID '${id}' not found` });
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
integrationAuth.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
const { shouldUseSecretV2Bridge, botKey } = await projectBotService.getBotKey(integrationAuth.projectId);
const { accessToken } = await getIntegrationAccessToken(integrationAuth, shouldUseSecretV2Bridge, botKey);
const client = await OctopusClient.create({
apiKey: accessToken,
instanceURL: integrationAuth.url!,
userAgentApp: "Infisical Integration"
});
const spaceRepository = new OctopusSpaceRepository(client);
const spaces = await spaceRepository.list({
partialName: "", // the SDK throws if partialName is omitted
take: 1000
});
return spaces.Items;
};
const getOctopusDeployScopeValues = async ({
actorId,
actor,
actorOrgId,
actorAuthMethod,
id,
scope,
spaceId,
resourceId
}: TIntegrationAuthOctopusDeployProjectScopeValuesDTO) => {
const integrationAuth = await integrationAuthDAL.findById(id);
if (!integrationAuth) throw new NotFoundError({ message: `Integration auth with ID '${id}' not found` });
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
integrationAuth.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
const { shouldUseSecretV2Bridge, botKey } = await projectBotService.getBotKey(integrationAuth.projectId);
const { accessToken } = await getIntegrationAccessToken(integrationAuth, shouldUseSecretV2Bridge, botKey);
let url: string;
switch (scope) {
case OctopusDeployScope.Project:
url = `${integrationAuth.url}/api/${spaceId}/projects/${resourceId}/variables`;
break;
// future: support tenants, variable sets, etc.
default:
throw new InternalServerError({ message: `Unhandled Octopus Deploy scope` });
}
// the SDK doesn't support variable sets, so we call the REST endpoint directly
const { data: variableSet } = await request.get<TOctopusDeployVariableSet>(url, {
headers: {
"X-NuGet-ApiKey": accessToken,
Accept: "application/json"
}
});
return variableSet.ScopeValues;
};
return {
listIntegrationAuthByProjectId,
listOrgIntegrationAuth,
@ -1552,6 +1639,8 @@ export const integrationAuthServiceFactory = ({
getBitbucketWorkspaces,
getBitbucketEnvironments,
getIntegrationAccessToken,
duplicateIntegrationAuth
duplicateIntegrationAuth,
getOctopusDeploySpaces,
getOctopusDeployScopeValues
};
};

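Because the Octopus SDK has no variable-set support, getOctopusDeployScopeValues calls the REST endpoint directly. A minimal sketch of that request, with placeholder IDs and axios standing in for the app's request wrapper:

import axios from "axios";

const getScopeValues = async (apiKey: string) => {
  // space and project IDs are placeholders; the service builds this URL from the DTO
  const url = "https://octopus.example.com/api/Spaces-1/projects/Projects-123/variables";

  const { data: variableSet } = await axios.get(url, {
    headers: {
      "X-NuGet-ApiKey": apiKey, // Octopus accepts the API key via this header
      Accept: "application/json"
    }
  });

  // ScopeValues lists the environments, machines, roles, etc. a variable may target
  return variableSet.ScopeValues;
};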
View File

@ -193,3 +193,72 @@ export type TIntegrationsWithEnvironment = TIntegrations & {
| null
| undefined;
};
export type TIntegrationAuthOctopusDeploySpacesDTO = {
id: string;
} & Omit<TProjectPermission, "projectId">;
export type TIntegrationAuthOctopusDeployProjectScopeValuesDTO = {
id: string;
spaceId: string;
resourceId: string;
scope: OctopusDeployScope;
} & Omit<TProjectPermission, "projectId">;
export enum OctopusDeployScope {
Project = "project"
// add tenant, variable set, etc.
}
export type TOctopusDeployVariableSet = {
Id: string;
OwnerId: string;
Version: number;
Variables: {
Id: string;
Name: string;
Value: string;
Description: string;
Scope: {
Environment?: string[];
Machine?: string[];
Role?: string[];
TargetRole?: string[];
Action?: string[];
User?: string[];
Trigger?: string[];
ParentDeployment?: string[];
Private?: string[];
Channel?: string[];
TenantTag?: string[];
Tenant?: string[];
ProcessOwner?: string[];
};
IsEditable: boolean;
Prompt: {
Description: string;
DisplaySettings: Record<string, string>;
Label: string;
Required: boolean;
} | null;
Type: "String";
IsSensitive: boolean;
}[];
ScopeValues: {
Environments: { Id: string; Name: string }[];
Machines: { Id: string; Name: string }[];
Actions: { Id: string; Name: string }[];
Roles: { Id: string; Name: string }[];
Channels: { Id: string; Name: string }[];
TenantTags: { Id: string; Name: string }[];
Processes: {
ProcessType: string;
Id: string;
Name: string;
}[];
};
SpaceId: string;
Links: {
Self: string;
};
};

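Illustrative only: a single variable entry that satisfies this type, in the shape the sync code later PUTs back to Octopus (the relative import path is an assumption):

import type { TOctopusDeployVariableSet } from "./integration-auth-types";

const variable: TOctopusDeployVariableSet["Variables"][number] = {
  Id: "", // Octopus assigns IDs to new variables
  Name: "DATABASE_URL",
  Value: "postgres://user:pass@host:5432/db",
  Description: "synced from Infisical",
  Scope: { Environment: ["Environments-1"] }, // limit the variable to one environment
  IsEditable: false,
  Prompt: null,
  Type: "String",
  IsSensitive: true
};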
View File

@ -34,7 +34,8 @@ export enum Integrations {
HASURA_CLOUD = "hasura-cloud",
RUNDECK = "rundeck",
AZURE_DEVOPS = "azure-devops",
AZURE_APP_CONFIGURATION = "azure-app-configuration"
AZURE_APP_CONFIGURATION = "azure-app-configuration",
OCTOPUS_DEPLOY = "octopus-deploy"
}
export enum IntegrationType {
@ -413,6 +414,15 @@ export const getIntegrationOptions = async () => {
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Octopus Deploy",
slug: "octopus-deploy",
image: "Octopus Deploy.png",
isAvailable: true,
type: "sat",
clientId: "",
docsLink: ""
}
];

View File

@ -32,14 +32,14 @@ import { z } from "zod";
import { SecretType, TIntegrationAuths, TIntegrations } from "@app/db/schemas";
import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { BadRequestError } from "@app/lib/errors";
import { BadRequestError, InternalServerError } from "@app/lib/errors";
import { logger } from "@app/lib/logger";
import { TCreateManySecretsRawFn, TUpdateManySecretsRawFn } from "@app/services/secret/secret-types";
import { TIntegrationDALFactory } from "../integration/integration-dal";
import { IntegrationMetadataSchema } from "../integration/integration-schema";
import { IntegrationAuthMetadataSchema } from "./integration-auth-schema";
import { TIntegrationsWithEnvironment } from "./integration-auth-types";
import { OctopusDeployScope, TIntegrationsWithEnvironment, TOctopusDeployVariableSet } from "./integration-auth-types";
import {
IntegrationInitialSyncBehavior,
IntegrationMappingBehavior,
@ -473,7 +473,7 @@ const syncSecretsAzureKeyVault = async ({
id: string; // secret URI
value: string;
attributes: {
enabled: true;
enabled: boolean;
created: number;
updated: number;
recoveryLevel: string;
@ -509,10 +509,19 @@ const syncSecretsAzureKeyVault = async ({
const getAzureKeyVaultSecrets = await paginateAzureKeyVaultSecrets(`${integration.app}/secrets?api-version=7.3`);
const enabledAzureKeyVaultSecrets = getAzureKeyVaultSecrets.filter((secret) => secret.attributes.enabled);
// disabled keys to skip sending updates to
const disabledAzureKeyVaultSecretKeys = getAzureKeyVaultSecrets
.filter(({ attributes }) => !attributes.enabled)
.map((getAzureKeyVaultSecret) => {
return getAzureKeyVaultSecret.id.substring(getAzureKeyVaultSecret.id.lastIndexOf("/") + 1);
});
let lastSlashIndex: number;
const res = (
await Promise.all(
getAzureKeyVaultSecrets.map(async (getAzureKeyVaultSecret) => {
enabledAzureKeyVaultSecrets.map(async (getAzureKeyVaultSecret) => {
if (!lastSlashIndex) {
lastSlashIndex = getAzureKeyVaultSecret.id.lastIndexOf("/");
}
@ -658,6 +667,7 @@ const syncSecretsAzureKeyVault = async ({
}) => {
let isSecretSet = false;
let maxTries = 6;
if (disabledAzureKeyVaultSecretKeys.includes(key)) return;
while (!isSecretSet && maxTries > 0) {
// try to set secret
@ -3075,7 +3085,7 @@ const syncSecretsTerraformCloud = async ({
}) => {
// get secrets from Terraform Cloud
const terraformSecrets = (
await request.get<{ data: { attributes: { key: string; value: string }; id: string }[] }>(
await request.get<{ data: { attributes: { key: string; value: string; sensitive: boolean }; id: string }[] }>(
`${IntegrationUrls.TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars`,
{
headers: {
@ -3089,7 +3099,7 @@ const syncSecretsTerraformCloud = async ({
...obj,
[secret.attributes.key]: secret
}),
{} as Record<string, { attributes: { key: string; value: string }; id: string }>
{} as Record<string, { attributes: { key: string; value: string; sensitive: boolean }; id: string }>
);
const secretsToAdd: { [key: string]: string } = {};
@ -3170,7 +3180,8 @@ const syncSecretsTerraformCloud = async ({
attributes: {
key,
value: secrets[key]?.value,
category: integration.targetService
category: integration.targetService,
sensitive: true
}
}
},
@ -3183,7 +3194,11 @@ const syncSecretsTerraformCloud = async ({
}
);
// case: secret exists in Terraform Cloud
} else if (secrets[key]?.value !== terraformSecrets[key].attributes.value) {
} else if (
// we now set secrets to sensitive in Terraform Cloud, this checks if existing secrets are not sensitive and updates them accordingly
!terraformSecrets[key].attributes.sensitive ||
secrets[key]?.value !== terraformSecrets[key].attributes.value
) {
// -> update secret
await request.patch(
`${IntegrationUrls.TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars/${terraformSecrets[key].id}`,
@ -3193,7 +3208,8 @@ const syncSecretsTerraformCloud = async ({
id: terraformSecrets[key].id,
attributes: {
...terraformSecrets[key],
value: secrets[key]?.value
value: secrets[key]?.value,
sensitive: true
}
}
},
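Context for the Terraform Cloud change just above: variables are now written as sensitive, and the extra `!sensitive` condition re-writes older, non-sensitive variables even when their values match (a sensitive variable's value is write-only in the Terraform Cloud API, so it cannot be read back for comparison). A sketch of the underlying PATCH, with placeholder workspace and variable IDs:

import axios from "axios";

// marking an existing variable sensitive while updating its value
const markSensitive = async (token: string) => {
  await axios.patch(
    "https://app.terraform.io/api/v2/workspaces/ws-XXXXXX/vars/var-XXXXXX",
    {
      data: {
        type: "vars",
        id: "var-XXXXXX",
        attributes: { value: "new-value", sensitive: true }
      }
    },
    {
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/vnd.api+json" // Terraform Cloud uses JSON:API
      }
    }
  );
};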
@ -4195,6 +4211,61 @@ const syncSecretsRundeck = async ({
}
};
const syncSecretsOctopusDeploy = async ({
integration,
integrationAuth,
secrets,
accessToken
}: {
integration: TIntegrations;
integrationAuth: TIntegrationAuths;
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
}) => {
let url: string;
switch (integration.scope) {
case OctopusDeployScope.Project:
url = `${integrationAuth.url}/api/${integration.targetEnvironmentId}/projects/${integration.appId}/variables`;
break;
// future: support tenants, variable sets, etc.
default:
throw new InternalServerError({ message: `Unhandled Octopus Deploy scope: ${integration.scope}` });
}
// the SDK doesn't support variable sets, so we call the REST endpoint directly
const { data: variableSet } = await request.get<TOctopusDeployVariableSet>(url, {
headers: {
"X-NuGet-ApiKey": accessToken,
Accept: "application/json"
}
});
await request.put(
url,
{
...variableSet,
Variables: Object.entries(secrets).map(([key, value]) => ({
Name: key,
Value: value.value,
Description: value.comment ?? "",
Scope:
(integration.metadata as { octopusDeployScopeValues: TOctopusDeployVariableSet["ScopeValues"] })
?.octopusDeployScopeValues ?? {},
IsEditable: false,
Prompt: null,
Type: "String",
IsSensitive: true
}))
} as unknown as TOctopusDeployVariableSet,
{
headers: {
"X-NuGet-ApiKey": accessToken,
Accept: "application/json"
}
}
);
};
/**
* Sync/push [secrets] to [app] in integration named [integration]
*
@ -4507,6 +4578,14 @@ export const syncIntegrationSecrets = async ({
accessToken
});
break;
case Integrations.OCTOPUS_DEPLOY:
await syncSecretsOctopusDeploy({
integration,
integrationAuth,
secrets,
accessToken
});
break;
default:
throw new BadRequestError({ message: "Invalid integration" });
}

View File

@ -46,5 +46,18 @@ export const IntegrationMetadataSchema = z.object({
shouldDisableDelete: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldDisableDelete),
shouldEnableDelete: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldEnableDelete),
shouldMaskSecrets: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldMaskSecrets),
shouldProtectSecrets: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldProtectSecrets)
shouldProtectSecrets: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldProtectSecrets),
octopusDeployScopeValues: z
.object({
// in Octopus Deploy Scope Value Format
Environment: z.string().array().optional(),
Action: z.string().array().optional(),
Channel: z.string().array().optional(),
Machine: z.string().array().optional(),
ProcessOwner: z.string().array().optional(),
Role: z.string().array().optional()
})
.optional()
.describe(INTEGRATION.CREATE.metadata.octopusDeployScopeValues)
});

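A quick sketch of what the new octopusDeployScopeValues metadata field accepts, using just the relevant slice of the schema (the scope IDs are placeholders):

import { z } from "zod";

// the same shape as the octopusDeployScopeValues field above
const OctopusScopeValuesSchema = z
  .object({
    Environment: z.string().array().optional(),
    Role: z.string().array().optional()
  })
  .optional();

// scope every synced variable to one environment and one role
const parsed = OctopusScopeValuesSchema.parse({
  Environment: ["Environments-1"],
  Role: ["web-server"]
});
console.log(parsed);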
View File

@ -9,6 +9,7 @@ import { TIntegrationAuthDALFactory } from "../integration-auth/integration-auth
import { TIntegrationAuthServiceFactory } from "../integration-auth/integration-auth-service";
import { deleteIntegrationSecrets } from "../integration-auth/integration-delete-secret";
import { TKmsServiceFactory } from "../kms/kms-service";
import { KmsDataKey } from "../kms/kms-types";
import { TProjectBotServiceFactory } from "../project-bot/project-bot-service";
import { TSecretDALFactory } from "../secret/secret-dal";
import { TSecretQueueFactory } from "../secret/secret-queue";
@ -237,6 +238,46 @@ export const integrationServiceFactory = ({
return { ...integration, envId: integration.environment.id };
};
const getIntegrationAWSIamRole = async ({ id, actor, actorAuthMethod, actorId, actorOrgId }: TGetIntegrationDTO) => {
const integration = await integrationDAL.findById(id);
if (!integration) {
throw new NotFoundError({
message: `Integration with ID '${id}' not found`
});
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
integration?.projectId || "",
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
const integrationAuth = await integrationAuthDAL.findById(integration.integrationAuthId);
const { decryptor: secretManagerDecryptor } = await kmsService.createCipherPairWithDataKey({
type: KmsDataKey.SecretManager,
projectId: integration.projectId
});
let awsIamRole: string | null = null;
if (integrationAuth.encryptedAwsAssumeIamRoleArn) {
const awsAssumeRoleArn = secretManagerDecryptor({
cipherTextBlob: Buffer.from(integrationAuth.encryptedAwsAssumeIamRoleArn)
}).toString();
if (awsAssumeRoleArn) {
const [, role] = awsAssumeRoleArn.split(":role/");
awsIamRole = role;
}
}
return {
role: awsIamRole
};
};
const deleteIntegration = async ({
actorId,
id,
@ -329,6 +370,7 @@ export const integrationServiceFactory = ({
deleteIntegration,
listIntegrationByProject,
getIntegration,
getIntegrationAWSIamRole,
syncIntegration
};
};

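In getIntegrationAWSIamRole above, the role name is recovered from the stored assume-role ARN by splitting on ":role/". For illustration (the ARN is made up):

// everything after ":role/" is the role name, including any path segments
const awsAssumeRoleArn = "arn:aws:iam::123456789012:role/infisical/sync-role";
const [, role] = awsAssumeRoleArn.split(":role/");
console.log(role); // "infisical/sync-role"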
View File

@ -14,6 +14,8 @@ import { DatabaseError } from "@app/lib/errors";
import { buildFindFilter, ormify, selectAllTableCols, TFindFilter, TFindOpt, withTransaction } from "@app/lib/knex";
import { generateKnexQueryFromScim } from "@app/lib/knex/scim";
import { OrgAuthMethod } from "./org-types";
export type TOrgDALFactory = ReturnType<typeof orgDALFactory>;
export const orgDALFactory = (db: TDbClient) => {
@ -21,13 +23,78 @@ export const orgDALFactory = (db: TDbClient) => {
const findOrgById = async (orgId: string) => {
try {
const org = await db.replicaNode()(TableName.Organization).where({ id: orgId }).first();
const org = (await db
.replicaNode()(TableName.Organization)
.where({ [`${TableName.Organization}.id` as "id"]: orgId })
.leftJoin(TableName.SamlConfig, (qb) => {
qb.on(`${TableName.SamlConfig}.orgId`, "=", `${TableName.Organization}.id`).andOn(
`${TableName.SamlConfig}.isActive`,
"=",
db.raw("true")
);
})
.leftJoin(TableName.OidcConfig, (qb) => {
qb.on(`${TableName.OidcConfig}.orgId`, "=", `${TableName.Organization}.id`).andOn(
`${TableName.OidcConfig}.isActive`,
"=",
db.raw("true")
);
})
.select(selectAllTableCols(TableName.Organization))
.select(
db.raw(`
CASE
WHEN ${TableName.SamlConfig}."orgId" IS NOT NULL THEN '${OrgAuthMethod.SAML}'
WHEN ${TableName.OidcConfig}."orgId" IS NOT NULL THEN '${OrgAuthMethod.OIDC}'
ELSE ''
END as "orgAuthMethod"
`)
)
.first()) as TOrganizations & { orgAuthMethod?: string };
return org;
} catch (error) {
throw new DatabaseError({ error, name: "Find org by id" });
}
};
const findOrgBySlug = async (orgSlug: string) => {
try {
const org = (await db
.replicaNode()(TableName.Organization)
.where({ [`${TableName.Organization}.slug` as "slug"]: orgSlug })
.leftJoin(TableName.SamlConfig, (qb) => {
qb.on(`${TableName.SamlConfig}.orgId`, "=", `${TableName.Organization}.id`).andOn(
`${TableName.SamlConfig}.isActive`,
"=",
db.raw("true")
);
})
.leftJoin(TableName.OidcConfig, (qb) => {
qb.on(`${TableName.OidcConfig}.orgId`, "=", `${TableName.Organization}.id`).andOn(
`${TableName.OidcConfig}.isActive`,
"=",
db.raw("true")
);
})
.select(selectAllTableCols(TableName.Organization))
.select(
db.raw(`
CASE
WHEN ${TableName.SamlConfig}."orgId" IS NOT NULL THEN '${OrgAuthMethod.SAML}'
WHEN ${TableName.OidcConfig}."orgId" IS NOT NULL THEN '${OrgAuthMethod.OIDC}'
ELSE ''
END as "orgAuthMethod"
`)
)
.first()) as TOrganizations & { orgAuthMethod?: string };
return org;
} catch (error) {
throw new DatabaseError({ error, name: "Find org by slug" });
}
};
// special query
const findAllOrgsByUserId = async (userId: string): Promise<(TOrganizations & { orgAuthMethod: string })[]> => {
try {
@ -398,6 +465,7 @@ export const orgDALFactory = (db: TDbClient) => {
findAllOrgMembers,
countAllOrgMembers,
findOrgById,
findOrgBySlug,
findAllOrgsByUserId,
ghostUserExists,
findOrgMembersByUsername,

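Both org lookups now left-join the active SAML and OIDC config rows and project a single orgAuthMethod column from the CASE expression. A sketch of how a caller might consume it, assuming a DAL instance constructed by orgDALFactory:

import { OrgAuthMethod } from "./org-types";

const hasSsoConfigured = async (orgDAL: {
  findOrgBySlug: (slug: string) => Promise<{ orgAuthMethod?: string }>;
}) => {
  const org = await orgDAL.findOrgBySlug("acme");
  // orgAuthMethod is "" when neither an active SAML nor OIDC config row joined
  return org.orgAuthMethod === OrgAuthMethod.SAML || org.orgAuthMethod === OrgAuthMethod.OIDC;
};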
View File

@ -187,6 +187,15 @@ export const orgServiceFactory = ({
return members;
};
const findOrgBySlug = async (slug: string) => {
const org = await orgDAL.findOrgBySlug(slug);
if (!org) {
throw new NotFoundError({ message: `Organization with slug '${slug}' not found` });
}
return org;
};
const findAllWorkspaces = async ({ actor, actorId, orgId }: TFindAllWorkspacesDTO) => {
const organizationWorkspaceIds = new Set((await projectDAL.find({ orgId })).map((workspace) => workspace.id));
@ -275,6 +284,7 @@ export const orgServiceFactory = ({
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Settings);
const plan = await licenseService.getPlan(orgId);
const currentOrg = await orgDAL.findOrgById(actorOrgId);
if (enforceMfa !== undefined) {
if (!plan.enforceMfa) {
@ -305,6 +315,11 @@ export const orgServiceFactory = ({
"Failed to enable/disable SCIM provisioning due to plan restriction. Upgrade plan to enable/disable SCIM provisioning."
});
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Scim);
if (scimEnabled && !currentOrg.orgAuthMethod) {
throw new BadRequestError({
message: "Cannot enable SCIM when neither SAML or OIDC is configured."
});
}
}
if (authEnforced) {
@ -1132,6 +1147,7 @@ export const orgServiceFactory = ({
createIncidentContact,
deleteIncidentContact,
getOrgGroups,
listProjectMembershipsByOrgMembershipId
listProjectMembershipsByOrgMembershipId,
findOrgBySlug
};
};

View File

@ -74,3 +74,8 @@ export type TGetOrgGroupsDTO = TOrgPermission;
export type TListProjectMembershipsByOrgMembershipIdDTO = {
orgMembershipId: string;
} & TOrgPermission;
export enum OrgAuthMethod {
OIDC = "oidc",
SAML = "saml"
}

View File

@ -280,7 +280,12 @@ export const projectMembershipServiceFactory = ({
// validate custom roles input
const customInputRoles = roles.filter(
({ role }) => !Object.values(ProjectMembershipRole).includes(role as ProjectMembershipRole)
({ role }) =>
!Object.values(ProjectMembershipRole)
// we don't want to include custom in this check;
// otherwise, this would unintentionally allow setting the role slug to "custom", which is reserved
.filter((r) => r !== ProjectMembershipRole.Custom)
.includes(role as ProjectMembershipRole)
);
const hasCustomRole = Boolean(customInputRoles.length);
if (hasCustomRole) {

View File

@ -191,6 +191,10 @@ export const projectDALFactory = (db: TDbClient) => {
return project;
} catch (error) {
if (error instanceof NotFoundError) {
throw error;
}
throw new DatabaseError({ error, name: "Find all projects" });
}
};
@ -240,6 +244,10 @@ export const projectDALFactory = (db: TDbClient) => {
return project;
} catch (error) {
if (error instanceof NotFoundError || error instanceof UnauthorizedError) {
throw error;
}
throw new DatabaseError({ error, name: "Find project by slug" });
}
};
@ -260,7 +268,7 @@ export const projectDALFactory = (db: TDbClient) => {
}
throw new BadRequestError({ message: "Invalid filter type" });
} catch (error) {
if (error instanceof BadRequestError) {
if (error instanceof BadRequestError || error instanceof NotFoundError || error instanceof UnauthorizedError) {
throw error;
}
throw new DatabaseError({ error, name: `Failed to find project by ${filter.type}` });

View File

@ -285,11 +285,14 @@ export const projectQueueFactory = ({
if (!orgMembership) {
// This can happen. Since we don't remove project memberships and project keys when a user is removed from an org, this is a valid case.
logger.info("User is not in organization", {
userId: key.receiverId,
orgId: project.orgId,
projectId: project.id
});
logger.info(
{
userId: key.receiverId,
orgId: project.orgId,
projectId: project.id
},
"User is not in organization"
);
// eslint-disable-next-line no-continue
continue;
}
@ -551,10 +554,10 @@ export const projectQueueFactory = ({
.catch(() => [null]);
if (!project) {
logger.error("Failed to upgrade project, because no project was found", data);
logger.error(data, "Failed to upgrade project, because no project was found");
} else {
await projectDAL.setProjectUpgradeStatus(data.projectId, ProjectUpgradeStatus.Failed);
logger.error("Failed to upgrade project", err, {
logger.error(err, "Failed to upgrade project", {
extra: {
project,
jobData: data

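These logger changes fix the argument order: pino expects the merging object (or an Error) first and the message second. Passing the object last treats it as a format argument, so its fields never land on the JSON log line. A minimal sketch:

import pino from "pino";

const logger = pino();

// correct: fields are merged into the log entry
logger.info({ userId: "u1", orgId: "o1" }, "User is not in organization");

// correct: the Error is serialized under the "err" key
logger.error(new Error("upgrade failed"), "Failed to upgrade project");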
View File

@ -149,6 +149,7 @@ export const projectServiceFactory = ({
actorOrgId,
actorAuthMethod,
workspaceName,
workspaceDescription,
slug: projectSlug,
kmsKeyId,
tx: trx,
@ -206,6 +207,7 @@ export const projectServiceFactory = ({
const project = await projectDAL.create(
{
name: workspaceName,
description: workspaceDescription,
orgId: organization.id,
slug: projectSlug || slugify(`${workspaceName}-${alphaNumericNanoId(4)}`),
kmsSecretManagerKeyId: kmsKeyId,
@ -496,6 +498,7 @@ export const projectServiceFactory = ({
const updatedProject = await projectDAL.updateById(project.id, {
name: update.name,
description: update.description,
autoCapitalization: update.autoCapitalization
});
return updatedProject;

View File

@ -29,6 +29,7 @@ export type TCreateProjectDTO = {
actorId: string;
actorOrgId?: string;
workspaceName: string;
workspaceDescription?: string;
slug?: string;
kmsKeyId?: string;
createDefaultEnvs?: boolean;
@ -69,6 +70,7 @@ export type TUpdateProjectDTO = {
filter: Filter;
update: {
name?: string;
description?: string;
autoCapitalization?: boolean;
};
} & Omit<TProjectPermission, "projectId">;

View File

@ -365,9 +365,8 @@ export const recursivelyGetSecretPaths = async ({
folderId: p.folderId
}));
const pathsInCurrentDirectory = paths.filter((folder) =>
folder.path.startsWith(currentPath === "/" ? "" : currentPath)
);
// path.relative starts with "../" only when the folder path is outside the current directory
const pathsInCurrentDirectory = paths.filter((folder) => !path.relative(currentPath, folder.path).startsWith(".."));
return pathsInCurrentDirectory;
};

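The old startsWith check over-matched because one path can be a string prefix of a sibling without containing it; path.relative only yields a leading ".." when the target really sits outside the directory. For example:

import path from "node:path";

// "/app" is a string prefix of "/apples", but "/apples" is not inside "/app"
console.log("/apples".startsWith("/app")); // true  (the old check: wrong)
console.log(path.relative("/app", "/apples")); // "../apples" (correctly outside)
console.log(path.relative("/app", "/app/config").startsWith("..")); // false (inside)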
View File

@ -142,7 +142,7 @@ export const fnTriggerWebhook = async ({
!isDisabled && picomatch.isMatch(secretPath, hookSecretPath, { strictSlashes: false })
);
if (!toBeTriggeredHooks.length) return;
logger.info("Secret webhook job started", { environment, secretPath, projectId });
logger.info({ environment, secretPath, projectId }, "Secret webhook job started");
const project = await projectDAL.findById(projectId);
const webhooksTriggered = await Promise.allSettled(
toBeTriggeredHooks.map((hook) =>
@ -195,5 +195,5 @@ export const fnTriggerWebhook = async ({
);
}
});
logger.info("Secret webhook job ended", { environment, secretPath, projectId });
logger.info({ environment, secretPath, projectId }, "Secret webhook job ended");
};

View File

@ -10,7 +10,7 @@ require (
github.com/fatih/semgroup v1.2.0
github.com/gitleaks/go-gitdiff v0.8.0
github.com/h2non/filetype v1.1.3
github.com/infisical/go-sdk v0.3.8
github.com/infisical/go-sdk v0.4.3
github.com/mattn/go-isatty v0.0.20
github.com/muesli/ansi v0.0.0-20221106050444-61f0cd9a192a
github.com/muesli/mango-cobra v1.2.0

View File

@ -265,8 +265,8 @@ github.com/ianlancetaylor/demangle v0.0.0-20181102032728-5e5cf60278f6/go.mod h1:
github.com/ianlancetaylor/demangle v0.0.0-20200824232613-28f6c0f3b639/go.mod h1:aSSvb/t6k1mPoxDqO4vJh6VOCGPwU4O0C2/Eqndh1Sc=
github.com/inconshreveable/mousetrap v1.0.1 h1:U3uMjPSQEBMNp1lFxmllqCPM6P5u/Xq7Pgzkat/bFNc=
github.com/inconshreveable/mousetrap v1.0.1/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
github.com/infisical/go-sdk v0.3.8 h1:0dGOhF3cwt0q5QzpnUs4lxwBiEza+DQYOyvEn7AfrM0=
github.com/infisical/go-sdk v0.3.8/go.mod h1:HHW7DgUqoolyQIUw/9HdpkZ3bDLwWyZ0HEtYiVaDKQw=
github.com/infisical/go-sdk v0.4.3 h1:O5ZJ2eCBAZDE9PIAfBPq9Utb2CgQKrhmj9R0oFTRu4U=
github.com/infisical/go-sdk v0.4.3/go.mod h1:6fWzAwTPIoKU49mQ2Oxu+aFnJu9n7k2JcNrZjzhHM2M=
github.com/jedib0t/go-pretty v4.3.0+incompatible h1:CGs8AVhEKg/n9YbUenWmNStRW2PHJzaeDodcfvRAbIo=
github.com/jedib0t/go-pretty v4.3.0+incompatible/go.mod h1:XemHduiw8R651AF9Pt4FwCTKeG3oo7hrHJAoznj9nag=
github.com/json-iterator/go v1.1.11/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=

View File

@ -205,6 +205,25 @@ func CallGetAllWorkSpacesUserBelongsTo(httpClient *resty.Client) (GetWorkSpacesR
return workSpacesResponse, nil
}
func CallGetProjectById(httpClient *resty.Client, id string) (Project, error) {
var projectResponse GetProjectByIdResponse
response, err := httpClient.
R().
SetResult(&projectResponse).
SetHeader("User-Agent", USER_AGENT).
Get(fmt.Sprintf("%v/v1/workspace/%s", config.INFISICAL_URL, id))
if err != nil {
return Project{}, err
}
if response.IsError() {
return Project{}, fmt.Errorf("CallGetProjectById: Unsuccessful response: [response=%v]", response)
}
return projectResponse.Project, nil
}
func CallIsAuthenticated(httpClient *resty.Client) bool {
var workSpacesResponse GetWorkSpacesResponse
response, err := httpClient.

View File

@ -128,6 +128,10 @@ type GetWorkSpacesResponse struct {
} `json:"workspaces"`
}
type GetProjectByIdResponse struct {
Project Project `json:"workspace"`
}
type GetOrganizationsResponse struct {
Organizations []struct {
ID string `json:"id"`
@ -163,6 +167,12 @@ type Secret struct {
PlainTextKey string `json:"plainTextKey"`
}
type Project struct {
ID string `json:"id"`
Name string `json:"name"`
Slug string `json:"slug"`
}
type RawSecret struct {
SecretKey string `json:"secretKey,omitempty"`
SecretValue string `json:"secretValue,omitempty"`

View File

@ -0,0 +1,571 @@
/*
Copyright (c) 2023 Infisical Inc.
*/
package cmd
import (
"context"
"fmt"
"github.com/Infisical/infisical-merge/packages/api"
"github.com/Infisical/infisical-merge/packages/config"
"github.com/Infisical/infisical-merge/packages/visualize"
// "github.com/Infisical/infisical-merge/packages/models"
"github.com/Infisical/infisical-merge/packages/util"
// "github.com/Infisical/infisical-merge/packages/visualize"
"github.com/go-resty/resty/v2"
"github.com/posthog/posthog-go"
"github.com/spf13/cobra"
infisicalSdk "github.com/infisical/go-sdk"
infisicalSdkModels "github.com/infisical/go-sdk/packages/models"
)
var dynamicSecretCmd = &cobra.Command{
Example: `infisical dynamic-secrets`,
Short: "Used to list dynamic secrets",
Use: "dynamic-secrets",
DisableFlagsInUseLine: true,
Args: cobra.NoArgs,
Run: getDynamicSecretList,
}
func getDynamicSecretList(cmd *cobra.Command, args []string) {
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
token, err := util.GetInfisicalToken(cmd)
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
projectId, err := cmd.Flags().GetString("projectId")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse path flag")
}
var infisicalToken string
httpClient := resty.New()
if projectId == "" {
workspaceFile, err := util.GetWorkSpaceFromFile()
if err != nil {
util.HandleError(err, "Unable to get local project details")
}
projectId = workspaceFile.WorkspaceId
}
if token != nil && (token.Type == util.SERVICE_TOKEN_IDENTIFIER || token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER) {
infisicalToken = token.Token
} else {
util.RequireLogin()
util.RequireLocalWorkspaceFile()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to authenticate")
}
if loggedInUserDetails.LoginExpired {
util.PrintErrorMessageAndExit("Your login session has expired, please run [infisical login] and try again")
}
infisicalToken = loggedInUserDetails.UserCredentials.JTWToken
}
httpClient.SetAuthToken(infisicalToken)
infisicalClient := infisicalSdk.NewInfisicalClient(context.Background(), infisicalSdk.Config{
SiteUrl: config.INFISICAL_URL,
UserAgent: api.USER_AGENT,
AutoTokenRefresh: false,
})
infisicalClient.Auth().SetAccessToken(infisicalToken)
projectDetails, err := api.CallGetProjectById(httpClient, projectId)
if err != nil {
util.HandleError(err, "To fetch project details")
}
dynamicSecretRootCredentials, err := infisicalClient.DynamicSecrets().List(infisicalSdk.ListDynamicSecretsRootCredentialsOptions{
ProjectSlug: projectDetails.Slug,
SecretPath: secretsPath,
EnvironmentSlug: environmentName,
})
if err != nil {
util.HandleError(err, "To fetch dynamic secret root credentials details")
}
visualize.PrintAllDynamicRootCredentials(dynamicSecretRootCredentials)
Telemetry.CaptureEvent("cli-command:dynamic-secrets", posthog.NewProperties().Set("count", len(dynamicSecretRootCredentials)).Set("version", util.CLI_VERSION))
}
var dynamicSecretLeaseCmd = &cobra.Command{
Example: `lease`,
Short: "Manage leases for dynamic secrets",
Use: "lease",
DisableFlagsInUseLine: true,
}
var dynamicSecretLeaseCreateCmd = &cobra.Command{
Example: `lease create <dynamic secret name>`,
Short: "Used to lease a dynamic secret by name",
Use: "create [dynamic-secret]",
DisableFlagsInUseLine: true,
Args: cobra.ExactArgs(1),
Run: createDynamicSecretLeaseByName,
}
func createDynamicSecretLeaseByName(cmd *cobra.Command, args []string) {
dynamicSecretRootCredentialName := args[0]
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
token, err := util.GetInfisicalToken(cmd)
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
projectId, err := cmd.Flags().GetString("projectId")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
ttl, err := cmd.Flags().GetString("ttl")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse path flag")
}
plainOutput, err := cmd.Flags().GetBool("plain")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
var infisicalToken string
httpClient := resty.New()
if projectId == "" {
workspaceFile, err := util.GetWorkSpaceFromFile()
if err != nil {
util.HandleError(err, "Unable to get local project details")
}
projectId = workspaceFile.WorkspaceId
}
if token != nil && (token.Type == util.SERVICE_TOKEN_IDENTIFIER || token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER) {
infisicalToken = token.Token
} else {
util.RequireLogin()
util.RequireLocalWorkspaceFile()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to authenticate")
}
if loggedInUserDetails.LoginExpired {
util.PrintErrorMessageAndExit("Your login session has expired, please run [infisical login] and try again")
}
infisicalToken = loggedInUserDetails.UserCredentials.JTWToken
}
httpClient.SetAuthToken(infisicalToken)
infisicalClient := infisicalSdk.NewInfisicalClient(context.Background(), infisicalSdk.Config{
SiteUrl: config.INFISICAL_URL,
UserAgent: api.USER_AGENT,
AutoTokenRefresh: false,
})
infisicalClient.Auth().SetAccessToken(infisicalToken)
projectDetails, err := api.CallGetProjectById(httpClient, projectId)
if err != nil {
util.HandleError(err, "To fetch project details")
}
dynamicSecretRootCredential, err := infisicalClient.DynamicSecrets().GetByName(infisicalSdk.GetDynamicSecretRootCredentialByNameOptions{
DynamicSecretName: dynamicSecretRootCredentialName,
ProjectSlug: projectDetails.Slug,
SecretPath: secretsPath,
EnvironmentSlug: environmentName,
})
if err != nil {
util.HandleError(err, "To fetch dynamic secret root credentials details")
}
leaseCredentials, _, leaseDetails, err := infisicalClient.DynamicSecrets().Leases().Create(infisicalSdk.CreateDynamicSecretLeaseOptions{
DynamicSecretName: dynamicSecretRootCredential.Name,
ProjectSlug: projectDetails.Slug,
TTL: ttl,
SecretPath: secretsPath,
EnvironmentSlug: environmentName,
})
if err != nil {
util.HandleError(err, "To lease dynamic secret")
}
if plainOutput {
for key, value := range leaseCredentials {
if cred, ok := value.(string); ok {
fmt.Printf("%s=%s\n", key, cred)
}
}
} else {
fmt.Println("Dynamic Secret Leasing")
fmt.Printf("Name: %s\n", dynamicSecretRootCredential.Name)
fmt.Printf("Provider: %s\n", dynamicSecretRootCredential.Type)
fmt.Printf("Lease ID: %s\n", leaseDetails.Id)
fmt.Printf("Expire At: %s\n", leaseDetails.ExpireAt.Local().Format("02-Jan-2006 03:04:05 PM"))
visualize.PrintAllDynamicSecretLeaseCredentials(leaseCredentials)
}
Telemetry.CaptureEvent("cli-command:dynamic-secrets lease", posthog.NewProperties().Set("type", dynamicSecretRootCredential.Type).Set("version", util.CLI_VERSION))
}
var dynamicSecretLeaseRenewCmd = &cobra.Command{
Example: `lease renew <lease-id>`,
Short: "Used to renew a dynamic secret lease by ID",
Use: "renew [lease-id]",
DisableFlagsInUseLine: true,
Args: cobra.ExactArgs(1),
Run: renewDynamicSecretLeaseByName,
}
func renewDynamicSecretLeaseByName(cmd *cobra.Command, args []string) {
dynamicSecretLeaseId := args[0]
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
token, err := util.GetInfisicalToken(cmd)
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
projectId, err := cmd.Flags().GetString("projectId")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
ttl, err := cmd.Flags().GetString("ttl")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse path flag")
}
var infisicalToken string
httpClient := resty.New()
if projectId == "" {
workspaceFile, err := util.GetWorkSpaceFromFile()
if err != nil {
util.HandleError(err, "Unable to get local project details")
}
projectId = workspaceFile.WorkspaceId
}
if token != nil && (token.Type == util.SERVICE_TOKEN_IDENTIFIER || token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER) {
infisicalToken = token.Token
} else {
util.RequireLogin()
util.RequireLocalWorkspaceFile()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to authenticate")
}
if loggedInUserDetails.LoginExpired {
util.PrintErrorMessageAndExit("Your login session has expired, please run [infisical login] and try again")
}
infisicalToken = loggedInUserDetails.UserCredentials.JTWToken
}
httpClient.SetAuthToken(infisicalToken)
infisicalClient := infisicalSdk.NewInfisicalClient(context.Background(), infisicalSdk.Config{
SiteUrl: config.INFISICAL_URL,
UserAgent: api.USER_AGENT,
AutoTokenRefresh: false,
})
infisicalClient.Auth().SetAccessToken(infisicalToken)
projectDetails, err := api.CallGetProjectById(httpClient, projectId)
if err != nil {
util.HandleError(err, "Unable to fetch project details")
}
leaseDetails, err := infisicalClient.DynamicSecrets().Leases().RenewById(infisicalSdk.RenewDynamicSecretLeaseOptions{
ProjectSlug: projectDetails.Slug,
TTL: ttl,
SecretPath: secretsPath,
EnvironmentSlug: environmentName,
LeaseId: dynamicSecretLeaseId,
})
if err != nil {
util.HandleError(err, "To renew dynamic secret lease")
}
fmt.Println("Successfully renewed dynamic secret lease")
visualize.PrintAllDynamicSecretLeases([]infisicalSdkModels.DynamicSecretLease{leaseDetails})
Telemetry.CaptureEvent("cli-command:dynamic-secrets lease renew", posthog.NewProperties().Set("version", util.CLI_VERSION))
}
var dynamicSecretLeaseRevokeCmd = &cobra.Command{
Example: `lease delete <lease-id>`,
Short: "Used to delete a dynamic secret lease by ID",
Use: "delete [lease-id]",
DisableFlagsInUseLine: true,
Args: cobra.ExactArgs(1),
Run: revokeDynamicSecretLeaseByName,
}
func revokeDynamicSecretLeaseByName(cmd *cobra.Command, args []string) {
dynamicSecretLeaseId := args[0]
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
token, err := util.GetInfisicalToken(cmd)
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
projectId, err := cmd.Flags().GetString("projectId")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse path flag")
}
var infisicalToken string
httpClient := resty.New()
if projectId == "" {
workspaceFile, err := util.GetWorkSpaceFromFile()
if err != nil {
util.HandleError(err, "Unable to get local project details")
}
projectId = workspaceFile.WorkspaceId
}
if token != nil && (token.Type == util.SERVICE_TOKEN_IDENTIFIER || token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER) {
infisicalToken = token.Token
} else {
util.RequireLogin()
util.RequireLocalWorkspaceFile()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to authenticate")
}
if loggedInUserDetails.LoginExpired {
util.PrintErrorMessageAndExit("Your login session has expired, please run [infisical login] and try again")
}
infisicalToken = loggedInUserDetails.UserCredentials.JTWToken
}
httpClient.SetAuthToken(infisicalToken)
infisicalClient := infisicalSdk.NewInfisicalClient(context.Background(), infisicalSdk.Config{
SiteUrl: config.INFISICAL_URL,
UserAgent: api.USER_AGENT,
AutoTokenRefresh: false,
})
infisicalClient.Auth().SetAccessToken(infisicalToken)
projectDetails, err := api.CallGetProjectById(httpClient, projectId)
if err != nil {
util.HandleError(err, "Unable to fetch project details")
}
leaseDetails, err := infisicalClient.DynamicSecrets().Leases().DeleteById(infisicalSdk.DeleteDynamicSecretLeaseOptions{
ProjectSlug: projectDetails.Slug,
SecretPath: secretsPath,
EnvironmentSlug: environmentName,
LeaseId: dynamicSecretLeaseId,
})
if err != nil {
util.HandleError(err, "To revoke dynamic secret lease")
}
fmt.Println("Successfully revoked dynamic secret lease")
visualize.PrintAllDynamicSecretLeases([]infisicalSdkModels.DynamicSecretLease{leaseDetails})
Telemetry.CaptureEvent("cli-command:dynamic-secrets lease revoke", posthog.NewProperties().Set("version", util.CLI_VERSION))
}
var dynamicSecretLeaseListCmd = &cobra.Command{
Example: `lease list <dynamic secret name>`,
Short: "Used to list leases of a dynamic secret by name",
Use: "list [dynamic-secret]",
DisableFlagsInUseLine: true,
Args: cobra.ExactArgs(1),
Run: listDynamicSecretLeaseByName,
}
func listDynamicSecretLeaseByName(cmd *cobra.Command, args []string) {
dynamicSecretRootCredentialName := args[0]
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
token, err := util.GetInfisicalToken(cmd)
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
projectId, err := cmd.Flags().GetString("projectId")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse path flag")
}
var infisicalToken string
httpClient := resty.New()
if projectId == "" {
workspaceFile, err := util.GetWorkSpaceFromFile()
if err != nil {
util.HandleError(err, "Unable to get local project details")
}
projectId = workspaceFile.WorkspaceId
}
if token != nil && (token.Type == util.SERVICE_TOKEN_IDENTIFIER || token.Type == util.UNIVERSAL_AUTH_TOKEN_IDENTIFIER) {
infisicalToken = token.Token
} else {
util.RequireLogin()
util.RequireLocalWorkspaceFile()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to authenticate")
}
if loggedInUserDetails.LoginExpired {
util.PrintErrorMessageAndExit("Your login session has expired, please run [infisical login] and try again")
}
infisicalToken = loggedInUserDetails.UserCredentials.JTWToken
}
httpClient.SetAuthToken(infisicalToken)
infisicalClient := infisicalSdk.NewInfisicalClient(context.Background(), infisicalSdk.Config{
SiteUrl: config.INFISICAL_URL,
UserAgent: api.USER_AGENT,
AutoTokenRefresh: false,
})
infisicalClient.Auth().SetAccessToken(infisicalToken)
projectDetails, err := api.CallGetProjectById(httpClient, projectId)
if err != nil {
util.HandleError(err, "To fetch project details")
}
dynamicSecretLeases, err := infisicalClient.DynamicSecrets().Leases().List(infisicalSdk.ListDynamicSecretLeasesOptions{
DynamicSecretName: dynamicSecretRootCredentialName,
ProjectSlug: projectDetails.Slug,
SecretPath: secretsPath,
EnvironmentSlug: environmentName,
})
if err != nil {
util.HandleError(err, "To fetch dynamic secret leases list")
}
visualize.PrintAllDynamicSecretLeases(dynamicSecretLeases)
Telemetry.CaptureEvent("cli-command:dynamic-secrets lease list", posthog.NewProperties().Set("lease-count", len(dynamicSecretLeases)).Set("version", util.CLI_VERSION))
}
func init() {
dynamicSecretLeaseCreateCmd.Flags().StringP("path", "p", "/", "The path from where dynamic secret should be leased from")
dynamicSecretLeaseCreateCmd.Flags().String("token", "", "Create dynamic secret leases using machine identity access token")
dynamicSecretLeaseCreateCmd.Flags().String("projectId", "", "Manually set the projectId to fetch leased from when using machine identity based auth")
dynamicSecretLeaseCreateCmd.Flags().String("ttl", "", "The lease lifetime TTL. If not provided the default TTL of dynamic secret will be used.")
dynamicSecretLeaseCreateCmd.Flags().Bool("plain", false, "Print leased credentials without formatting, one per line")
dynamicSecretLeaseCmd.AddCommand(dynamicSecretLeaseCreateCmd)
dynamicSecretLeaseListCmd.Flags().StringP("path", "p", "/", "The path from which the dynamic secret should be leased")
dynamicSecretLeaseListCmd.Flags().String("token", "", "Fetch dynamic secret leases using machine identity access token")
dynamicSecretLeaseListCmd.Flags().String("projectId", "", "Manually set the projectId to fetch leases from when using machine identity-based auth")
dynamicSecretLeaseCmd.AddCommand(dynamicSecretLeaseListCmd)
dynamicSecretLeaseRenewCmd.Flags().StringP("path", "p", "/", "The path from which the dynamic secret should be leased")
dynamicSecretLeaseRenewCmd.Flags().String("token", "", "Renew dynamic secret leases using machine identity access token")
dynamicSecretLeaseRenewCmd.Flags().String("projectId", "", "Manually set the projectId to renew leases from when using machine identity-based auth")
dynamicSecretLeaseRenewCmd.Flags().String("ttl", "", "The lease lifetime TTL. If not provided, the default TTL of the dynamic secret will be used.")
dynamicSecretLeaseCmd.AddCommand(dynamicSecretLeaseRenewCmd)
dynamicSecretLeaseRevokeCmd.Flags().StringP("path", "p", "/", "The path from which the dynamic secret should be leased")
dynamicSecretLeaseRevokeCmd.Flags().String("token", "", "Delete dynamic secret leases using machine identity access token")
dynamicSecretLeaseRevokeCmd.Flags().String("projectId", "", "Manually set the projectId to revoke leases from when using machine identity-based auth")
dynamicSecretLeaseCmd.AddCommand(dynamicSecretLeaseRevokeCmd)
dynamicSecretCmd.AddCommand(dynamicSecretLeaseCmd)
dynamicSecretCmd.Flags().String("token", "", "Fetch secrets using service token or machine identity access token")
dynamicSecretCmd.Flags().String("projectId", "", "Manually set the projectId to fetch dynamic-secret when using machine identity based auth")
dynamicSecretCmd.PersistentFlags().String("env", "dev", "Used to select the environment name on which actions should be taken on")
dynamicSecretCmd.Flags().String("path", "/", "get dynamic secret within a folder path")
rootCmd.AddCommand(dynamicSecretCmd)
}

View File

@ -111,7 +111,7 @@ var exportCmd = &cobra.Command{
accessToken = token.Token
} else {
log.Debug().Msg("GetAllEnvironmentVariables: Trying to fetch secrets using logged in details")
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err)
}

View File

@ -41,7 +41,7 @@ var initCmd = &cobra.Command{
}
}
userCreds, err := util.GetCurrentLoggedInUserDetails()
userCreds, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to get your login details")
}

View File

@ -154,6 +154,8 @@ var loginCmd = &cobra.Command{
DisableFlagsInUseLine: true,
Run: func(cmd *cobra.Command, args []string) {
presetDomain := config.INFISICAL_URL
clearSelfHostedDomains, err := cmd.Flags().GetBool("clear-domains")
if err != nil {
util.HandleError(err)
@ -198,7 +200,7 @@ var loginCmd = &cobra.Command{
// standalone user auth
if loginMethod == "user" {
currentLoggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
currentLoggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
// if the key can't be found or there is an error getting current credentials from key ring, allow them to override
if err != nil && (strings.Contains(err.Error(), "we couldn't find your logged in details")) {
log.Debug().Err(err)
@ -216,11 +218,19 @@ var loginCmd = &cobra.Command{
return
}
}
usePresetDomain, err := usePresetDomain(presetDomain)
if err != nil {
util.HandleError(err)
}
//override domain
domainQuery := true
if config.INFISICAL_URL_MANUAL_OVERRIDE != "" &&
config.INFISICAL_URL_MANUAL_OVERRIDE != fmt.Sprintf("%s/api", util.INFISICAL_DEFAULT_EU_URL) &&
config.INFISICAL_URL_MANUAL_OVERRIDE != fmt.Sprintf("%s/api", util.INFISICAL_DEFAULT_US_URL) {
config.INFISICAL_URL_MANUAL_OVERRIDE != fmt.Sprintf("%s/api", util.INFISICAL_DEFAULT_US_URL) &&
!usePresetDomain {
overrideDomain, err := DomainOverridePrompt()
if err != nil {
util.HandleError(err)
@ -228,7 +238,7 @@ var loginCmd = &cobra.Command{
//if not override set INFISICAL_URL to exported var
//set domainQuery to false
if !overrideDomain {
if !overrideDomain && !usePresetDomain {
domainQuery = false
config.INFISICAL_URL = util.AppendAPIEndpoint(config.INFISICAL_URL_MANUAL_OVERRIDE)
config.INFISICAL_LOGIN_URL = fmt.Sprintf("%s/login", strings.TrimSuffix(config.INFISICAL_URL, "/api"))
@ -237,7 +247,7 @@ var loginCmd = &cobra.Command{
}
//prompt user to select domain between Infisical cloud and self-hosting
if domainQuery {
if domainQuery && !usePresetDomain {
err = askForDomain()
if err != nil {
util.HandleError(err, "Unable to parse domain url")
@ -526,6 +536,45 @@ func DomainOverridePrompt() (bool, error) {
return selectedOption == OVERRIDE, err
}
func usePresetDomain(presetDomain string) (bool, error) {
infisicalConfig, err := util.GetConfigFile()
if err != nil {
return false, fmt.Errorf("askForDomain: unable to get config file because [err=%s]", err)
}
preconfiguredUrl := strings.TrimSuffix(presetDomain, "/api")
if preconfiguredUrl != "" && preconfiguredUrl != util.INFISICAL_DEFAULT_US_URL && preconfiguredUrl != util.INFISICAL_DEFAULT_EU_URL {
parsedDomain := strings.TrimSuffix(strings.Trim(preconfiguredUrl, "/"), "/api")
_, err := url.ParseRequestURI(parsedDomain)
if err != nil {
return false, fmt.Errorf("Invalid domain URL: '%s'", parsedDomain)
}
config.INFISICAL_URL = fmt.Sprintf("%s/api", parsedDomain)
config.INFISICAL_LOGIN_URL = fmt.Sprintf("%s/login", parsedDomain)
if !slices.Contains(infisicalConfig.Domains, parsedDomain) {
infisicalConfig.Domains = append(infisicalConfig.Domains, parsedDomain)
err = util.WriteConfigFile(&infisicalConfig)
if err != nil {
return false, fmt.Errorf("askForDomain: unable to write domains to config file because [err=%s]", err)
}
}
green := color.New(color.FgGreen)
boldGreen := green.Add(color.Bold)
time.Sleep(time.Second * 1)
boldGreen.Printf("[INFO] Using domain '%s' from domain flag or INFISICAL_API_URL environment variable\n", parsedDomain)
return true, nil
}
return false, nil
}
func askForDomain() error {
// query user to choose between Infisical cloud or self-hosting

View File

@ -54,7 +54,7 @@ func init() {
util.CheckForUpdate()
}
loggedInDetails, err := util.GetCurrentLoggedInUserDetails()
loggedInDetails, err := util.GetCurrentLoggedInUserDetails(false)
if !silent && err == nil && loggedInDetails.IsUserLoggedIn && !loggedInDetails.LoginExpired {
token, err := util.GetInfisicalToken(cmd)

View File

@ -194,7 +194,7 @@ var secretsSetCmd = &cobra.Command{
projectId = workspaceFile.WorkspaceId
}
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "unable to authenticate [err=%v]")
}
@ -278,7 +278,7 @@ var secretsDeleteCmd = &cobra.Command{
util.RequireLogin()
util.RequireLocalWorkspaceFile()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to authenticate")
}

View File

@ -41,7 +41,7 @@ var tokensCreateCmd = &cobra.Command{
},
Run: func(cmd *cobra.Command, args []string) {
// get plain text workspace key
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
if err != nil {
util.HandleError(err, "Unable to retrieve your logged in your details. Please login in then try again")

View File

@ -55,7 +55,7 @@ func GetUserCredsFromKeyRing(userEmail string) (credentials models.UserCredentia
return userCredentials, err
}
func GetCurrentLoggedInUserDetails() (LoggedInUserDetails, error) {
func GetCurrentLoggedInUserDetails(setConfigVariables bool) (LoggedInUserDetails, error) {
if ConfigFileExists() {
configFile, err := GetConfigFile()
if err != nil {
@ -75,18 +75,20 @@ func GetCurrentLoggedInUserDetails() (LoggedInUserDetails, error) {
}
}
if setConfigVariables {
config.INFISICAL_URL_MANUAL_OVERRIDE = config.INFISICAL_URL
//configFile.LoggedInUserDomain
//if not empty set as infisical url
if configFile.LoggedInUserDomain != "" {
config.INFISICAL_URL = AppendAPIEndpoint(configFile.LoggedInUserDomain)
}
}
// check to see if the JWT is still valid
httpClient := resty.New().
SetAuthToken(userCreds.JTWToken).
SetHeader("Accept", "application/json")
config.INFISICAL_URL_MANUAL_OVERRIDE = config.INFISICAL_URL
//configFile.LoggedInUserDomain
//if not empty set as infisical url
if configFile.LoggedInUserDomain != "" {
config.INFISICAL_URL = AppendAPIEndpoint(configFile.LoggedInUserDomain)
}
isAuthenticated := api.CallIsAuthenticated(httpClient)
// TODO: add refresh token
// if !isAuthenticated {

View File

@ -20,7 +20,7 @@ func GetAllFolders(params models.GetAllFoldersParameters) ([]models.SingleFolder
log.Debug().Msg("GetAllFolders: Trying to fetch folders using logged in details")
loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
if err != nil {
return nil, err
}
@ -177,7 +177,7 @@ func CreateFolder(params models.CreateFolderParameters) (models.SingleFolder, er
if params.InfisicalToken == "" {
RequireLogin()
RequireLocalWorkspaceFile()
loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
if err != nil {
return models.SingleFolder{}, err
@ -224,7 +224,7 @@ func DeleteFolder(params models.DeleteFolderParameters) ([]models.SingleFolder,
RequireLogin()
RequireLocalWorkspaceFile()
loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
if err != nil {
return nil, err

View File

@ -246,7 +246,7 @@ func GetAllEnvironmentVariables(params models.GetAllSecretsParameters, projectCo
log.Debug().Msg("GetAllEnvironmentVariables: Trying to fetch secrets using logged in details")
loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
isConnected := ValidateInfisicalAPIConnection()
if isConnected {

View File

@ -0,0 +1,39 @@
package visualize
import infisicalModels "github.com/infisical/go-sdk/packages/models"
func PrintAllDynamicSecretLeaseCredentials(leaseCredentials map[string]any) {
rows := [][]string{}
for key, value := range leaseCredentials {
if cred, ok := value.(string); ok {
rows = append(rows, []string{key, cred})
}
}
headers := []string{"Key", "Value"}
GenericTable(headers, rows)
}
func PrintAllDynamicRootCredentials(dynamicRootCredentials []infisicalModels.DynamicSecret) {
rows := [][]string{}
for _, el := range dynamicRootCredentials {
rows = append(rows, []string{el.Name, el.Type, el.DefaultTTL, el.MaxTTL})
}
headers := []string{"Name", "Provider", "Default TTL", "Max TTL"}
GenericTable(headers, rows)
}
func PrintAllDynamicSecretLeases(dynamicSecretLeases []infisicalModels.DynamicSecretLease) {
rows := [][]string{}
const timeformat = "02-Jan-2006 03:04:05 PM"
for _, el := range dynamicSecretLeases {
rows = append(rows, []string{el.Id, el.ExpireAt.Local().Format(timeformat), el.CreatedAt.Local().Format(timeformat)})
}
headers := []string{"ID", "Expire At", "Created At"}
GenericTable(headers, rows)
}

View File

@ -94,6 +94,33 @@ func getLongestValues(rows [][3]string) (longestSecretName, longestSecretType in
return
}
func GenericTable(headers []string, rows [][]string) {
t := table.NewWriter()
t.SetOutputMirror(os.Stdout)
t.SetStyle(table.StyleLight)
// t.SetTitle(tableOptions.Title)
t.Style().Options.DrawBorder = true
t.Style().Options.SeparateHeader = true
t.Style().Options.SeparateColumns = true
tableHeaders := table.Row{}
for _, header := range headers {
tableHeaders = append(tableHeaders, header)
}
t.AppendHeader(tableHeaders)
for _, row := range rows {
tableRow := table.Row{}
for _, val := range row {
tableRow = append(tableRow, val)
}
t.AppendRow(tableRow)
}
t.Render()
}
// stringWidth returns the width of a string.
// ANSI escape sequences are ignored and double-width characters are handled correctly.
func stringWidth(str string) (width int) {

View File

@ -1,4 +1,4 @@
error: CallGetRawSecretsV3: Unsuccessful response [GET https://app.infisical.com/api/v3/secrets/raw?environment=invalid-env&expandSecretReferences=true&include_imports=true&recursive=true&secretPath=%2F&workspaceId=bef697d4-849b-4a75-b284-0922f87f8ba2] [status-code=404] [response={"statusCode":404,"message":"Environment with slug 'invalid-env' in project with ID bef697d4-849b-4a75-b284-0922f87f8ba2 not found","error":"NotFound"}]
error: CallGetRawSecretsV3: Unsuccessful response [GET https://app.infisical.com/api/v3/secrets/raw?environment=invalid-env&expandSecretReferences=true&include_imports=true&recursive=true&secretPath=%2F&workspaceId=bef697d4-849b-4a75-b284-0922f87f8ba2] [status-code=404] [response={"error":"NotFound","message":"Environment with slug 'invalid-env' in project with ID bef697d4-849b-4a75-b284-0922f87f8ba2 not found","statusCode":404}]
If this issue continues, get support at https://infisical.com/slack

View File

@ -1,4 +1,4 @@
Warning: Unable to fetch the latest secret(s) due to connection error, serving secrets from last successful fetch. For more info, run with --debug
Warning: Unable to fetch the latest secret(s) due to connection error, serving secrets from last successful fetch. For more info, run with --debug
┌───────────────┬──────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├───────────────┼──────────────┼─────────────┤

View File

@ -1,6 +1,7 @@
package tests
import (
"encoding/json"
"fmt"
"log"
"os"
@ -41,11 +42,12 @@ var creds = Credentials{
func ExecuteCliCommand(command string, args ...string) (string, error) {
cmd := exec.Command(command, args...)
output, err := cmd.CombinedOutput()
if err != nil {
fmt.Println(fmt.Sprint(err) + ": " + string(output))
return strings.TrimSpace(string(output)), err
fmt.Println(fmt.Sprint(err) + ": " + FilterRequestID(strings.TrimSpace(string(output))))
return FilterRequestID(strings.TrimSpace(string(output))), err
}
return strings.TrimSpace(string(output)), nil
return FilterRequestID(strings.TrimSpace(string(output))), nil
}
func SetupCli() {
@ -67,3 +69,34 @@ func SetupCli() {
}
}
func FilterRequestID(input string) string {
    // Find the JSON part of the error message
    start := strings.Index(input, "{")
    end := strings.LastIndex(input, "}") + 1

    // Note: end is LastIndex + 1, so a missing "}" yields 0, never -1;
    // guard on 0 (and on end <= start) rather than -1.
    if start == -1 || end == 0 || end <= start {
        return input
    }

    jsonPart := input[:start] // Pre-JSON content

    // Parse the JSON object
    var errorObj map[string]interface{}
    if err := json.Unmarshal([]byte(input[start:end]), &errorObj); err != nil {
        return input
    }

    // Remove requestId field
    delete(errorObj, "requestId")
    delete(errorObj, "reqId")

    // Convert back to JSON
    filtered, err := json.Marshal(errorObj)
    if err != nil {
        return input
    }

    // Reconstruct the full string
    return jsonPart + string(filtered) + input[end:]
}
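To pin down the helper's behavior, a small hypothetical test (not part of the diff) could assert that the volatile requestId field is stripped. Note that encoding/json marshals map keys in sorted order, which is also why the snapshot above now lists error/message/statusCode alphabetically:

package tests

import "testing"

// Hypothetical test, not part of the diff: FilterRequestID should drop the
// volatile requestId field so snapshot comparisons stay deterministic.
func TestFilterRequestID_StripsRequestID(t *testing.T) {
    in := `[response={"error":"NotFound","requestId":"abc-123","statusCode":404}]`
    // After requestId is deleted, the map is re-marshaled with sorted keys.
    want := `[response={"error":"NotFound","statusCode":404}]`
    if got := FilterRequestID(in); got != want {
        t.Fatalf("FilterRequestID(%q) = %q, want %q", in, got, want)
    }
}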

View File

@ -3,7 +3,6 @@ package tests
import (
"testing"
"github.com/Infisical/infisical-merge/packages/util"
"github.com/bradleyjkemp/cupaloy/v2"
)
@ -96,28 +95,29 @@ func TestUserAuth_SecretsGetAll(t *testing.T) {
// testUserAuth_SecretsGetAllWithoutConnection(t)
}
func testUserAuth_SecretsGetAllWithoutConnection(t *testing.T) {
originalConfigFile, err := util.GetConfigFile()
if err != nil {
t.Fatalf("error getting config file")
}
newConfigFile := originalConfigFile
// disabled for the time being
// func testUserAuth_SecretsGetAllWithoutConnection(t *testing.T) {
// originalConfigFile, err := util.GetConfigFile()
// if err != nil {
// t.Fatalf("error getting config file")
// }
// newConfigFile := originalConfigFile
// set it to a URL that will always be unreachable
newConfigFile.LoggedInUserDomain = "http://localhost:4999"
util.WriteConfigFile(&newConfigFile)
// // set it to a URL that will always be unreachable
// newConfigFile.LoggedInUserDomain = "http://localhost:4999"
// util.WriteConfigFile(&newConfigFile)
// restore config file
defer util.WriteConfigFile(&originalConfigFile)
// // restore config file
// defer util.WriteConfigFile(&originalConfigFile)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--include-imports=false", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--include-imports=false", "--silent")
// if err != nil {
// t.Fatalf("error running CLI command: %v", err)
// }
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
// // Use cupaloy to snapshot test the output
// err = cupaloy.Snapshot(output)
// if err != nil {
// t.Fatalf("snapshot failed: %v", err)
// }
// }

View File

@ -12,6 +12,18 @@ To request time off, just submit a request in Rippling and let Maidul know at le
Since Infisical's team is globally distributed, it is hard for us to keep track of all the various national holidays across many different countries. Whether you'd like to celebrate Christmas or National Brisket Day (which, by the way, is on May 28th), you are welcome to take PTO on those days; just let Maidul know at least a week ahead so that we can adjust our planning.
## Winter Break
## Winter break
Every year, the Infisical team goes on a company-wide vacation during the winter holidays. This year, the winter break period starts on December 21st, 2024 and ends on January 5th, 2025. You should expect to do no scheduled work during this period, but we will have a rotation process for [high and urgent service disruptions](https://infisical.com/sla).
Every year, the Infisical team goes on a company-wide vacation during the winter holidays. This year, the winter break period starts on December 21st, 2024 and ends on January 5th, 2025. You should expect to do no scheduled work during this period, but we will have a rotation process for [high and urgent service disruptions](https://infisical.com/sla).
## Parental leave
At Infisical, we recognize that parental leave is a special and important time, significantly different from a typical vacation. We're proud to offer parental leave to everyone, regardless of gender, and whether you've become a parent through childbirth or adoption.
If you have been with Infisical for over a year by the time of your child's birth or adoption, you are eligible for up to 12 weeks of paid parental leave. This leave will be provided in one continuous block to allow you uninterrupted time with your family. If you have been with Infisical for less than a year, we will follow the parental leave provisions required by your local jurisdiction.
While we trust your judgment, parental leave is intended to be a distinct benefit and is not designed to be combined with our unlimited PTO policy. To ensure fairness and balance, we generally discourage combining parental leave with an extended vacation.
When you're ready, please notify Maidul about your plans for parental leave, ideally at least four months in advance. This allows us to support you fully and arrange any necessary logistics, including salary adjustments and statutory paperwork.
We're here to support you as you embark on this exciting new chapter in your life!

Some files were not shown because too many files have changed in this diff.