Compare commits

..

167 Commits

Author SHA1 Message Date
Maidul Islam
f24067542f test migration rename 2024-04-26 13:10:59 -04:00
Maidul Islam
a7f5a61f37 Merge pull request #1737 from akhilmhdh/chore/gh-action-migration-rename
feat: github action to rename new migration file to latest timestamp
2024-04-26 13:07:54 -04:00
Akhil Mohan
b5fd7698d8 chore: updated rename migration action to run on PR merge 2024-04-26 22:27:32 +05:30
Akhil Mohan
61c3102573 Merge pull request #1738 from Infisical/sheen/auto-complete-for-path-select
Feature: added auto-complete for secret path inputs
2024-04-26 22:05:21 +05:30
Sheen Capadngan
d6a5bf9d50 adjustment: simplified onchange propagation 2024-04-27 00:32:34 +08:00
Tuan Dang
70f63b3190 Default metadata to empty object if it does not exist on integration for aws integration sync ops 2024-04-26 09:06:01 -07:00
Sheen Capadngan
2b0670a409 fix: addressed suggestion issue in copy secrets from board 2024-04-26 19:20:05 +08:00
Sheen Capadngan
cc25639157 fix: resolved loop traversal of suggestions 2024-04-26 18:59:39 +08:00
Sheen Capadngan
5ff30aed10 fix: addressed suggestion incomplete issue 2024-04-26 18:46:16 +08:00
Sheen Capadngan
656ec4bf16 feature: migrated path inputs to new component 2024-04-26 18:18:23 +08:00
Akhil Mohan
0bac9a8e02 feat: github action to rename new migration file to latest timestamp in utc 2024-04-26 15:24:27 +05:30
Sheen Capadngan
5142e6e5f6 feature: created secret path input component with autocomplete support 2024-04-26 16:07:27 +08:00
Maidul Islam
49c735caf9 Merge pull request #1572 from Salman2301/feat-secret-input-autocomplete
Secret input auto complete
2024-04-25 15:48:25 -04:00
Maidul Islam
b4de2ea85d Merge pull request #1735 from akhilmhdh/import/recursive
feat(server): recursive imported secret fetch for api
2024-04-25 15:44:37 -04:00
Maidul Islam
8b8baf1ef2 nit: variable rename 2024-04-25 15:40:55 -04:00
Sheen Capadngan
2a89b872c5 adjustment: finalized text width 2024-04-26 03:33:58 +08:00
Akhil Mohan
2d2d9a5987 feat(server): added cyclic detector 2024-04-26 01:00:42 +05:30
Sheen Capadngan
a20a60850b adjustment: finalized popover positioning 2024-04-26 03:25:34 +08:00
Akhil Mohan
35e38c23dd feat(server): recursive imported secret fetch for api 2024-04-26 00:26:18 +05:30
Sheen Capadngan
b79e61c86b Merge remote-tracking branch 'origin/main' into feat-secret-input-autocomplete 2024-04-26 01:40:21 +08:00
Sheen Capadngan
e555d3129d fix: resolved invalid handling of undefined vals 2024-04-26 01:22:09 +08:00
Sheen Capadngan
a41883137c fix: addressed type check issue 2024-04-26 01:08:31 +08:00
Maidul Islam
c414bf6c39 Merge pull request #1734 from Infisical/daniel/fix-saml-invite-bug
Fix: SAML organization invite bug
2024-04-25 13:07:01 -04:00
Sheen Capadngan
9b782a9da6 adjustment: removed unused component 2024-04-26 00:56:45 +08:00
Sheen Capadngan
497c0cf63d adjustment: final ui/ux adjustments 2024-04-26 00:52:37 +08:00
Daniel Hougaard
93761f37ea Update saml-config-service.ts 2024-04-25 18:13:42 +02:00
Daniel Hougaard
68e530e5d2 Fix: On complete signup, check for saml auth and present org ID and handle membership status 2024-04-25 18:12:08 +02:00
Sheen Capadngan
20b1cdf909 adjustment: added click handler to suggestions and finalized env icon 2024-04-25 21:43:25 +08:00
Sheen Capadngan
4bae65cc55 adjustment: ux finalization 2024-04-25 12:44:01 +08:00
Maidul Islam
6da5f12855 Merge pull request #1733 from Infisical/test-ldap-connection
Add Test Connection capability and User Search Filter for LDAP configuration
2024-04-25 00:42:15 -04:00
Tuan Dang
7a242c4976 Fix frontend type check issue 2024-04-24 21:32:36 -07:00
Tuan Dang
b01d381993 Refactor ldap filter validation 2024-04-24 21:19:07 -07:00
Tuan Dang
1ac18fcf0c Merge remote-tracking branch 'origin' into test-ldap-connection 2024-04-24 20:58:40 -07:00
Tuan Dang
8d5ef5f4d9 Add user search filter field for LDAP and validation for search filters 2024-04-24 20:58:16 -07:00
Maidul Islam
35b5253853 Update README.md 2024-04-24 19:56:05 -04:00
Tuan Dang
99d59a38d5 Add test connection btn for LDAP, update group search filter impl, update group search filter examples in docs 2024-04-24 16:50:23 -07:00
Sheen Capadngan
9ab1fce0e0 feature: created new secret input component 2024-04-25 04:02:34 +08:00
BlackMagiq
9992fbf3dd Merge pull request #1729 from Infisical/groups-phase-3
Groups Phase 3 (LDAP)
2024-04-24 08:34:46 -07:00
Tuan Dang
3ca596d4af Clean LDAP group search impl async/await 2024-04-24 08:15:19 -07:00
Tuan Dang
1c95b3abe7 Add license check for ldap group maps 2024-04-23 21:57:40 -07:00
Tuan Dang
1f3c72b997 Update def features 2024-04-23 21:52:46 -07:00
Tuan Dang
e55b981cea Merge remote-tracking branch 'origin' into groups-phase-3 2024-04-23 21:47:22 -07:00
Tuan Dang
49d4e67e07 Smoothen name prefill LDAP 2024-04-23 21:38:51 -07:00
Tuan Dang
a54d156bf0 Patch LDAP issue 2024-04-23 21:16:55 -07:00
Tuan Dang
f3fc898232 Add docs for LDAP groups 2024-04-23 19:37:26 -07:00
vmatsiiako
c61602370e Update kubernetes-helm.mdx 2024-04-23 19:32:26 -07:00
Daniel Hougaard
5178663797 Merge pull request #1728 from Infisical/daniel/cli-get-folders-improvement
Feat: Allow "secrets folders get" command to be used with service token & universal auth
2024-04-24 02:46:20 +02:00
Daniel Hougaard
f04f3aee25 Fix: Allow service token & UA access token to be used as authentication 2024-04-24 02:36:29 +02:00
Daniel Hougaard
e5333e2718 Fix: UA token being overwritten by service token 2024-04-24 02:07:45 +02:00
Daniel Hougaard
f27d9f8cee Update release_build_infisical_cli.yml 2024-04-24 00:21:46 +02:00
Daniel Hougaard
cbd568b714 Update release_build_infisical_cli.yml 2024-04-24 00:18:25 +02:00
Daniel Hougaard
b330c5570d Allow trigger through Github UI 2024-04-24 00:06:35 +02:00
Tuan Dang
d222bbf131 Update ldap group mapping schema, replace group input field with select 2024-04-23 15:04:02 -07:00
Tuan Dang
961c6391a8 Complete LDAP group mapping data structure + frontend/backend 2024-04-23 13:58:23 -07:00
Maidul Islam
d68d7df0f8 Merge pull request #1725 from Infisical/daniel/workflow-env-bug
Fix: Undefined CLI tests env variables
2024-04-23 16:25:43 -04:00
Daniel Hougaard
c44c7810ce Fix: CLI Tests failing when called as a dependency workflow 2024-04-23 22:24:17 +02:00
Daniel Hougaard
b7893a6a72 Update test-workflow.yml 2024-04-23 22:21:32 +02:00
Daniel Hougaard
7a3d425b0e Fix: Undefined env variables 2024-04-23 22:20:43 +02:00
Maidul Islam
bd570bd02f Merge pull request #1724 from Infisical/daniel/cli-token-bug
Fix: UA Token being overwritten by INFISICAL_TOKEN env variable
2024-04-23 16:07:40 -04:00
Daniel Hougaard
b94ffb8a82 Fix: UA Token being overwritten by INFISICAL_TOKEN env variable 2024-04-23 22:00:32 +02:00
Maidul Islam
246b8728a4 add patroni gha 2024-04-23 14:49:12 -04:00
Akhil Mohan
00415e1a87 Merge pull request #1723 from Infisical/update-folder-error-mg
Update folder not found error msg
2024-04-23 23:28:37 +05:30
Maidul Islam
ad354c106e update folder not found error message 2024-04-23 13:56:12 -04:00
Sheen Capadngan
26778d92d3 adjustment: unified logic for InfisicalSecretInput 2024-04-24 01:40:46 +08:00
Sheen Capadngan
b135ba263c adjustment: finalized InfisicalSecretInput 2024-04-23 22:49:00 +08:00
Sheen Capadngan
9b7ef55ad7 adjustment: simplified caret helper 2024-04-23 21:43:01 +08:00
Sheen Capadngan
872f8bdad8 adjustment: converted remaining salug validation to use slugify 2024-04-23 21:14:33 +08:00
Sheen Capadngan
80b0dc6895 adjustment: removed autocomplete from RotationInputForm 2024-04-23 20:55:43 +08:00
vmatsiiako
b067751027 Merge pull request #1720 from Infisical/docs/amplify-patch
docs: added -y flag in infisical cli installation in amplify doc to skip confirmation prompt
2024-04-22 22:54:38 -07:00
Akhil Mohan
f2b3b7b726 docs: added -y flag in infisical cli installation in amplify doc to skip confirmation prompt 2024-04-23 11:23:03 +05:30
Tuan Dang
2d51445dd9 Add ldapjs to get user groups upon ldap login 2024-04-22 22:02:12 -07:00
Sheen Capadngan
20898c00c6 feat: added referencing autocomplete to remaining components 2024-04-23 11:35:17 +08:00
Sheen Capadngan
2200bd646e adjustment: added isImport handling 2024-04-23 11:17:12 +08:00
Sheen Capadngan
fb69236f47 Merge remote-tracking branch 'origin/main' into feat-secret-input-autocomplete 2024-04-23 11:02:45 +08:00
Sheen Capadngan
918734b26b adjustment: used enum for reference type 2024-04-23 10:43:10 +08:00
Sheen Capadngan
729c75112b adjustment: deleted unused reference select component 2024-04-23 10:26:41 +08:00
Sheen Capadngan
738e8cfc5c adjustment: standardized slug validation 2024-04-23 10:25:36 +08:00
Maidul Islam
1ba7a31e0d Merge pull request #1719 from Infisical/daniel/migration-fix
Fix: Duplicate org membership migration
2024-04-22 19:51:21 -04:00
Daniel Hougaard
233a4f7d77 Update 20240405000045_org-memberships-unique-constraint.ts 2024-04-23 01:49:44 +02:00
Daniel Hougaard
44ff1abd74 Update 20240405000045_org-memberships-unique-constraint.ts 2024-04-23 01:49:26 +02:00
Maidul Islam
08cb105fe4 Merge pull request #1712 from akhilmhdh/feat/batch-raw-secrets-api
Batch raw secrets api
2024-04-22 18:48:14 -04:00
BlackMagiq
62aebe2fd4 Merge pull request #1705 from Infisical/groups-phase-2b
Groups Phase 2A (Pending Group Additions)
2024-04-22 14:25:36 -07:00
Tuan Dang
5c0542c5a3 Merge remote-tracking branch 'origin' into groups-phase-2b 2024-04-22 14:21:21 -07:00
Daniel Hougaard
6874bff302 Merge pull request #1717 from Infisical/daniel/fix-frontend-roles
Fix: Frontend roles bug
2024-04-22 21:21:43 +02:00
Daniel Hougaard
e1b8aa8347 Update queries.tsx 2024-04-22 21:14:44 +02:00
vmatsiiako
a041fd4762 Update cassandra.mdx 2024-04-22 12:12:29 -07:00
vmatsiiako
1534ba516a Update postgresql.mdx 2024-04-22 12:12:14 -07:00
Daniel Hougaard
f7183347dc Fix: Project roles 2024-04-22 21:08:57 +02:00
Daniel Hougaard
105b8d6493 Feat: Helper function to check for project roles 2024-04-22 21:08:33 +02:00
Tuan Dang
b9d35058bf Merge remote-tracking branch 'origin' into groups-phase-2b 2024-04-22 12:08:10 -07:00
Daniel Hougaard
22a3c46902 Fix: Upgrade project permission bug 2024-04-22 21:08:09 +02:00
Daniel Hougaard
be8232dc93 Feat: Include role on project permission response 2024-04-22 21:07:56 +02:00
Tuan Dang
8c566a5ff7 Fix wording for group project addition/removal 2024-04-22 12:01:29 -07:00
Tuan Dang
0a124093d6 Patch adding groups to project for invited users, add transactions for adding/removing groups to/from projects 2024-04-22 11:59:17 -07:00
Maidul Islam
088cb72621 Merge pull request #1703 from Infisical/daniel/cli-integration-tests
Feat: CLI Integration Tests (Phase I)
2024-04-22 14:38:22 -04:00
Maidul Islam
de21b44486 small nits 2024-04-22 14:36:33 -04:00
Sheen Capadngan
6daeed68a0 adjustment: converted callback functions to use arrow notation 2024-04-23 02:28:42 +08:00
Sheen Capadngan
31a499c9cd adjustment: added clearTimeout 2024-04-23 02:10:15 +08:00
Daniel Hougaard
519d6f98a2 Chore: Use standard lib 2024-04-22 19:50:24 +02:00
Daniel Hougaard
973ed37018 Update export.go 2024-04-22 19:50:15 +02:00
Tuan Dang
c72280e9ab Merge remote-tracking branch 'origin' into groups-phase-2b 2024-04-22 10:37:53 -07:00
Tuan Dang
032c5b5620 Convert pending group addition table into isPending field 2024-04-22 10:37:24 -07:00
Akhil Mohan
aa5cd0fd0f feat(server): switched from workspace id to project slug 2024-04-22 21:19:06 +05:30
Sheen Capadngan
358ca3decd adjustment: reverted changes made to SecretInput 2024-04-22 23:26:44 +08:00
Sheen Capadngan
0899fdb7d5 adjustment: migrated to InfisicalSecretInput component 2024-04-22 22:22:48 +08:00
Daniel Hougaard
e008fb26a2 Cleanup 2024-04-22 16:02:16 +02:00
Daniel Hougaard
34543ef127 Fix: Removed old code 2024-04-22 15:59:54 +02:00
Daniel Hougaard
83107f56bb Fix: Removed old test code 2024-04-22 15:59:16 +02:00
Daniel Hougaard
35071af478 Fix: Run cmd tests 2024-04-22 15:27:42 +02:00
Daniel Hougaard
eb5f71cb05 Chore: Disable build as the tests handle this automatically 2024-04-22 15:27:35 +02:00
Daniel Hougaard
9cf1dd38a6 Fix: Run CMD snapshot fix 2024-04-22 15:27:22 +02:00
Daniel Hougaard
144a563609 Fix: Fixed snapshots order 2024-04-22 15:21:19 +02:00
Daniel Hougaard
ca0062f049 Update run-cli-tests.yml 2024-04-22 15:18:32 +02:00
Daniel Hougaard
2ed9aa888e Fix: Secrets order 2024-04-22 15:18:30 +02:00
Daniel Hougaard
8c7d329f8f Fix: Snapshot output order 2024-04-22 15:18:23 +02:00
Daniel Hougaard
a0aa06e2f5 Fix: Refactor teests to use cupaloy 2024-04-22 15:12:21 +02:00
Daniel Hougaard
1dd0167ac8 Feat: CLI Integration Tests 2024-04-22 15:12:18 +02:00
Daniel Hougaard
55aea364da Fix: Refactor teests to use cupaloy 2024-04-22 15:12:09 +02:00
Daniel Hougaard
afee47ab45 Delete root_test.go 2024-04-22 15:12:02 +02:00
Daniel Hougaard
9387d9aaac Rename 2024-04-22 15:11:58 +02:00
Daniel Hougaard
2b215a510c Fix: Integrated UA login test 2024-04-22 15:11:39 +02:00
Daniel Hougaard
89ff6a6c93 Update .gitignore 2024-04-22 15:11:25 +02:00
Daniel Hougaard
3bcf406688 Fix: Refactor 2024-04-22 15:11:20 +02:00
Daniel Hougaard
580b86cde8 Fix: Refactor teests to use cupaloy 2024-04-22 15:11:10 +02:00
Daniel Hougaard
7a20251261 Fix: Returning keys in a reproducible manner 2024-04-22 15:10:55 +02:00
Daniel Hougaard
ae63898d5e Install cupaloy 2024-04-22 15:02:51 +02:00
Daniel Hougaard
d4d3c2b10f Update .gitignore 2024-04-22 15:02:44 +02:00
Daniel Hougaard
0e3cc4fdeb Correct snapshots 2024-04-22 15:02:40 +02:00
Akhil Mohan
a339c473d5 docs: updated api doc with bulk raw secret ops 2024-04-19 20:54:41 +05:30
Akhil Mohan
718cabe49b feat(server): added batch raw bulk secret ops api 2024-04-19 20:53:54 +05:30
Tuan Dang
1dd451f221 Update groups count fn, type check 2024-04-18 14:54:02 -07:00
Daniel Hougaard
4050e56e60 Feat: CLI Integration tests 2024-04-18 23:29:11 +02:00
Tuan Dang
e1407cc093 Add comments for group-fns 2024-04-18 14:14:08 -07:00
Tuan Dang
1b38d969df Merge remote-tracking branch 'origin' into groups-phase-2b 2024-04-18 13:52:54 -07:00
Tuan Dang
6e3d5a8c7c Remove print statements, cleanup 2024-04-18 13:51:47 -07:00
Tuan Dang
fa7587900e Finish preliminary capability for adding incomplete users to groups 2024-04-18 10:57:25 -07:00
Daniel Hougaard
e453ddf937 Update secrets.go 2024-04-18 18:04:29 +02:00
Daniel Hougaard
3f68807179 Update run-cli-tests.yml 2024-04-18 17:07:37 +02:00
Daniel Hougaard
ba42aca069 Workflow 2024-04-18 15:13:58 +02:00
Daniel Hougaard
22c589e2cf Update tests.go 2024-04-18 15:01:31 +02:00
Daniel Hougaard
943945f6d7 Feat: Make run testable 2024-04-18 15:01:28 +02:00
Daniel Hougaard
b598dd3d47 Feat: Cli integration tests -- exports 2024-04-18 14:59:41 +02:00
Daniel Hougaard
ad6d18a905 Feat: Cli integration tests -- run cmd 2024-04-18 14:59:26 +02:00
Daniel Hougaard
46a91515b1 Fix: Use login UA token 2024-04-18 14:59:21 +02:00
Daniel Hougaard
b79ce8a880 Feat: Cli integration tests -- login 2024-04-18 14:59:13 +02:00
Daniel Hougaard
d31d98b5e0 Feat: CLI Integration tests 2024-04-18 14:58:59 +02:00
Daniel Hougaard
cb6cbafcae Fix: JSON error check 2024-04-17 19:33:51 +02:00
Daniel Hougaard
bcb3eaab74 Feat: Integration tests 2024-04-17 19:33:51 +02:00
Daniel Hougaard
12d5fb1043 Fix: Add support for imported secrets with raw fetching 2024-04-17 19:33:51 +02:00
Daniel Hougaard
8bf09789d6 Feat: Integration tests 2024-04-17 19:33:51 +02:00
Daniel Hougaard
7ab8db0471 Feat: Integration tests 2024-04-17 19:33:51 +02:00
Daniel Hougaard
6b473d2b36 Feat: Integration tests 2024-04-17 19:33:51 +02:00
Daniel Hougaard
7581b33b3b Fix: Add import support for raw fetching 2024-04-17 19:33:51 +02:00
Daniel Hougaard
be74f4d34c Fix: Add import & recursive support to raw fetching 2024-04-17 19:33:51 +02:00
Salman
f9957e111c feat: move to radix select component 2024-03-16 06:40:15 +05:30
Salman
1193e33890 feat: improve validation secret reference 2024-03-16 00:12:50 +05:30
Salman
ec64753795 fix: refactor and onyl valid env 2024-03-15 22:49:40 +05:30
Salman
c908310f6e fix: add lint line for reference 2024-03-15 21:07:13 +05:30
Salman
ee2b8a594a fix: handle skip and slash in environment slug 2024-03-15 20:21:27 +05:30
Salman
3ae27e088f feat: move to react query 2024-03-15 17:17:39 +05:30
Salman
393c0c9e90 fix: add todo 2024-03-15 03:30:11 +05:30
Salman
5e453ab8a6 fix: try dynamic height based on text area height 2024-03-15 03:26:49 +05:30
Salman
273c78c0a5 fix: hot fix for onChange 2024-03-15 03:04:36 +05:30
Salman
1bcc742466 feat: improve reference match, auto closing tag and reference select 2024-03-15 02:22:09 +05:30
Salman
1fc9e60254 feat: fetch folder and secrets 2024-03-14 09:39:57 +05:30
Salman
126e385046 fix: layout gap increment on new lines 2024-03-14 06:01:02 +05:30
Salman
2f932ad103 feat: add basic ui and icons 2024-03-14 05:55:23 +05:30
147 changed files with 5194 additions and 906 deletions

View File

@@ -0,0 +1,26 @@
import os
from datetime import datetime, timedelta


def rename_migrations():
    migration_folder = "./backend/src/db/migrations"
    with open("added_files.txt", "r") as file:
        changed_files = file.readlines()

    # Find the latest file among the changed files
    latest_timestamp = datetime.now()  # utc time

    for file_path in changed_files:
        file_path = file_path.strip()
        # each new file bump by 1s
        latest_timestamp = latest_timestamp + timedelta(seconds=1)

        new_filename = os.path.join(migration_folder, latest_timestamp.strftime("%Y%m%d%H%M%S") + f"_{file_path.split('_')[1]}")
        old_filename = os.path.join(migration_folder, file_path)
        os.rename(old_filename, new_filename)
        print(f"Renamed {old_filename} to {new_filename}")

    if len(changed_files) == 0:
        print("No new files added to migration folder")


if __name__ == "__main__":
    rename_migrations()

View File

@@ -0,0 +1,38 @@
name: Build patroni
on: [workflow_dispatch]

jobs:
  patroni-image:
    name: Build patroni
    runs-on: ubuntu-latest
    steps:
      - name: ☁️ Checkout source
        uses: actions/checkout@v3
        with:
          repository: 'zalando/patroni'
      - name: Save commit hashes for tag
        id: commit
        uses: pr-mpt/actions-commit-hash@v2
      - name: 🔧 Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: 🐋 Login to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Set up Depot CLI
        uses: depot/setup-action@v1
      - name: 🏗️ Build backend and push to docker hub
        uses: depot/build-push-action@v1
        with:
          project: 64mmf0n610
          token: ${{ secrets.DEPOT_PROJECT_TOKEN }}
          push: true
          context: .
          file: Dockerfile
          tags: |
            infisical/patroni:${{ steps.commit.outputs.short }}
            infisical/patroni:latest
          platforms: linux/amd64,linux/arm64

View File

@@ -1,60 +1,72 @@
name: Build and release CLI
on:
push:
# run only against tags
tags:
- "infisical-cli/v*.*.*"
workflow_dispatch:
push:
# run only against tags
tags:
- "infisical-cli/v*.*.*"
permissions:
contents: write
# packages: write
# issues: write
contents: write
# packages: write
# issues: write
jobs:
goreleaser:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: 🐋 Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: 🔧 Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- run: git fetch --force --tags
- run: echo "Ref name ${{github.ref_name}}"
- uses: actions/setup-go@v3
with:
go-version: ">=1.19.3"
cache: true
cache-dependency-path: cli/go.sum
- name: libssl1.1 => libssl1.0-dev for OSXCross
run: |
echo 'deb http://security.ubuntu.com/ubuntu bionic-security main' | sudo tee -a /etc/apt/sources.list
sudo apt update && apt-cache policy libssl1.0-dev
sudo apt-get install libssl1.0-dev
- name: OSXCross for CGO Support
run: |
mkdir ../../osxcross
git clone https://github.com/plentico/osxcross-target.git ../../osxcross/target
- uses: goreleaser/goreleaser-action@v4
with:
distribution: goreleaser-pro
version: latest
args: release --clean
env:
GITHUB_TOKEN: ${{ secrets.GO_RELEASER_GITHUB_TOKEN }}
POSTHOG_API_KEY_FOR_CLI: ${{ secrets.POSTHOG_API_KEY_FOR_CLI }}
FURY_TOKEN: ${{ secrets.FURYPUSHTOKEN }}
AUR_KEY: ${{ secrets.AUR_KEY }}
GORELEASER_KEY: ${{ secrets.GORELEASER_KEY }}
- uses: actions/setup-python@v4
- run: pip install --upgrade cloudsmith-cli
- name: Publish to CloudSmith
run: sh cli/upload_to_cloudsmith.sh
env:
CLOUDSMITH_API_KEY: ${{ secrets.CLOUDSMITH_API_KEY }}
cli-integration-tests:
name: Run tests before deployment
uses: ./.github/workflows/run-cli-tests.yml
secrets:
CLI_TESTS_UA_CLIENT_ID: ${{ secrets.CLI_TESTS_UA_CLIENT_ID }}
CLI_TESTS_UA_CLIENT_SECRET: ${{ secrets.CLI_TESTS_UA_CLIENT_SECRET }}
CLI_TESTS_SERVICE_TOKEN: ${{ secrets.CLI_TESTS_SERVICE_TOKEN }}
CLI_TESTS_PROJECT_ID: ${{ secrets.CLI_TESTS_PROJECT_ID }}
CLI_TESTS_ENV_SLUG: ${{ secrets.CLI_TESTS_ENV_SLUG }}
goreleaser:
runs-on: ubuntu-20.04
needs: [cli-integration-tests]
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: 🐋 Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: 🔧 Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- run: git fetch --force --tags
- run: echo "Ref name ${{github.ref_name}}"
- uses: actions/setup-go@v3
with:
go-version: ">=1.19.3"
cache: true
cache-dependency-path: cli/go.sum
- name: libssl1.1 => libssl1.0-dev for OSXCross
run: |
echo 'deb http://security.ubuntu.com/ubuntu bionic-security main' | sudo tee -a /etc/apt/sources.list
sudo apt update && apt-cache policy libssl1.0-dev
sudo apt-get install libssl1.0-dev
- name: OSXCross for CGO Support
run: |
mkdir ../../osxcross
git clone https://github.com/plentico/osxcross-target.git ../../osxcross/target
- uses: goreleaser/goreleaser-action@v4
with:
distribution: goreleaser-pro
version: latest
args: release --clean
env:
GITHUB_TOKEN: ${{ secrets.GO_RELEASER_GITHUB_TOKEN }}
POSTHOG_API_KEY_FOR_CLI: ${{ secrets.POSTHOG_API_KEY_FOR_CLI }}
FURY_TOKEN: ${{ secrets.FURYPUSHTOKEN }}
AUR_KEY: ${{ secrets.AUR_KEY }}
GORELEASER_KEY: ${{ secrets.GORELEASER_KEY }}
- uses: actions/setup-python@v4
- run: pip install --upgrade cloudsmith-cli
- name: Publish to CloudSmith
run: sh cli/upload_to_cloudsmith.sh
env:
CLOUDSMITH_API_KEY: ${{ secrets.CLOUDSMITH_API_KEY }}

47
.github/workflows/run-cli-tests.yml vendored Normal file
View File

@@ -0,0 +1,47 @@
name: Go CLI Tests
on:
  pull_request:
    types: [opened, synchronize]
    paths:
      - "cli/**"
  workflow_dispatch:
  workflow_call:
    secrets:
      CLI_TESTS_UA_CLIENT_ID:
        required: true
      CLI_TESTS_UA_CLIENT_SECRET:
        required: true
      CLI_TESTS_SERVICE_TOKEN:
        required: true
      CLI_TESTS_PROJECT_ID:
        required: true
      CLI_TESTS_ENV_SLUG:
        required: true

jobs:
  test:
    defaults:
      run:
        working-directory: ./cli
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup Go
        uses: actions/setup-go@v4
        with:
          go-version: "1.21.x"
      - name: Install dependencies
        run: go get .
      - name: Test with the Go CLI
        env:
          CLI_TESTS_UA_CLIENT_ID: ${{ secrets.CLI_TESTS_UA_CLIENT_ID }}
          CLI_TESTS_UA_CLIENT_SECRET: ${{ secrets.CLI_TESTS_UA_CLIENT_SECRET }}
          CLI_TESTS_SERVICE_TOKEN: ${{ secrets.CLI_TESTS_SERVICE_TOKEN }}
          CLI_TESTS_PROJECT_ID: ${{ secrets.CLI_TESTS_PROJECT_ID }}
          CLI_TESTS_ENV_SLUG: ${{ secrets.CLI_TESTS_ENV_SLUG }}
        run: go test -v -count=1 ./test

View File

@@ -0,0 +1,36 @@
name: Rename Migrations
on:
  pull_request:
    types:
      - closed
    paths:
      - 'backend/src/db/migrations/**'

jobs:
  rename:
    runs-on: ubuntu-latest
    if: github.event.pull_request.merged == true
    steps:
      - name: Check out repository
        uses: actions/checkout@v2
      - name: Get list of newly added files in migration folder
        run: git diff --name-status HEAD^ HEAD backend/src/db/migrations | grep '^A' | cut -f2 | xargs -n1 basename > added_files.txt
      - name: Script to rename migrations
        run: python .github/resources/rename_migration_files.py
      - name: Commit and push changes
        run: |
          git config user.name github-actions
          git config user.email github-actions@github.com
          git add ./backend/src/db/migrations
          git commit -m "chore: renamed new migration files to latest timestamp (gh-action)"
      - name: Push changes
        env:
          TOKEN: ${{ secrets.GH_PERSONAL_TOKEN }}
        run: |
          git push https://$GITHUB_ACTOR:$TOKEN@github.com/${{ github.repository }}.git HEAD:main

2
.gitignore vendored
View File

@@ -67,3 +67,5 @@ yarn-error.log*
frontend-build
*.tgz
cli/infisical-merge
cli/test/infisical-merge

View File

@@ -76,7 +76,7 @@ Check out the [Quickstart Guides](https://infisical.com/docs/getting-started/int
| Use Infisical Cloud | Deploy Infisical on premise |
| ------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| The fastest and most reliable way to <br> get started with Infisical is signing up <br> for free to [Infisical Cloud](https://app.infisical.com/login). | <a href="https://infisical.com/docs/self-hosting/deployment-options/aws-ec2"><img src=".github/images/deploy-to-aws.png" width="150" width="300" /></a> <a href="https://infisical.com/docs/self-hosting/deployment-options/digital-ocean-marketplace" alt="Deploy to DigitalOcean"> <img width="217" alt="Deploy to DO" src="https://www.deploytodo.com/do-btn-blue.svg"/> </a> <br> View all [deployment options](https://infisical.com/docs/self-hosting/overview) |
| The fastest and most reliable way to <br> get started with Infisical is signing up <br> for free to [Infisical Cloud](https://app.infisical.com/login). | <br> View all [deployment options](https://infisical.com/docs/self-hosting/overview) |
### Run Infisical locally

View File

@@ -942,6 +942,113 @@ describe.each([{ auth: AuthMode.JWT }, { auth: AuthMode.IDENTITY_ACCESS_TOKEN }]
const secrets = await getSecrets(seedData1.environment.slug, path);
expect(secrets).toEqual([]);
});
test.each(testRawSecrets)("Bulk create secret raw in path $path", async ({ path, secret }) => {
const createSecretReqBody = {
projectSlug: seedData1.project.slug,
environment: seedData1.environment.slug,
secretPath: path,
secrets: [
{
secretKey: secret.key,
secretValue: secret.value,
secretComment: secret.comment
}
]
};
const createSecRes = await testServer.inject({
method: "POST",
url: `/api/v3/secrets/batch/raw`,
headers: {
authorization: `Bearer ${authToken}`
},
body: createSecretReqBody
});
expect(createSecRes.statusCode).toBe(200);
const createdSecretPayload = JSON.parse(createSecRes.payload);
expect(createdSecretPayload).toHaveProperty("secrets");
// fetch secrets
const secrets = await getSecrets(seedData1.environment.slug, path);
expect(secrets).toEqual(
expect.arrayContaining([
expect.objectContaining({
key: secret.key,
value: secret.value,
type: SecretType.Shared
})
])
);
await deleteRawSecret({ path, key: secret.key });
});
test.each(testRawSecrets)("Bulk update secret raw in path $path", async ({ secret, path }) => {
await createRawSecret({ path, ...secret });
const updateSecretReqBody = {
projectSlug: seedData1.project.slug,
environment: seedData1.environment.slug,
secretPath: path,
secrets: [
{
secretValue: "new-value",
secretKey: secret.key
}
]
};
const updateSecRes = await testServer.inject({
method: "PATCH",
url: `/api/v3/secrets/batch/raw`,
headers: {
authorization: `Bearer ${authToken}`
},
body: updateSecretReqBody
});
expect(updateSecRes.statusCode).toBe(200);
const updatedSecretPayload = JSON.parse(updateSecRes.payload);
expect(updatedSecretPayload).toHaveProperty("secrets");
// fetch secrets
const secrets = await getSecrets(seedData1.environment.slug, path);
expect(secrets).toEqual(
expect.arrayContaining([
expect.objectContaining({
key: secret.key,
value: "new-value",
version: 2,
type: SecretType.Shared
})
])
);
await deleteRawSecret({ path, key: secret.key });
});
test.each(testRawSecrets)("Bulk delete secret raw in path $path", async ({ path, secret }) => {
await createRawSecret({ path, ...secret });
const deletedSecretReqBody = {
projectSlug: seedData1.project.slug,
environment: seedData1.environment.slug,
secretPath: path,
secrets: [{ secretKey: secret.key }]
};
const deletedSecRes = await testServer.inject({
method: "DELETE",
url: `/api/v3/secrets/batch/raw`,
headers: {
authorization: `Bearer ${authToken}`
},
body: deletedSecretReqBody
});
expect(deletedSecRes.statusCode).toBe(200);
const deletedSecretPayload = JSON.parse(deletedSecRes.payload);
expect(deletedSecretPayload).toHaveProperty("secrets");
// fetch secrets
const secrets = await getSecrets(seedData1.environment.slug, path);
expect(secrets).toEqual([]);
});
}
);
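
The three tests above exercise the new bulk raw-secret endpoints added by the batch raw secrets API work (POST, PATCH, and DELETE against /api/v3/secrets/batch/raw). As a hedged sketch of how a client might call the create endpoint: only the path and body shape come from the tests; the base URL, token, project slug, and environment below are placeholder assumptions, not values from the diff.

const INFISICAL_URL = "https://app.infisical.com"; // assumption: your instance URL
const TOKEN = process.env.INFISICAL_TOKEN ?? "";   // assumption: a valid auth token

async function bulkCreateRawSecrets() {
  // Mirrors createSecretReqBody from the test above
  const res = await fetch(`${INFISICAL_URL}/api/v3/secrets/batch/raw`, {
    method: "POST",
    headers: {
      authorization: `Bearer ${TOKEN}`,
      "content-type": "application/json"
    },
    body: JSON.stringify({
      projectSlug: "my-project", // placeholder project slug
      environment: "dev",        // placeholder environment slug
      secretPath: "/",
      secrets: [
        { secretKey: "DB_PASSWORD", secretValue: "example", secretComment: "created in bulk" }
      ]
    })
  });
  if (!res.ok) throw new Error(`Bulk create failed: ${res.status}`);
  return (await res.json()) as { secrets: unknown[] };
}

Bulk update and delete follow the same request shape in the tests: PATCH sends secrets entries with secretKey and secretValue, DELETE sends entries with only secretKey, and all three responses contain a secrets array.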

View File

@@ -45,6 +45,7 @@
"jsonwebtoken": "^9.0.2",
"jsrp": "^0.2.4",
"knex": "^3.0.1",
"ldapjs": "^3.0.7",
"libsodium-wrappers": "^0.7.13",
"lodash.isequal": "^4.5.0",
"ms": "^2.1.3",
@@ -2510,6 +2511,83 @@
"@jridgewell/sourcemap-codec": "^1.4.10"
}
},
"node_modules/@ldapjs/asn1": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/@ldapjs/asn1/-/asn1-2.0.0.tgz",
"integrity": "sha512-G9+DkEOirNgdPmD0I8nu57ygQJKOOgFEMKknEuQvIHbGLwP3ny1mY+OTUYLCbCaGJP4sox5eYgBJRuSUpnAddA=="
},
"node_modules/@ldapjs/attribute": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/@ldapjs/attribute/-/attribute-1.0.0.tgz",
"integrity": "sha512-ptMl2d/5xJ0q+RgmnqOi3Zgwk/TMJYG7dYMC0Keko+yZU6n+oFM59MjQOUht5pxJeS4FWrImhu/LebX24vJNRQ==",
"dependencies": {
"@ldapjs/asn1": "2.0.0",
"@ldapjs/protocol": "^1.2.1",
"process-warning": "^2.1.0"
}
},
"node_modules/@ldapjs/change": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/@ldapjs/change/-/change-1.0.0.tgz",
"integrity": "sha512-EOQNFH1RIku3M1s0OAJOzGfAohuFYXFY4s73wOhRm4KFGhmQQ7MChOh2YtYu9Kwgvuq1B0xKciXVzHCGkB5V+Q==",
"dependencies": {
"@ldapjs/asn1": "2.0.0",
"@ldapjs/attribute": "1.0.0"
}
},
"node_modules/@ldapjs/controls": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/@ldapjs/controls/-/controls-2.1.0.tgz",
"integrity": "sha512-2pFdD1yRC9V9hXfAWvCCO2RRWK9OdIEcJIos/9cCVP9O4k72BY1bLDQQ4KpUoJnl4y/JoD4iFgM+YWT3IfITWw==",
"dependencies": {
"@ldapjs/asn1": "^1.2.0",
"@ldapjs/protocol": "^1.2.1"
}
},
"node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@ldapjs/asn1/-/asn1-1.2.0.tgz",
"integrity": "sha512-KX/qQJ2xxzvO2/WOvr1UdQ+8P5dVvuOLk/C9b1bIkXxZss8BaR28njXdPgFCpj5aHaf1t8PmuVnea+N9YG9YMw=="
},
"node_modules/@ldapjs/dn": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@ldapjs/dn/-/dn-1.1.0.tgz",
"integrity": "sha512-R72zH5ZeBj/Fujf/yBu78YzpJjJXG46YHFo5E4W1EqfNpo1UsVPqdLrRMXeKIsJT3x9dJVIfR6OpzgINlKpi0A==",
"dependencies": {
"@ldapjs/asn1": "2.0.0",
"process-warning": "^2.1.0"
}
},
"node_modules/@ldapjs/filter": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/@ldapjs/filter/-/filter-2.1.1.tgz",
"integrity": "sha512-TwPK5eEgNdUO1ABPBUQabcZ+h9heDORE4V9WNZqCtYLKc06+6+UAJ3IAbr0L0bYTnkkWC/JEQD2F+zAFsuikNw==",
"dependencies": {
"@ldapjs/asn1": "2.0.0",
"@ldapjs/protocol": "^1.2.1",
"process-warning": "^2.1.0"
}
},
"node_modules/@ldapjs/messages": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/@ldapjs/messages/-/messages-1.3.0.tgz",
"integrity": "sha512-K7xZpXJ21bj92jS35wtRbdcNrwmxAtPwy4myeh9duy/eR3xQKvikVycbdWVzkYEAVE5Ce520VXNOwCHjomjCZw==",
"dependencies": {
"@ldapjs/asn1": "^2.0.0",
"@ldapjs/attribute": "^1.0.0",
"@ldapjs/change": "^1.0.0",
"@ldapjs/controls": "^2.1.0",
"@ldapjs/dn": "^1.1.0",
"@ldapjs/filter": "^2.1.1",
"@ldapjs/protocol": "^1.2.1",
"process-warning": "^2.2.0"
}
},
"node_modules/@ldapjs/protocol": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/@ldapjs/protocol/-/protocol-1.2.1.tgz",
"integrity": "sha512-O89xFDLW2gBoZWNXuXpBSM32/KealKCTb3JGtJdtUQc7RjAk8XzrRgyz02cPAwGKwKPxy0ivuC7UP9bmN87egQ=="
},
"node_modules/@lukeed/ms": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/@lukeed/ms/-/ms-2.0.1.tgz",
@@ -9304,15 +9382,7 @@
"node": ">=0.8.0"
}
},
"node_modules/ldapauth-fork/node_modules/lru-cache": {
"version": "7.18.3",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-7.18.3.tgz",
"integrity": "sha512-jumlc0BIUrS3qJGgIkWZsyfAM7NCWiBcCDhnd+3NNM5KbBmLTgHVfWBcg6W+rLUsIpzpERPsvwUP7CckAQSOoA==",
"engines": {
"node": ">=12"
}
},
"node_modules/ldapjs": {
"node_modules/ldapauth-fork/node_modules/ldapjs": {
"version": "2.3.3",
"resolved": "https://registry.npmjs.org/ldapjs/-/ldapjs-2.3.3.tgz",
"integrity": "sha512-75QiiLJV/PQqtpH+HGls44dXweviFwQ6SiIK27EqzKQ5jU/7UFrl2E5nLdQ3IYRBzJ/AVFJI66u0MZ0uofKYwg==",
@@ -9330,6 +9400,35 @@
"node": ">=10.13.0"
}
},
"node_modules/ldapauth-fork/node_modules/lru-cache": {
"version": "7.18.3",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-7.18.3.tgz",
"integrity": "sha512-jumlc0BIUrS3qJGgIkWZsyfAM7NCWiBcCDhnd+3NNM5KbBmLTgHVfWBcg6W+rLUsIpzpERPsvwUP7CckAQSOoA==",
"engines": {
"node": ">=12"
}
},
"node_modules/ldapjs": {
"version": "3.0.7",
"resolved": "https://registry.npmjs.org/ldapjs/-/ldapjs-3.0.7.tgz",
"integrity": "sha512-1ky+WrN+4CFMuoekUOv7Y1037XWdjKpu0xAPwSP+9KdvmV9PG+qOKlssDV6a+U32apwxdD3is/BZcWOYzN30cg==",
"dependencies": {
"@ldapjs/asn1": "^2.0.0",
"@ldapjs/attribute": "^1.0.0",
"@ldapjs/change": "^1.0.0",
"@ldapjs/controls": "^2.1.0",
"@ldapjs/dn": "^1.1.0",
"@ldapjs/filter": "^2.1.1",
"@ldapjs/messages": "^1.3.0",
"@ldapjs/protocol": "^1.2.1",
"abstract-logging": "^2.0.1",
"assert-plus": "^1.0.0",
"backoff": "^2.5.0",
"once": "^1.4.0",
"vasync": "^2.2.1",
"verror": "^1.10.1"
}
},
"node_modules/leven": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/leven/-/leven-2.1.0.tgz",

View File

@@ -106,6 +106,7 @@
"jsonwebtoken": "^9.0.2",
"jsrp": "^0.2.4",
"knex": "^3.0.1",
"ldapjs": "^3.0.7",
"libsodium-wrappers": "^0.7.13",
"lodash.isequal": "^4.5.0",
"ms": "^2.1.3",

View File

@@ -74,6 +74,9 @@ import {
TLdapConfigs,
TLdapConfigsInsert,
TLdapConfigsUpdate,
TLdapGroupMaps,
TLdapGroupMapsInsert,
TLdapGroupMapsUpdate,
TOrganizations,
TOrganizationsInsert,
TOrganizationsUpdate,
@@ -398,6 +401,7 @@ declare module "knex/types/tables" {
>;
[TableName.SamlConfig]: Knex.CompositeTableType<TSamlConfigs, TSamlConfigsInsert, TSamlConfigsUpdate>;
[TableName.LdapConfig]: Knex.CompositeTableType<TLdapConfigs, TLdapConfigsInsert, TLdapConfigsUpdate>;
[TableName.LdapGroupMap]: Knex.CompositeTableType<TLdapGroupMaps, TLdapGroupMapsInsert, TLdapGroupMapsUpdate>;
[TableName.OrgBot]: Knex.CompositeTableType<TOrgBots, TOrgBotsInsert, TOrgBotsUpdate>;
[TableName.AuditLog]: Knex.CompositeTableType<TAuditLogs, TAuditLogsInsert, TAuditLogsUpdate>;
[TableName.GitAppInstallSession]: Knex.CompositeTableType<

View File

@@ -42,6 +42,7 @@ export async function up(knex: Knex): Promise<void> {
await knex.transaction(async (tx) => {
const duplicateRows = await tx(TableName.OrgMembership)
.select("userId", "orgId") // Select the userId and orgId so we can group by them
.whereNotNull("userId") // Ensure that the userId is not null
.count("* as cnt") // Count the number of rows for each userId and orgId, so we can make sure there are more than 1 row (a duplicate)
.groupBy("userId", "orgId")
.havingRaw("count(*) > ?", [1]); // Using havingRaw for direct SQL expressions

View File

@@ -0,0 +1,15 @@
import { Knex } from "knex";

import { TableName } from "../schemas";

export async function up(knex: Knex): Promise<void> {
  await knex.schema.alterTable(TableName.UserGroupMembership, (t) => {
    t.boolean("isPending").notNullable().defaultTo(false);
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.alterTable(TableName.UserGroupMembership, (t) => {
    t.dropColumn("isPending");
  });
}

View File

@@ -0,0 +1,34 @@
import { Knex } from "knex";

import { TableName } from "../schemas";
import { createOnUpdateTrigger, dropOnUpdateTrigger } from "../utils";

export async function up(knex: Knex): Promise<void> {
  if (!(await knex.schema.hasTable(TableName.LdapGroupMap))) {
    await knex.schema.createTable(TableName.LdapGroupMap, (t) => {
      t.uuid("id", { primaryKey: true }).defaultTo(knex.fn.uuid());
      t.uuid("ldapConfigId").notNullable();
      t.foreign("ldapConfigId").references("id").inTable(TableName.LdapConfig).onDelete("CASCADE");
      t.string("ldapGroupCN").notNullable();
      t.uuid("groupId").notNullable();
      t.foreign("groupId").references("id").inTable(TableName.Groups).onDelete("CASCADE");
      t.unique(["ldapGroupCN", "groupId", "ldapConfigId"]);
    });
  }

  await createOnUpdateTrigger(knex, TableName.LdapGroupMap);

  await knex.schema.alterTable(TableName.LdapConfig, (t) => {
    t.string("groupSearchBase").notNullable().defaultTo("");
    t.string("groupSearchFilter").notNullable().defaultTo("");
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.dropTableIfExists(TableName.LdapGroupMap);
  await dropOnUpdateTrigger(knex, TableName.LdapGroupMap);

  await knex.schema.alterTable(TableName.LdapConfig, (t) => {
    t.dropColumn("groupSearchBase");
    t.dropColumn("groupSearchFilter");
  });
}

View File

@@ -0,0 +1,15 @@
import { Knex } from "knex";

import { TableName } from "../schemas";

export async function up(knex: Knex): Promise<void> {
  await knex.schema.alterTable(TableName.LdapConfig, (t) => {
    t.string("searchFilter").notNullable().defaultTo("");
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.alterTable(TableName.LdapConfig, (t) => {
    t.dropColumn("searchFilter");
  });
}

View File

@@ -0,0 +1,10 @@
import { Knex } from "knex";
export async function up(knex: Knex): Promise<void> {
}
export async function down(knex: Knex): Promise<void> {
}

View File

@@ -22,6 +22,7 @@ export * from "./incident-contacts";
export * from "./integration-auths";
export * from "./integrations";
export * from "./ldap-configs";
export * from "./ldap-group-maps";
export * from "./models";
export * from "./org-bots";
export * from "./org-memberships";

View File

@@ -23,7 +23,10 @@ export const LdapConfigsSchema = z.object({
caCertIV: z.string(),
caCertTag: z.string(),
createdAt: z.date(),
updatedAt: z.date()
updatedAt: z.date(),
groupSearchBase: z.string().default(""),
groupSearchFilter: z.string().default(""),
searchFilter: z.string().default("")
});
export type TLdapConfigs = z.infer<typeof LdapConfigsSchema>;

View File

@@ -0,0 +1,19 @@
// Code generated by automation script, DO NOT EDIT.
// Automated by pulling database and generating zod schema
// To update. Just run npm run generate:schema
// Written by akhilmhdh.

import { z } from "zod";

import { TImmutableDBKeys } from "./models";

export const LdapGroupMapsSchema = z.object({
  id: z.string().uuid(),
  ldapConfigId: z.string().uuid(),
  ldapGroupCN: z.string(),
  groupId: z.string().uuid()
});

export type TLdapGroupMaps = z.infer<typeof LdapGroupMapsSchema>;
export type TLdapGroupMapsInsert = Omit<z.input<typeof LdapGroupMapsSchema>, TImmutableDBKeys>;
export type TLdapGroupMapsUpdate = Partial<Omit<z.input<typeof LdapGroupMapsSchema>, TImmutableDBKeys>>;

View File

@@ -60,6 +60,7 @@ export enum TableName {
SecretRotationOutput = "secret_rotation_outputs",
SamlConfig = "saml_configs",
LdapConfig = "ldap_configs",
LdapGroupMap = "ldap_group_maps",
AuditLog = "audit_logs",
GitAppInstallSession = "git_app_install_sessions",
GitAppOrg = "git_app_org",

View File

@@ -12,7 +12,8 @@ export const UserGroupMembershipSchema = z.object({
userId: z.string().uuid(),
groupId: z.string().uuid(),
createdAt: z.date(),
updatedAt: z.date()
updatedAt: z.date(),
isPending: z.boolean().default(false)
});
export type TUserGroupMembership = z.infer<typeof UserGroupMembershipSchema>;

View File

@@ -14,7 +14,9 @@ import { FastifyRequest } from "fastify";
import LdapStrategy from "passport-ldapauth";
import { z } from "zod";
import { LdapConfigsSchema } from "@app/db/schemas";
import { LdapConfigsSchema, LdapGroupMapsSchema } from "@app/db/schemas";
import { TLDAPConfig } from "@app/ee/services/ldap-config/ldap-config-types";
import { isValidLdapFilter, searchGroups } from "@app/ee/services/ldap-config/ldap-fns";
import { getConfig } from "@app/lib/config/env";
import { logger } from "@app/lib/logger";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
@@ -50,20 +52,38 @@ export const registerLdapRouter = async (server: FastifyZodProvider) => {
// eslint-disable-next-line
async (req: IncomingMessage, user, cb) => {
try {
const ldapConfig = (req as unknown as FastifyRequest).ldapConfig as TLDAPConfig;
let groups: { dn: string; cn: string }[] | undefined;
if (ldapConfig.groupSearchBase) {
const groupFilter = "(|(memberUid={{.Username}})(member={{.UserDN}})(uniqueMember={{.UserDN}}))";
const groupSearchFilter = (ldapConfig.groupSearchFilter || groupFilter)
.replace(/{{\.Username}}/g, user.uid)
.replace(/{{\.UserDN}}/g, user.dn);
if (!isValidLdapFilter(groupSearchFilter)) {
throw new Error("Generated LDAP search filter is invalid.");
}
groups = await searchGroups(ldapConfig, groupSearchFilter, ldapConfig.groupSearchBase);
}
const { isUserCompleted, providerAuthToken } = await server.services.ldap.ldapLogin({
ldapConfigId: ldapConfig.id,
externalId: user.uidNumber,
username: user.uid,
firstName: user.givenName,
lastName: user.sn,
firstName: user.givenName ?? user.cn ?? "",
lastName: user.sn ?? "",
emails: user.mail ? [user.mail] : [],
groups,
relayState: ((req as unknown as FastifyRequest).body as { RelayState?: string }).RelayState,
orgId: (req as unknown as FastifyRequest).ldapConfig.organization
});
return cb(null, { isUserCompleted, providerAuthToken });
} catch (err) {
logger.error(err);
return cb(err, false);
} catch (error) {
logger.error(error);
return cb(error, false);
}
}
)
@@ -117,6 +137,9 @@ export const registerLdapRouter = async (server: FastifyZodProvider) => {
bindDN: z.string(),
bindPass: z.string(),
searchBase: z.string(),
searchFilter: z.string(),
groupSearchBase: z.string(),
groupSearchFilter: z.string(),
caCert: z.string()
})
}
@@ -148,6 +171,12 @@ export const registerLdapRouter = async (server: FastifyZodProvider) => {
bindDN: z.string().trim(),
bindPass: z.string().trim(),
searchBase: z.string().trim(),
searchFilter: z.string().trim().default("(uid={{username}})"),
groupSearchBase: z.string().trim(),
groupSearchFilter: z
.string()
.trim()
.default("(|(memberUid={{.Username}})(member={{.UserDN}})(uniqueMember={{.UserDN}}))"),
caCert: z.string().trim().default("")
}),
response: {
@@ -183,6 +212,9 @@ export const registerLdapRouter = async (server: FastifyZodProvider) => {
bindDN: z.string().trim(),
bindPass: z.string().trim(),
searchBase: z.string().trim(),
searchFilter: z.string().trim(),
groupSearchBase: z.string().trim(),
groupSearchFilter: z.string().trim(),
caCert: z.string().trim()
})
.partial()
@@ -204,4 +236,134 @@ export const registerLdapRouter = async (server: FastifyZodProvider) => {
return ldap;
}
});
server.route({
method: "GET",
url: "/config/:configId/group-maps",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
params: z.object({
configId: z.string().trim()
}),
response: {
200: z.array(
z.object({
id: z.string(),
ldapConfigId: z.string(),
ldapGroupCN: z.string(),
group: z.object({
id: z.string(),
name: z.string(),
slug: z.string()
})
})
)
}
},
handler: async (req) => {
const ldapGroupMaps = await server.services.ldap.getLdapGroupMaps({
actor: req.permission.type,
actorId: req.permission.id,
orgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
ldapConfigId: req.params.configId
});
return ldapGroupMaps;
}
});
server.route({
method: "POST",
url: "/config/:configId/group-maps",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
params: z.object({
configId: z.string().trim()
}),
body: z.object({
ldapGroupCN: z.string().trim(),
groupSlug: z.string().trim()
}),
response: {
200: LdapGroupMapsSchema
}
},
handler: async (req) => {
const ldapGroupMap = await server.services.ldap.createLdapGroupMap({
actor: req.permission.type,
actorId: req.permission.id,
orgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
ldapConfigId: req.params.configId,
...req.body
});
return ldapGroupMap;
}
});
server.route({
method: "DELETE",
url: "/config/:configId/group-maps/:groupMapId",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
params: z.object({
configId: z.string().trim(),
groupMapId: z.string().trim()
}),
response: {
200: LdapGroupMapsSchema
}
},
handler: async (req) => {
const ldapGroupMap = await server.services.ldap.deleteLdapGroupMap({
actor: req.permission.type,
actorId: req.permission.id,
orgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
ldapConfigId: req.params.configId,
ldapGroupMapId: req.params.groupMapId
});
return ldapGroupMap;
}
});
server.route({
method: "POST",
url: "/config/:configId/test-connection",
config: {
rateLimit: readLimit
},
onRequest: verifyAuth([AuthMode.JWT]),
schema: {
params: z.object({
configId: z.string().trim()
}),
response: {
200: z.boolean()
}
},
handler: async (req) => {
const result = await server.services.ldap.testLDAPConnection({
actor: req.permission.type,
actorId: req.permission.id,
orgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
ldapConfigId: req.params.configId
});
return result;
}
});
};
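
For reference, a hedged sketch of how the new group-map route above might be called from a script. The diff only defines the route as /config/:configId/group-maps on the LDAP router, so the full mount path below is an assumption, as are the config ID, group slug, and JWT placeholders; the body and response shapes come from the route schema.

const LDAP_API = "https://app.infisical.com/api/v1/ldap"; // assumption: actual router prefix may differ
const JWT = process.env.INFISICAL_JWT ?? "";              // assumption: a JWT with org admin access

async function mapLdapGroup(configId: string) {
  // POST /config/:configId/group-maps expects { ldapGroupCN, groupSlug } per the route schema
  const res = await fetch(`${LDAP_API}/config/${configId}/group-maps`, {
    method: "POST",
    headers: { authorization: `Bearer ${JWT}`, "content-type": "application/json" },
    body: JSON.stringify({
      ldapGroupCN: "engineering", // CN of the LDAP group (placeholder)
      groupSlug: "engineering"    // slug of an existing Infisical group (placeholder)
    })
  });
  if (!res.ok) throw new Error(`Failed to create LDAP group map: ${res.status}`);
  // Response matches LdapGroupMapsSchema: { id, ldapConfigId, ldapGroupCN, groupId }
  return res.json();
}

The companion GET and DELETE routes list and remove mappings for a config, and POST /config/:configId/test-connection returns a boolean, which backs the new Test Connection capability.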

View File

@@ -157,7 +157,13 @@ export const registerProjectRoleRouter = async (server: FastifyZodProvider) => {
response: {
200: z.object({
data: z.object({
membership: ProjectMembershipsSchema,
membership: ProjectMembershipsSchema.extend({
roles: z
.object({
role: z.string()
})
.array()
}),
permissions: z.any().array()
})
})

View File

@@ -289,14 +289,28 @@ export const registerScimRouter = async (server: FastifyZodProvider) => {
body: z.object({
schemas: z.array(z.string()),
displayName: z.string().trim(),
members: z.array(z.any()).length(0).optional() // okta-specific
members: z
.array(
z.object({
value: z.string(),
display: z.string()
})
)
.optional() // okta-specific
}),
response: {
200: z.object({
schemas: z.array(z.string()),
id: z.string().trim(),
displayName: z.string().trim(),
members: z.array(z.any()).length(0),
members: z
.array(
z.object({
value: z.string(),
display: z.string()
})
)
.optional(),
meta: z.object({
resourceType: z.string().trim()
})
@@ -306,8 +320,8 @@ export const registerScimRouter = async (server: FastifyZodProvider) => {
onRequest: verifyAuth([AuthMode.SCIM_TOKEN]),
handler: async (req) => {
const group = await req.server.services.scim.createScimGroup({
displayName: req.body.displayName,
orgId: req.permission.orgId
orgId: req.permission.orgId,
...req.body
});
return group;
@@ -400,7 +414,12 @@ export const registerScimRouter = async (server: FastifyZodProvider) => {
schemas: z.array(z.string()),
id: z.string().trim(),
displayName: z.string().trim(),
members: z.array(z.any()).length(0)
members: z.array(
z.object({
value: z.string(), // infisical userId
display: z.string()
})
) // note: is this where members are added to group?
}),
response: {
200: z.object({
@@ -424,7 +443,7 @@ export const registerScimRouter = async (server: FastifyZodProvider) => {
const group = await req.server.services.scim.updateScimGroupNamePut({
groupId: req.params.groupId,
orgId: req.permission.orgId,
displayName: req.body.displayName
...req.body
});
return group;
@@ -482,8 +501,6 @@ export const registerScimRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.SCIM_TOKEN]),
handler: async (req) => {
// console.log("PATCH /Groups/:groupId req.body: ", req.body);
// console.log("PATCH /Groups/:groupId req.body: ", req.body.Operations[0]);
const group = await req.server.services.scim.updateScimGroupNamePatch({
groupId: req.params.groupId,
orgId: req.permission.orgId,

View File

@@ -59,32 +59,6 @@ export const groupDALFactory = (db: TDbClient) => {
}
};
const countAllGroupMembers = async ({ orgId, groupId }: { orgId: string; groupId: string }) => {
try {
interface CountResult {
count: string;
}
const doc = await db<CountResult>(TableName.OrgMembership)
.where(`${TableName.OrgMembership}.orgId`, orgId)
.join(TableName.Users, `${TableName.OrgMembership}.userId`, `${TableName.Users}.id`)
.leftJoin(TableName.UserGroupMembership, function () {
this.on(`${TableName.UserGroupMembership}.userId`, "=", `${TableName.Users}.id`).andOn(
`${TableName.UserGroupMembership}.groupId`,
"=",
db.raw("?", [groupId])
);
})
.where({ isGhost: false })
.count(`${TableName.Users}.id`)
.first();
return parseInt((doc?.count as string) || "0", 10);
} catch (err) {
throw new DatabaseError({ error: err, name: "Count all group members" });
}
};
// special query
const findAllGroupMembers = async ({
orgId,
@@ -150,7 +124,6 @@ export const groupDALFactory = (db: TDbClient) => {
return {
findGroups,
findByOrgId,
countAllGroupMembers,
findAllGroupMembers,
...groupOrm
};

View File

@@ -0,0 +1,450 @@
import { Knex } from "knex";
import { SecretKeyEncoding, TUsers } from "@app/db/schemas";
import { decryptAsymmetric, encryptAsymmetric, infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { BadRequestError, ScimRequestError } from "@app/lib/errors";
import {
TAddUsersToGroup,
TAddUsersToGroupByUserIds,
TConvertPendingGroupAdditionsToGroupMemberships,
TRemoveUsersFromGroupByUserIds
} from "./group-types";
const addAcceptedUsersToGroup = async ({
userIds,
group,
userGroupMembershipDAL,
userDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
}: TAddUsersToGroup) => {
const users = await userDAL.findUserEncKeyByUserIdsBatch(
{
userIds
},
tx
);
await userGroupMembershipDAL.insertMany(
users.map((user) => ({
userId: user.userId,
groupId: group.id,
isPending: false
})),
tx
);
// check which projects the group is part of
const projectIds = Array.from(
new Set(
(
await groupProjectDAL.find(
{
groupId: group.id
},
{ tx }
)
).map((gp) => gp.projectId)
)
);
const keys = await projectKeyDAL.find(
{
$in: {
projectId: projectIds,
receiverId: users.map((u) => u.id)
}
},
{ tx }
);
const userKeysSet = new Set(keys.map((k) => `${k.projectId}-${k.receiverId}`));
for await (const projectId of projectIds) {
const usersToAddProjectKeyFor = users.filter((u) => !userKeysSet.has(`${projectId}-${u.userId}`));
if (usersToAddProjectKeyFor.length) {
// there are users who need to be shared keys
// process adding bulk users to projects for each project individually
const ghostUser = await projectDAL.findProjectGhostUser(projectId, tx);
if (!ghostUser) {
throw new BadRequestError({
message: "Failed to find sudo user"
});
}
const ghostUserLatestKey = await projectKeyDAL.findLatestProjectKey(ghostUser.id, projectId, tx);
if (!ghostUserLatestKey) {
throw new BadRequestError({
message: "Failed to find sudo user latest key"
});
}
const bot = await projectBotDAL.findOne({ projectId }, tx);
if (!bot) {
throw new BadRequestError({
message: "Failed to find bot"
});
}
const botPrivateKey = infisicalSymmetricDecrypt({
keyEncoding: bot.keyEncoding as SecretKeyEncoding,
iv: bot.iv,
tag: bot.tag,
ciphertext: bot.encryptedPrivateKey
});
const plaintextProjectKey = decryptAsymmetric({
ciphertext: ghostUserLatestKey.encryptedKey,
nonce: ghostUserLatestKey.nonce,
publicKey: ghostUserLatestKey.sender.publicKey,
privateKey: botPrivateKey
});
const projectKeysToAdd = usersToAddProjectKeyFor.map((user) => {
const { ciphertext: encryptedKey, nonce } = encryptAsymmetric(
plaintextProjectKey,
user.publicKey,
botPrivateKey
);
return {
encryptedKey,
nonce,
senderId: ghostUser.id,
receiverId: user.userId,
projectId
};
});
await projectKeyDAL.insertMany(projectKeysToAdd, tx);
}
}
};
/**
* Add users with user ids [userIds] to group [group].
* - Users may or may not have finished completing their accounts; this function will
* handle both adding users to groups directly and via pending group additions.
* @param {group} group - group to add user(s) to
* @param {string[]} userIds - id(s) of user(s) to add to group
*/
export const addUsersToGroupByUserIds = async ({
group,
userIds,
userDAL,
userGroupMembershipDAL,
orgDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx: outerTx
}: TAddUsersToGroupByUserIds) => {
const processAddition = async (tx: Knex) => {
const foundMembers = await userDAL.find(
{
$in: {
id: userIds
}
},
{ tx }
);
const foundMembersIdsSet = new Set(foundMembers.map((member) => member.id));
const isCompleteMatch = userIds.every((userId) => foundMembersIdsSet.has(userId));
if (!isCompleteMatch) {
throw new ScimRequestError({
detail: "Members not found",
status: 404
});
}
// check if user(s) group membership(s) already exists
const existingUserGroupMemberships = await userGroupMembershipDAL.find(
{
groupId: group.id,
$in: {
userId: userIds
}
},
{ tx }
);
if (existingUserGroupMemberships.length) {
throw new BadRequestError({
message: `User(s) are already part of the group ${group.slug}`
});
}
// check if all user(s) are part of the organization
const existingUserOrgMemberships = await orgDAL.findMembership(
{
orgId: group.orgId,
$in: {
userId: userIds
}
},
{ tx }
);
const existingUserOrgMembershipsUserIdsSet = new Set(existingUserOrgMemberships.map((u) => u.userId));
userIds.forEach((userId) => {
if (!existingUserOrgMembershipsUserIdsSet.has(userId))
throw new BadRequestError({
message: `User with id ${userId} is not part of the organization`
});
});
const membersToAddToGroupNonPending: TUsers[] = [];
const membersToAddToGroupPending: TUsers[] = [];
foundMembers.forEach((member) => {
if (member.isAccepted) {
// add accepted member to group
membersToAddToGroupNonPending.push(member);
} else {
// add incomplete member to pending group addition
membersToAddToGroupPending.push(member);
}
});
if (membersToAddToGroupNonPending.length) {
await addAcceptedUsersToGroup({
userIds: membersToAddToGroupNonPending.map((member) => member.id),
group,
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
});
}
if (membersToAddToGroupPending.length) {
await userGroupMembershipDAL.insertMany(
membersToAddToGroupPending.map((member) => ({
userId: member.id,
groupId: group.id,
isPending: true
})),
tx
);
}
return membersToAddToGroupNonPending.concat(membersToAddToGroupPending);
};
if (outerTx) {
return processAddition(outerTx);
}
return userDAL.transaction(async (tx) => {
return processAddition(tx);
});
};
/**
* Remove users with user ids [userIds] from group [group].
* - Users may be part of the group (non-pending + pending);
* this function will handle both cases.
* @param {group} group - group to remove user(s) from
* @param {string[]} userIds - id(s) of user(s) to remove from group
*/
export const removeUsersFromGroupByUserIds = async ({
group,
userIds,
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
tx: outerTx
}: TRemoveUsersFromGroupByUserIds) => {
const processRemoval = async (tx: Knex) => {
const foundMembers = await userDAL.find({
$in: {
id: userIds
}
});
const foundMembersIdsSet = new Set(foundMembers.map((member) => member.id));
const isCompleteMatch = userIds.every((userId) => foundMembersIdsSet.has(userId));
if (!isCompleteMatch) {
throw new ScimRequestError({
detail: "Members not found",
status: 404
});
}
// check if user group membership already exists
const existingUserGroupMemberships = await userGroupMembershipDAL.find(
{
groupId: group.id,
$in: {
userId: userIds
}
},
{ tx }
);
const existingUserGroupMembershipsUserIdsSet = new Set(existingUserGroupMemberships.map((u) => u.userId));
userIds.forEach((userId) => {
if (!existingUserGroupMembershipsUserIdsSet.has(userId))
throw new BadRequestError({
message: `User(s) are not part of the group ${group.slug}`
});
});
const membersToRemoveFromGroupNonPending: TUsers[] = [];
const membersToRemoveFromGroupPending: TUsers[] = [];
foundMembers.forEach((member) => {
if (member.isAccepted) {
// remove accepted member from group
membersToRemoveFromGroupNonPending.push(member);
} else {
// remove incomplete member from pending group addition
membersToRemoveFromGroupPending.push(member);
}
});
if (membersToRemoveFromGroupNonPending.length) {
// check which projects the group is part of
const projectIds = Array.from(
new Set(
(
await groupProjectDAL.find(
{
groupId: group.id
},
{ tx }
)
).map((gp) => gp.projectId)
)
);
// TODO: this part can be optimized
for await (const userId of userIds) {
const t = await userGroupMembershipDAL.filterProjectsByUserMembership(userId, group.id, projectIds, tx);
const projectsToDeleteKeyFor = projectIds.filter((p) => !t.has(p));
if (projectsToDeleteKeyFor.length) {
await projectKeyDAL.delete(
{
receiverId: userId,
$in: {
projectId: projectsToDeleteKeyFor
}
},
tx
);
}
await userGroupMembershipDAL.delete(
{
groupId: group.id,
userId
},
tx
);
}
}
if (membersToRemoveFromGroupPending.length) {
await userGroupMembershipDAL.delete({
groupId: group.id,
$in: {
userId: membersToRemoveFromGroupPending.map((member) => member.id)
}
});
}
return membersToRemoveFromGroupNonPending.concat(membersToRemoveFromGroupPending);
};
if (outerTx) {
return processRemoval(outerTx);
}
return userDAL.transaction(async (tx) => {
return processRemoval(tx);
});
};
/**
* Convert pending group additions for users with ids [userIds] to group memberships.
 * @param {string[]} userIds - id(s) of user(s) whose pending group additions should be converted to group memberships
*/
export const convertPendingGroupAdditionsToGroupMemberships = async ({
userIds,
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx: outerTx
}: TConvertPendingGroupAdditionsToGroupMemberships) => {
const processConversion = async (tx: Knex) => {
const users = await userDAL.find(
{
$in: {
id: userIds
}
},
{ tx }
);
const usersUserIdsSet = new Set(users.map((u) => u.id));
userIds.forEach((userId) => {
if (!usersUserIdsSet.has(userId)) {
throw new BadRequestError({
message: `Failed to find user with id ${userId}`
});
}
});
users.forEach((user) => {
if (!user.isAccepted) {
throw new BadRequestError({
message: `Failed to convert pending group additions to group memberships for user ${user.username} because they have not confirmed their account`
});
}
});
const pendingGroupAdditions = await userGroupMembershipDAL.deletePendingUserGroupMembershipsByUserIds(userIds, tx);
for await (const pendingGroupAddition of pendingGroupAdditions) {
await addAcceptedUsersToGroup({
userIds: [pendingGroupAddition.user.id],
group: pendingGroupAddition.group,
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
});
}
};
if (outerTx) {
return processConversion(outerTx);
}
return userDAL.transaction(async (tx) => {
await processConversion(tx);
});
};
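// Example (sketch): how a caller might convert a user's pending group additions once their
// account is completed. The DAL dependencies are assumed to be injected by the calling service
// factory, as they are elsewhere in this diff; this helper is illustrative only.
const convertPendingAdditionsForUser = async (
  userId: string,
  deps: Omit<TConvertPendingGroupAdditionsToGroupMemberships, "userIds" | "tx">
) =>
  convertPendingGroupAdditionsToGroupMemberships({
    ...deps,
    userIds: [userId]
    // tx omitted: the function opens its own transaction via userDAL.transaction
  });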

View File

@@ -1,22 +1,22 @@
import { ForbiddenError } from "@casl/ability";
import slugify from "@sindresorhus/slugify";
import { OrgMembershipRole, SecretKeyEncoding, TOrgRoles } from "@app/db/schemas";
import { OrgMembershipRole, TOrgRoles } from "@app/db/schemas";
import { isAtLeastAsPrivileged } from "@app/lib/casl";
import { decryptAsymmetric, encryptAsymmetric, infisicalSymmetricDecrypt } from "@app/lib/crypto/encryption";
import { BadRequestError, ForbiddenRequestError } from "@app/lib/errors";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TOrgDALFactory } from "@app/services/org/org-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
import { TUserDALFactory } from "@app/services/user/user-dal";
import { TGroupProjectDALFactory } from "../../../services/group-project/group-project-dal";
import { TOrgDALFactory } from "../../../services/org/org-dal";
import { TProjectDALFactory } from "../../../services/project/project-dal";
import { TProjectBotDALFactory } from "../../../services/project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "../../../services/project-key/project-key-dal";
import { TUserDALFactory } from "../../../services/user/user-dal";
import { TLicenseServiceFactory } from "../license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "../permission/org-permission";
import { TPermissionServiceFactory } from "../permission/permission-service";
import { TGroupDALFactory } from "./group-dal";
import { addUsersToGroupByUserIds, removeUsersFromGroupByUserIds } from "./group-fns";
import {
TAddUserToGroupDTO,
TCreateGroupDTO,
@@ -28,20 +28,17 @@ import {
import { TUserGroupMembershipDALFactory } from "./user-group-membership-dal";
type TGroupServiceFactoryDep = {
userDAL: Pick<TUserDALFactory, "findOne" | "findUserEncKeyByUsername">;
groupDAL: Pick<
TGroupDALFactory,
"create" | "findOne" | "update" | "delete" | "findAllGroupMembers" | "countAllGroupMembers"
>;
userDAL: Pick<TUserDALFactory, "find" | "findUserEncKeyByUserIdsBatch" | "transaction" | "findOne">;
groupDAL: Pick<TGroupDALFactory, "create" | "findOne" | "update" | "delete" | "findAllGroupMembers">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
orgDAL: Pick<TOrgDALFactory, "findMembership">;
orgDAL: Pick<TOrgDALFactory, "findMembership" | "countAllOrgMembers">;
userGroupMembershipDAL: Pick<
TUserGroupMembershipDALFactory,
"findOne" | "create" | "delete" | "filterProjectsByUserMembership"
"findOne" | "delete" | "filterProjectsByUserMembership" | "transaction" | "insertMany" | "find"
>;
projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "create" | "delete" | "findLatestProjectKey">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "delete" | "findLatestProjectKey" | "insertMany">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission" | "getOrgPermissionByRole">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
};
@@ -227,12 +224,9 @@ export const groupServiceFactory = ({
username
});
const totalCount = await groupDAL.countAllGroupMembers({
orgId: group.orgId,
groupId: group.id
});
const count = await orgDAL.countAllOrgMembers(group.orgId);
return { users, totalCount };
return { users, totalCount: count };
};
const addUserToGroup = async ({
@@ -272,111 +266,22 @@ export const groupServiceFactory = ({
if (!hasRequiredPriviledges)
throw new ForbiddenRequestError({ message: "Failed to add user to more privileged group" });
// get user with username
const user = await userDAL.findUserEncKeyByUsername({
username
const user = await userDAL.findOne({ username });
if (!user) throw new BadRequestError({ message: `Failed to find user with username ${username}` });
const users = await addUsersToGroupByUserIds({
group,
userIds: [user.id],
userDAL,
userGroupMembershipDAL,
orgDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL
});
if (!user)
throw new BadRequestError({
message: `Failed to find user with username ${username}`
});
// check if user group membership already exists
const existingUserGroupMembership = await userGroupMembershipDAL.findOne({
groupId: group.id,
userId: user.userId
});
if (existingUserGroupMembership)
throw new BadRequestError({
message: `User ${username} is already part of the group ${groupSlug}`
});
// check if user is even part of the organization
const existingUserOrgMembership = await orgDAL.findMembership({
userId: user.userId,
orgId: actorOrgId
});
if (!existingUserOrgMembership)
throw new BadRequestError({
message: `User ${username} is not part of the organization`
});
await userGroupMembershipDAL.create({
userId: user.userId,
groupId: group.id
});
// check which projects the group is part of
const projectIds = (
await groupProjectDAL.find({
groupId: group.id
})
).map((gp) => gp.projectId);
const keys = await projectKeyDAL.find({
receiverId: user.userId,
$in: {
projectId: projectIds
}
});
const keysSet = new Set(keys.map((k) => k.projectId));
const projectsToAddKeyFor = projectIds.filter((p) => !keysSet.has(p));
for await (const projectId of projectsToAddKeyFor) {
const ghostUser = await projectDAL.findProjectGhostUser(projectId);
if (!ghostUser) {
throw new BadRequestError({
message: "Failed to find sudo user"
});
}
const ghostUserLatestKey = await projectKeyDAL.findLatestProjectKey(ghostUser.id, projectId);
if (!ghostUserLatestKey) {
throw new BadRequestError({
message: "Failed to find sudo user latest key"
});
}
const bot = await projectBotDAL.findOne({ projectId });
if (!bot) {
throw new BadRequestError({
message: "Failed to find bot"
});
}
const botPrivateKey = infisicalSymmetricDecrypt({
keyEncoding: bot.keyEncoding as SecretKeyEncoding,
iv: bot.iv,
tag: bot.tag,
ciphertext: bot.encryptedPrivateKey
});
const plaintextProjectKey = decryptAsymmetric({
ciphertext: ghostUserLatestKey.encryptedKey,
nonce: ghostUserLatestKey.nonce,
publicKey: ghostUserLatestKey.sender.publicKey,
privateKey: botPrivateKey
});
const { ciphertext: encryptedKey, nonce } = encryptAsymmetric(plaintextProjectKey, user.publicKey, botPrivateKey);
await projectKeyDAL.create({
encryptedKey,
nonce,
senderId: ghostUser.id,
receiverId: user.userId,
projectId
});
}
return user;
return users[0];
};
const removeUserFromGroup = async ({
@@ -416,51 +321,19 @@ export const groupServiceFactory = ({
if (!hasRequiredPriviledges)
throw new ForbiddenRequestError({ message: "Failed to delete user from more privileged group" });
const user = await userDAL.findOne({
username
const user = await userDAL.findOne({ username });
if (!user) throw new BadRequestError({ message: `Failed to find user with username ${username}` });
const users = await removeUsersFromGroupByUserIds({
group,
userIds: [user.id],
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL
});
if (!user)
throw new BadRequestError({
message: `Failed to find user with username ${username}`
});
// check if user group membership already exists
const existingUserGroupMembership = await userGroupMembershipDAL.findOne({
groupId: group.id,
userId: user.id
});
if (!existingUserGroupMembership)
throw new BadRequestError({
message: `User ${username} is not part of the group ${groupSlug}`
});
const projectIds = (
await groupProjectDAL.find({
groupId: group.id
})
).map((gp) => gp.projectId);
const t = await userGroupMembershipDAL.filterProjectsByUserMembership(user.id, group.id, projectIds);
const projectsToDeleteKeyFor = projectIds.filter((p) => !t.has(p));
if (projectsToDeleteKeyFor.length) {
await projectKeyDAL.delete({
receiverId: user.id,
$in: {
projectId: projectsToDeleteKeyFor
}
});
}
await userGroupMembershipDAL.delete({
groupId: group.id,
userId: user.id
});
return user;
return users[0];
};
return {

View File

@@ -1,4 +1,14 @@
import { Knex } from "knex";
import { TGroups } from "@app/db/schemas";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
import { TGenericPermission } from "@app/lib/types";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TOrgDALFactory } from "@app/services/org/org-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
import { TUserDALFactory } from "@app/services/user/user-dal";
export type TCreateGroupDTO = {
name: string;
@@ -35,3 +45,54 @@ export type TRemoveUserFromGroupDTO = {
groupSlug: string;
username: string;
} & TGenericPermission;
// group fns types
export type TAddUsersToGroup = {
userIds: string[];
group: TGroups;
userDAL: Pick<TUserDALFactory, "findUserEncKeyByUserIdsBatch">;
userGroupMembershipDAL: Pick<TUserGroupMembershipDALFactory, "find" | "transaction" | "insertMany">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany">;
projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
tx: Knex;
};
export type TAddUsersToGroupByUserIds = {
group: TGroups;
userIds: string[];
userDAL: Pick<TUserDALFactory, "find" | "findUserEncKeyByUserIdsBatch" | "transaction">;
userGroupMembershipDAL: Pick<TUserGroupMembershipDALFactory, "find" | "transaction" | "insertMany">;
orgDAL: Pick<TOrgDALFactory, "findMembership">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany">;
projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
tx?: Knex;
};
export type TRemoveUsersFromGroupByUserIds = {
group: TGroups;
userIds: string[];
userDAL: Pick<TUserDALFactory, "find" | "transaction">;
userGroupMembershipDAL: Pick<TUserGroupMembershipDALFactory, "find" | "filterProjectsByUserMembership" | "delete">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "delete">;
tx?: Knex;
};
export type TConvertPendingGroupAdditionsToGroupMemberships = {
userIds: string[];
userDAL: Pick<TUserDALFactory, "findUserEncKeyByUserIdsBatch" | "transaction" | "find" | "findById">;
userGroupMembershipDAL: Pick<
TUserGroupMembershipDALFactory,
"find" | "transaction" | "insertMany" | "deletePendingUserGroupMembershipsByUserIds"
>;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany">;
projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
tx?: Knex;
};

View File

@@ -1,3 +1,5 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName, TUserEncryptionKeys } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
@@ -14,24 +16,28 @@ export const userGroupMembershipDALFactory = (db: TDbClient) => {
* - The user is a member of a group that is a member of the project, excluding projects that they are part of
* through the group with id [groupId].
*/
const filterProjectsByUserMembership = async (userId: string, groupId: string, projectIds: string[]) => {
const userProjectMemberships: string[] = await db(TableName.ProjectMembership)
.where(`${TableName.ProjectMembership}.userId`, userId)
.whereIn(`${TableName.ProjectMembership}.projectId`, projectIds)
.pluck(`${TableName.ProjectMembership}.projectId`);
const filterProjectsByUserMembership = async (userId: string, groupId: string, projectIds: string[], tx?: Knex) => {
try {
const userProjectMemberships: string[] = await (tx || db)(TableName.ProjectMembership)
.where(`${TableName.ProjectMembership}.userId`, userId)
.whereIn(`${TableName.ProjectMembership}.projectId`, projectIds)
.pluck(`${TableName.ProjectMembership}.projectId`);
const userGroupMemberships: string[] = await db(TableName.UserGroupMembership)
.where(`${TableName.UserGroupMembership}.userId`, userId)
.whereNot(`${TableName.UserGroupMembership}.groupId`, groupId)
.join(
TableName.GroupProjectMembership,
`${TableName.UserGroupMembership}.groupId`,
`${TableName.GroupProjectMembership}.groupId`
)
.whereIn(`${TableName.GroupProjectMembership}.projectId`, projectIds)
.pluck(`${TableName.GroupProjectMembership}.projectId`);
const userGroupMemberships: string[] = await (tx || db)(TableName.UserGroupMembership)
.where(`${TableName.UserGroupMembership}.userId`, userId)
.whereNot(`${TableName.UserGroupMembership}.groupId`, groupId)
.join(
TableName.GroupProjectMembership,
`${TableName.UserGroupMembership}.groupId`,
`${TableName.GroupProjectMembership}.groupId`
)
.whereIn(`${TableName.GroupProjectMembership}.projectId`, projectIds)
.pluck(`${TableName.GroupProjectMembership}.projectId`);
return new Set(userProjectMemberships.concat(userGroupMemberships));
return new Set(userProjectMemberships.concat(userGroupMemberships));
} catch (error) {
throw new DatabaseError({ error, name: "Filter projects by user membership" });
}
};
// special query
@@ -45,7 +51,7 @@ export const userGroupMembershipDALFactory = (db: TDbClient) => {
)
.join(TableName.Users, `${TableName.UserGroupMembership}.userId`, `${TableName.Users}.id`)
.where(`${TableName.GroupProjectMembership}.projectId`, projectId)
.whereIn(`${TableName.Users}.username`, usernames) // TODO: pluck usernames
.whereIn(`${TableName.Users}.username`, usernames)
.pluck(`${TableName.Users}.id`);
return usernameDocs;
@@ -55,7 +61,7 @@ export const userGroupMembershipDALFactory = (db: TDbClient) => {
};
/**
* Return list of users that are part of the group with id [groupId]
* Return list of completed/accepted users that are part of the group with id [groupId]
* that have not yet been added individually to project with id [projectId].
*
* Note: Filters out users that are part of other groups in the project.
@@ -63,18 +69,19 @@ export const userGroupMembershipDALFactory = (db: TDbClient) => {
* @param projectId
* @returns
*/
const findGroupMembersNotInProject = async (groupId: string, projectId: string) => {
const findGroupMembersNotInProject = async (groupId: string, projectId: string, tx?: Knex) => {
try {
// get list of groups in the project with id [projectId]
// that are not the group with id [groupId]
const groups: string[] = await db(TableName.GroupProjectMembership)
const groups: string[] = await (tx || db)(TableName.GroupProjectMembership)
.where(`${TableName.GroupProjectMembership}.projectId`, projectId)
.whereNot(`${TableName.GroupProjectMembership}.groupId`, groupId)
.pluck(`${TableName.GroupProjectMembership}.groupId`);
// main query
const members = await db(TableName.UserGroupMembership)
const members = await (tx || db)(TableName.UserGroupMembership)
.where(`${TableName.UserGroupMembership}.groupId`, groupId)
.where(`${TableName.UserGroupMembership}.isPending`, false)
.join(TableName.Users, `${TableName.UserGroupMembership}.userId`, `${TableName.Users}.id`)
.leftJoin(TableName.ProjectMembership, function () {
this.on(`${TableName.Users}.id`, "=", `${TableName.ProjectMembership}.userId`).andOn(
@@ -116,10 +123,49 @@ export const userGroupMembershipDALFactory = (db: TDbClient) => {
}
};
const deletePendingUserGroupMembershipsByUserIds = async (userIds: string[], tx?: Knex) => {
try {
const members = await (tx || db)(TableName.UserGroupMembership)
.whereIn(`${TableName.UserGroupMembership}.userId`, userIds)
.where(`${TableName.UserGroupMembership}.isPending`, true)
.join(TableName.Groups, `${TableName.UserGroupMembership}.groupId`, `${TableName.Groups}.id`)
.join(TableName.Users, `${TableName.UserGroupMembership}.userId`, `${TableName.Users}.id`);
await userGroupMembershipOrm.delete(
{
$in: {
userId: userIds
}
},
tx
);
return members.map(({ userId, username, groupId, orgId, name, slug, role, roleId }) => ({
user: {
id: userId,
username
},
group: {
id: groupId,
orgId,
name,
slug,
role,
roleId,
createdAt: new Date(),
updatedAt: new Date()
}
}));
} catch (error) {
throw new DatabaseError({ error, name: "Delete pending user group memberships by user ids" });
}
};
return {
...userGroupMembershipOrm,
filterProjectsByUserMembership,
findUserGroupMembershipsInProject,
findGroupMembersNotInProject
findGroupMembersNotInProject,
deletePendingUserGroupMembershipsByUserIds
};
};
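// Example (sketch): draining a user's pending memberships inside a Knex transaction. TDbClient is
// the application's Knex instance used by the factory above; the user id is a placeholder.
const drainPendingMembershipsExample = async (db: TDbClient, userId: string) => {
  const dal = userGroupMembershipDALFactory(db);
  return db.transaction(async (tx) => {
    // returns one { user, group } record per pending membership that was deleted
    return dal.deletePendingUserGroupMembershipsByUserIds([userId], tx);
  });
};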

View File

@@ -2,6 +2,9 @@ import { ForbiddenError } from "@casl/ability";
import jwt from "jsonwebtoken";
import { OrgMembershipRole, OrgMembershipStatus, SecretKeyEncoding, TLdapConfigsUpdate } from "@app/db/schemas";
import { TGroupDALFactory } from "@app/ee/services/group/group-dal";
import { addUsersToGroupByUserIds, removeUsersFromGroupByUserIds } from "@app/ee/services/group/group-fns";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
import { getConfig } from "@app/lib/config/env";
import {
decryptSymmetric,
@@ -13,8 +16,12 @@ import {
} from "@app/lib/crypto/encryption";
import { BadRequestError } from "@app/lib/errors";
import { AuthMethod, AuthTokenType } from "@app/services/auth/auth-type";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TOrgBotDALFactory } from "@app/services/org/org-bot-dal";
import { TOrgDALFactory } from "@app/services/org/org-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
import { TUserDALFactory } from "@app/services/user/user-dal";
import { normalizeUsername } from "@app/services/user/user-fns";
import { TUserAliasDALFactory } from "@app/services/user-alias/user-alias-dal";
@@ -23,16 +30,40 @@ import { TLicenseServiceFactory } from "../license/license-service";
import { OrgPermissionActions, OrgPermissionSubjects } from "../permission/org-permission";
import { TPermissionServiceFactory } from "../permission/permission-service";
import { TLdapConfigDALFactory } from "./ldap-config-dal";
import { TCreateLdapCfgDTO, TGetLdapCfgDTO, TLdapLoginDTO, TUpdateLdapCfgDTO } from "./ldap-config-types";
import {
TCreateLdapCfgDTO,
TCreateLdapGroupMapDTO,
TDeleteLdapGroupMapDTO,
TGetLdapCfgDTO,
TGetLdapGroupMapsDTO,
TLdapLoginDTO,
TTestLdapConnectionDTO,
TUpdateLdapCfgDTO
} from "./ldap-config-types";
import { testLDAPConfig } from "./ldap-fns";
import { TLdapGroupMapDALFactory } from "./ldap-group-map-dal";
type TLdapConfigServiceFactoryDep = {
ldapConfigDAL: TLdapConfigDALFactory;
ldapConfigDAL: Pick<TLdapConfigDALFactory, "create" | "update" | "findOne">;
ldapGroupMapDAL: Pick<TLdapGroupMapDALFactory, "find" | "create" | "delete" | "findLdapGroupMapsByLdapConfigId">;
orgDAL: Pick<
TOrgDALFactory,
"createMembership" | "updateMembershipById" | "findMembership" | "findOrgById" | "findOne" | "updateById"
>;
orgBotDAL: Pick<TOrgBotDALFactory, "findOne" | "create" | "transaction">;
userDAL: Pick<TUserDALFactory, "create" | "findOne" | "transaction" | "updateById">;
groupDAL: Pick<TGroupDALFactory, "find" | "findOne">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany" | "delete">;
projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
userGroupMembershipDAL: Pick<
TUserGroupMembershipDALFactory,
"find" | "transaction" | "insertMany" | "filterProjectsByUserMembership" | "delete"
>;
userDAL: Pick<
TUserDALFactory,
"create" | "findOne" | "transaction" | "updateById" | "findUserEncKeyByUserIdsBatch" | "find"
>;
userAliasDAL: Pick<TUserAliasDALFactory, "create" | "findOne">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
@@ -42,8 +73,15 @@ export type TLdapConfigServiceFactory = ReturnType<typeof ldapConfigServiceFacto
export const ldapConfigServiceFactory = ({
ldapConfigDAL,
ldapGroupMapDAL,
orgDAL,
orgBotDAL,
groupDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
userGroupMembershipDAL,
userDAL,
userAliasDAL,
permissionService,
@@ -60,6 +98,9 @@ export const ldapConfigServiceFactory = ({
bindDN,
bindPass,
searchBase,
searchFilter,
groupSearchBase,
groupSearchFilter,
caCert
}: TCreateLdapCfgDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
@@ -135,6 +176,9 @@ export const ldapConfigServiceFactory = ({
bindPassIV,
bindPassTag,
searchBase,
searchFilter,
groupSearchBase,
groupSearchFilter,
encryptedCACert,
caCertIV,
caCertTag
@@ -154,6 +198,9 @@ export const ldapConfigServiceFactory = ({
bindDN,
bindPass,
searchBase,
searchFilter,
groupSearchBase,
groupSearchFilter,
caCert
}: TUpdateLdapCfgDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
@@ -169,7 +216,10 @@ export const ldapConfigServiceFactory = ({
const updateQuery: TLdapConfigsUpdate = {
isActive,
url,
searchBase
searchBase,
searchFilter,
groupSearchBase,
groupSearchFilter
};
const orgBot = await orgBotDAL.findOne({ orgId });
@@ -271,6 +321,9 @@ export const ldapConfigServiceFactory = ({
bindDN,
bindPass,
searchBase: ldapConfig.searchBase,
searchFilter: ldapConfig.searchFilter,
groupSearchBase: ldapConfig.groupSearchBase,
groupSearchFilter: ldapConfig.groupSearchFilter,
caCert
};
};
@@ -304,8 +357,8 @@ export const ldapConfigServiceFactory = ({
bindDN: ldapConfig.bindDN,
bindCredentials: ldapConfig.bindPass,
searchBase: ldapConfig.searchBase,
searchFilter: "(uid={{username}})",
searchAttributes: ["uid", "uidNumber", "givenName", "sn", "mail"],
searchFilter: ldapConfig.searchFilter || "(uid={{username}})",
// searchAttributes: ["uid", "uidNumber", "givenName", "sn", "mail"],
...(ldapConfig.caCert !== ""
? {
tlsOptions: {
@@ -320,7 +373,17 @@ export const ldapConfigServiceFactory = ({
return { opts, ldapConfig };
};
const ldapLogin = async ({ externalId, username, firstName, lastName, emails, orgId, relayState }: TLdapLoginDTO) => {
const ldapLogin = async ({
ldapConfigId,
externalId,
username,
firstName,
lastName,
emails,
groups,
orgId,
relayState
}: TLdapLoginDTO) => {
const appCfg = getConfig();
let userAlias = await userAliasDAL.findOne({
externalId,
@@ -394,7 +457,84 @@ export const ldapConfigServiceFactory = ({
});
}
const user = await userDAL.findOne({ id: userAlias.userId });
const user = await userDAL.transaction(async (tx) => {
const newUser = await userDAL.findOne({ id: userAlias.userId }, tx);
if (groups) {
const ldapGroupIdsToBePartOf = (
await ldapGroupMapDAL.find({
ldapConfigId,
$in: {
ldapGroupCN: groups.map((group) => group.cn)
}
})
).map((groupMap) => groupMap.groupId);
const groupsToBePartOf = await groupDAL.find({
orgId,
$in: {
id: ldapGroupIdsToBePartOf
}
});
const toBePartOfGroupIdsSet = new Set(groupsToBePartOf.map((groupToBePartOf) => groupToBePartOf.id));
const allLdapGroupMaps = await ldapGroupMapDAL.find({
ldapConfigId
});
const ldapGroupIdsCurrentlyPartOf = (
await userGroupMembershipDAL.find({
userId: newUser.id,
$in: {
groupId: allLdapGroupMaps.map((groupMap) => groupMap.groupId)
}
})
).map((userGroupMembership) => userGroupMembership.groupId);
const userGroupMembershipGroupIdsSet = new Set(ldapGroupIdsCurrentlyPartOf);
for await (const group of groupsToBePartOf) {
if (!userGroupMembershipGroupIdsSet.has(group.id)) {
// add user to group that they should be part of
await addUsersToGroupByUserIds({
group,
userIds: [newUser.id],
userDAL,
userGroupMembershipDAL,
orgDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
});
}
}
const groupsCurrentlyPartOf = await groupDAL.find({
orgId,
$in: {
id: ldapGroupIdsCurrentlyPartOf
}
});
for await (const group of groupsCurrentlyPartOf) {
if (!toBePartOfGroupIdsSet.has(group.id)) {
// remove user from group that they should no longer be part of
await removeUsersFromGroupByUserIds({
group,
userIds: [newUser.id],
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
tx
});
}
}
}
return newUser;
});
const isUserCompleted = Boolean(user.isAccepted);
@@ -424,6 +564,116 @@ export const ldapConfigServiceFactory = ({
return { isUserCompleted, providerAuthToken };
};
const getLdapGroupMaps = async ({
ldapConfigId,
actor,
actorId,
orgId,
actorAuthMethod,
actorOrgId
}: TGetLdapGroupMapsDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Read, OrgPermissionSubjects.Ldap);
const ldapConfig = await ldapConfigDAL.findOne({
id: ldapConfigId,
orgId
});
if (!ldapConfig) throw new BadRequestError({ message: "Failed to find organization LDAP data" });
const groupMaps = await ldapGroupMapDAL.findLdapGroupMapsByLdapConfigId(ldapConfigId);
return groupMaps;
};
const createLdapGroupMap = async ({
ldapConfigId,
ldapGroupCN,
groupSlug,
actor,
actorId,
orgId,
actorAuthMethod,
actorOrgId
}: TCreateLdapGroupMapDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Ldap);
const plan = await licenseService.getPlan(orgId);
if (!plan.ldap)
throw new BadRequestError({
message: "Failed to create LDAP group map due to plan restriction. Upgrade plan to create LDAP group map."
});
const ldapConfig = await ldapConfigDAL.findOne({
id: ldapConfigId,
orgId
});
if (!ldapConfig) throw new BadRequestError({ message: "Failed to find organization LDAP data" });
const group = await groupDAL.findOne({ slug: groupSlug, orgId });
if (!group) throw new BadRequestError({ message: "Failed to find group" });
const groupMap = await ldapGroupMapDAL.create({
ldapConfigId,
ldapGroupCN,
groupId: group.id
});
return groupMap;
};
const deleteLdapGroupMap = async ({
ldapConfigId,
ldapGroupMapId,
actor,
actorId,
orgId,
actorAuthMethod,
actorOrgId
}: TDeleteLdapGroupMapDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Delete, OrgPermissionSubjects.Ldap);
const plan = await licenseService.getPlan(orgId);
if (!plan.ldap)
throw new BadRequestError({
message: "Failed to delete LDAP group map due to plan restriction. Upgrade plan to delete LDAP group map."
});
const ldapConfig = await ldapConfigDAL.findOne({
id: ldapConfigId,
orgId
});
if (!ldapConfig) throw new BadRequestError({ message: "Failed to find organization LDAP data" });
const [deletedGroupMap] = await ldapGroupMapDAL.delete({
ldapConfigId: ldapConfig.id,
id: ldapGroupMapId
});
return deletedGroupMap;
};
const testLDAPConnection = async ({ actor, actorId, orgId, actorAuthMethod, actorOrgId }: TTestLdapConnectionDTO) => {
const { permission } = await permissionService.getOrgPermission(actor, actorId, orgId, actorAuthMethod, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Create, OrgPermissionSubjects.Ldap);
const plan = await licenseService.getPlan(orgId);
if (!plan.ldap)
throw new BadRequestError({
message: "Failed to test LDAP connection due to plan restriction. Upgrade plan to test the LDAP connection."
});
const ldapConfig = await getLdapCfg({
orgId
});
return testLDAPConfig(ldapConfig);
};
return {
createLdapCfg,
updateLdapCfg,
@@ -431,6 +681,10 @@ export const ldapConfigServiceFactory = ({
getLdapCfg,
// getLdapPassportOpts,
ldapLogin,
bootLdap
bootLdap,
getLdapGroupMaps,
createLdapGroupMap,
deleteLdapGroupMap,
testLDAPConnection
};
};
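// Example (sketch): the group reconciliation performed inside ldapLogin above, reduced to its set
// logic. Given the group ids mapped from the incoming LDAP groups and the ids the user currently
// holds through LDAP-mapped groups, the additions and removals are the two set differences.
const diffLdapMappedGroups = (toBePartOfGroupIds: string[], currentlyPartOfGroupIds: string[]) => {
  const desired = new Set(toBePartOfGroupIds);
  const current = new Set(currentlyPartOfGroupIds);
  return {
    toAdd: toBePartOfGroupIds.filter((id) => !current.has(id)), // fed to addUsersToGroupByUserIds
    toRemove: currentlyPartOfGroupIds.filter((id) => !desired.has(id)) // fed to removeUsersFromGroupByUserIds
  };
};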

View File

@@ -1,5 +1,18 @@
import { TOrgPermission } from "@app/lib/types";
export type TLDAPConfig = {
id: string;
organization: string;
isActive: boolean;
url: string;
bindDN: string;
bindPass: string;
searchBase: string;
groupSearchBase: string;
groupSearchFilter: string;
caCert: string;
};
export type TCreateLdapCfgDTO = {
orgId: string;
isActive: boolean;
@@ -7,6 +20,9 @@ export type TCreateLdapCfgDTO = {
bindDN: string;
bindPass: string;
searchBase: string;
searchFilter: string;
groupSearchBase: string;
groupSearchFilter: string;
caCert: string;
} & TOrgPermission;
@@ -18,6 +34,9 @@ export type TUpdateLdapCfgDTO = {
bindDN: string;
bindPass: string;
searchBase: string;
searchFilter: string;
groupSearchBase: string;
groupSearchFilter: string;
caCert: string;
}> &
TOrgPermission;
@@ -27,11 +46,35 @@ export type TGetLdapCfgDTO = {
} & TOrgPermission;
export type TLdapLoginDTO = {
ldapConfigId: string;
externalId: string;
username: string;
firstName: string;
lastName: string;
emails: string[];
orgId: string;
groups?: {
dn: string;
cn: string;
}[];
relayState?: string;
};
export type TGetLdapGroupMapsDTO = {
ldapConfigId: string;
} & TOrgPermission;
export type TCreateLdapGroupMapDTO = {
ldapConfigId: string;
ldapGroupCN: string;
groupSlug: string;
} & TOrgPermission;
export type TDeleteLdapGroupMapDTO = {
ldapConfigId: string;
ldapGroupMapId: string;
} & TOrgPermission;
export type TTestLdapConnectionDTO = {
ldapConfigId: string;
} & TOrgPermission;
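// Example (sketch): the shape of a TLdapLoginDTO carrying group data from the LDAP strategy. All
// values below are placeholders for illustration only.
const exampleLdapLogin: TLdapLoginDTO = {
  ldapConfigId: "ldap-config-id",
  externalId: "uid=jdoe,ou=people,dc=example,dc=com",
  username: "jdoe",
  firstName: "Jane",
  lastName: "Doe",
  emails: ["jdoe@example.com"],
  orgId: "org-id",
  groups: [{ dn: "cn=engineering,ou=groups,dc=example,dc=com", cn: "engineering" }]
};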

View File

@@ -0,0 +1,119 @@
import ldapjs from "ldapjs";
import { logger } from "@app/lib/logger";
import { TLDAPConfig } from "./ldap-config-types";
export const isValidLdapFilter = (filter: string) => {
try {
ldapjs.parseFilter(filter);
return true;
} catch (error) {
logger.error("Invalid LDAP filter");
logger.error(error);
return false;
}
};
/**
* Test the LDAP configuration by attempting to bind to the LDAP server
* @param ldapConfig - The LDAP configuration to test
* @returns {Boolean} isConnected - Whether or not the connection was successful
*/
export const testLDAPConfig = async (ldapConfig: TLDAPConfig): Promise<boolean> => {
return new Promise((resolve) => {
const ldapClient = ldapjs.createClient({
url: ldapConfig.url,
bindDN: ldapConfig.bindDN,
bindCredentials: ldapConfig.bindPass,
...(ldapConfig.caCert !== ""
? {
tlsOptions: {
ca: [ldapConfig.caCert]
}
}
: {})
});
ldapClient.on("error", (err) => {
logger.error("LDAP client error:", err);
logger.error(err);
resolve(false);
});
ldapClient.bind(ldapConfig.bindDN, ldapConfig.bindPass, (err) => {
if (err) {
logger.error("Error binding to LDAP");
logger.error(err);
ldapClient.unbind();
resolve(false);
} else {
logger.info("Successfully connected and bound to LDAP.");
ldapClient.unbind();
resolve(true);
}
});
});
};
/**
* Search for groups in the LDAP server
* @param ldapConfig - The LDAP configuration to use
* @param filter - The filter to use when searching for groups
* @param base - The base to search from
 * @returns {Promise<{ dn: string; cn: string }[]>} groups - matching groups with their DN and CN
*/
export const searchGroups = async (
ldapConfig: TLDAPConfig,
filter: string,
base: string
): Promise<{ dn: string; cn: string }[]> => {
return new Promise((resolve, reject) => {
const ldapClient = ldapjs.createClient({
url: ldapConfig.url,
bindDN: ldapConfig.bindDN,
bindCredentials: ldapConfig.bindPass,
...(ldapConfig.caCert !== ""
? {
tlsOptions: {
ca: [ldapConfig.caCert]
}
}
: {})
});
ldapClient.search(
base,
{
filter,
scope: "sub"
},
(err, res) => {
if (err) {
ldapClient.unbind();
return reject(err);
}
const groups: { dn: string; cn: string }[] = [];
res.on("searchEntry", (entry) => {
const dn = entry.dn.toString();
const regex = /cn=([^,]+)/;
const match = dn.match(regex);
// parse the cn from the dn
const cn = (match && match[1]) as string;
groups.push({ dn, cn });
});
res.on("error", (error) => {
ldapClient.unbind();
reject(error);
});
res.on("end", () => {
ldapClient.unbind();
resolve(groups);
});
}
);
});
};
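// Example (sketch): combining the helpers above to validate a filter, test the bind, and list
// matching groups. The fallback filter is illustrative only; in the service the real values come
// from the decrypted LDAP configuration.
const probeLdapGroupsExample = async (ldapConfig: TLDAPConfig) => {
  const filter = ldapConfig.groupSearchFilter || "(objectClass=*)";
  if (!isValidLdapFilter(filter)) return [];
  const isConnected = await testLDAPConfig(ldapConfig);
  if (!isConnected) return [];
  // returns [{ dn, cn }] for every group under groupSearchBase matching the filter
  return searchGroups(ldapConfig, filter, ldapConfig.groupSearchBase);
};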

View File

@@ -0,0 +1,41 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
export type TLdapGroupMapDALFactory = ReturnType<typeof ldapGroupMapDALFactory>;
export const ldapGroupMapDALFactory = (db: TDbClient) => {
const ldapGroupMapOrm = ormify(db, TableName.LdapGroupMap);
const findLdapGroupMapsByLdapConfigId = async (ldapConfigId: string) => {
try {
const docs = await db(TableName.LdapGroupMap)
.where(`${TableName.LdapGroupMap}.ldapConfigId`, ldapConfigId)
.join(TableName.Groups, `${TableName.LdapGroupMap}.groupId`, `${TableName.Groups}.id`)
.select(selectAllTableCols(TableName.LdapGroupMap))
.select(
db.ref("id").withSchema(TableName.Groups).as("groupId"),
db.ref("name").withSchema(TableName.Groups).as("groupName"),
db.ref("slug").withSchema(TableName.Groups).as("groupSlug")
);
return docs.map((doc) => {
return {
id: doc.id,
ldapConfigId: doc.ldapConfigId,
ldapGroupCN: doc.ldapGroupCN,
group: {
id: doc.groupId,
name: doc.groupName,
slug: doc.groupSlug
}
};
});
} catch (error) {
throw new DatabaseError({ error, name: "findGroupMaps" });
}
};
return { ...ldapGroupMapOrm, findLdapGroupMapsByLdapConfigId };
};
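// Example (sketch): reading the group maps for a config through the DAL above; db is the
// application's Knex client and the config id is a placeholder.
const listGroupMapsExample = async (db: TDbClient, ldapConfigId: string) => {
  const ldapGroupMapDAL = ldapGroupMapDALFactory(db);
  // each entry carries the raw map columns plus a joined { id, name, slug } group object
  return ldapGroupMapDAL.findLdapGroupMapsByLdapConfigId(ldapConfigId);
};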

View File

@@ -340,11 +340,12 @@ export const samlConfigServiceFactory = ({
orgId,
inviteEmail: email,
role: OrgMembershipRole.Member,
status: OrgMembershipStatus.Accepted
status: user.isAccepted ? OrgMembershipStatus.Accepted : OrgMembershipStatus.Invited // if user is fully completed, then set status to accepted, otherwise set it to invited so we can update it later
},
tx
);
} else if (orgMembership.status === OrgMembershipStatus.Invited) {
// Only update the membership to Accepted if the user account is already completed.
} else if (orgMembership.status === OrgMembershipStatus.Invited && user.isAccepted) {
await orgDAL.updateMembershipById(
orgMembership.id,
{

View File

@@ -4,15 +4,20 @@ import jwt from "jsonwebtoken";
import { OrgMembershipRole, OrgMembershipStatus, TableName, TGroups } from "@app/db/schemas";
import { TGroupDALFactory } from "@app/ee/services/group/group-dal";
import { addUsersToGroupByUserIds, removeUsersFromGroupByUserIds } from "@app/ee/services/group/group-fns";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
import { TScimDALFactory } from "@app/ee/services/scim/scim-dal";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError, ScimRequestError, UnauthorizedError } from "@app/lib/errors";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { TOrgPermission } from "@app/lib/types";
import { AuthMethod, AuthTokenType } from "@app/services/auth/auth-type";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TOrgDALFactory } from "@app/services/org/org-dal";
import { deleteOrgMembership } from "@app/services/org/org-fns";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
import { TProjectMembershipDALFactory } from "@app/services/project-membership/project-membership-dal";
import { SmtpTemplates, TSmtpService } from "@app/services/smtp/smtp-service";
import { TUserDALFactory } from "@app/services/user/user-dal";
@@ -42,14 +47,21 @@ import {
type TScimServiceFactoryDep = {
scimDAL: Pick<TScimDALFactory, "create" | "find" | "findById" | "deleteById">;
userDAL: Pick<TUserDALFactory, "findOne" | "create" | "transaction">;
userDAL: Pick<TUserDALFactory, "find" | "findOne" | "create" | "transaction" | "findUserEncKeyByUserIdsBatch">;
orgDAL: Pick<
TOrgDALFactory,
"createMembership" | "findById" | "findMembership" | "deleteMembershipById" | "transaction"
>;
projectDAL: Pick<TProjectDALFactory, "find">;
projectDAL: Pick<TProjectDALFactory, "find" | "findProjectGhostUser">;
projectMembershipDAL: Pick<TProjectMembershipDALFactory, "find" | "delete">;
groupDAL: Pick<TGroupDALFactory, "create" | "findOne" | "findAllGroupMembers" | "update" | "delete" | "findGroups">;
groupDAL: Pick<
TGroupDALFactory,
"create" | "findOne" | "findAllGroupMembers" | "update" | "delete" | "findGroups" | "transaction"
>;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
userGroupMembershipDAL: TUserGroupMembershipDALFactory; // TODO: Pick
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany" | "delete">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
licenseService: Pick<TLicenseServiceFactory, "getPlan">;
permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
smtpService: TSmtpService;
@@ -65,6 +77,10 @@ export const scimServiceFactory = ({
projectDAL,
projectMembershipDAL,
groupDAL,
groupProjectDAL,
userGroupMembershipDAL,
projectKeyDAL,
projectBotDAL,
permissionService,
smtpService
}: TScimServiceFactoryDep) => {
@@ -473,7 +489,19 @@ export const scimServiceFactory = ({
};
const listScimGroups = async ({ orgId, offset, limit }: TListScimGroupsDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
throw new BadRequestError({
message: "Failed to list SCIM groups due to plan restriction. Upgrade plan to list SCIM groups."
});
const org = await orgDAL.findById(orgId);
if (!org) {
throw new ScimRequestError({
detail: "Organization Not Found",
status: 404
});
}
if (!org.scimEnabled)
throw new ScimRequestError({
@@ -500,30 +528,76 @@ export const scimServiceFactory = ({
});
};
const createScimGroup = async ({ displayName, orgId }: TCreateScimGroupDTO) => {
const createScimGroup = async ({ displayName, orgId, members }: TCreateScimGroupDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
throw new BadRequestError({
message: "Failed to create a SCIM group due to plan restriction. Upgrade plan to create a SCIM group."
});
const org = await orgDAL.findById(orgId);
if (!org) {
throw new ScimRequestError({
detail: "Organization Not Found",
status: 404
});
}
if (!org.scimEnabled)
throw new ScimRequestError({
detail: "SCIM is disabled for the organization",
status: 403
});
const group = await groupDAL.create({
name: displayName,
slug: slugify(`${displayName}-${alphaNumericNanoId(4)}`),
orgId,
role: OrgMembershipRole.NoAccess
const newGroup = await groupDAL.transaction(async (tx) => {
const group = await groupDAL.create(
{
name: displayName,
slug: slugify(`${displayName}-${alphaNumericNanoId(4)}`),
orgId,
role: OrgMembershipRole.NoAccess
},
tx
);
if (members && members.length) {
const newMembers = await addUsersToGroupByUserIds({
group,
userIds: members.map((member) => member.value),
userDAL,
userGroupMembershipDAL,
orgDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
});
return { group, newMembers };
}
return { group, newMembers: [] };
});
return buildScimGroup({
groupId: group.id,
name: group.name,
members: []
groupId: newGroup.group.id,
name: newGroup.group.name,
members: newGroup.newMembers.map((member) => ({
value: member.id,
display: `${member.firstName} ${member.lastName}`
}))
});
};
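// Example (sketch): invoking createScimGroup with initial members. The member "value" fields are
// the user ids supplied by the SCIM provider; all ids below are placeholders.
const createScimGroupExample = async () =>
  createScimGroup({
    displayName: "Engineering",
    orgId: "org-id",
    members: [{ value: "user-id-1", display: "Jane Doe" }]
  });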
const getScimGroup = async ({ groupId, orgId }: TGetScimGroupDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
throw new BadRequestError({
message: "Failed to get SCIM group due to plan restriction. Upgrade plan to get SCIM group."
});
const group = await groupDAL.findOne({
id: groupId,
orgId
@@ -553,35 +627,123 @@ export const scimServiceFactory = ({
});
};
const updateScimGroupNamePut = async ({ groupId, orgId, displayName }: TUpdateScimGroupNamePutDTO) => {
const [group] = await groupDAL.update(
{
id: groupId,
orgId
},
{
name: displayName
}
);
const updateScimGroupNamePut = async ({ groupId, orgId, displayName, members }: TUpdateScimGroupNamePutDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
throw new BadRequestError({
message: "Failed to update SCIM group due to plan restriction. Upgrade plan to update SCIM group."
});
if (!group) {
const org = await orgDAL.findById(orgId);
if (!org) {
throw new ScimRequestError({
detail: "Group Not Found",
detail: "Organization Not Found",
status: 404
});
}
if (!org.scimEnabled)
throw new ScimRequestError({
detail: "SCIM is disabled for the organization",
status: 403
});
const updatedGroup = await groupDAL.transaction(async (tx) => {
const [group] = await groupDAL.update(
{
id: groupId,
orgId
},
{
name: displayName
}
);
if (!group) {
throw new ScimRequestError({
detail: "Group Not Found",
status: 404
});
}
if (members) {
const membersIdsSet = new Set(members.map((member) => member.value));
const directMemberUserIds = (
await userGroupMembershipDAL.find({
groupId: group.id,
isPending: false
})
).map((membership) => membership.userId);
const pendingGroupAdditionsUserIds = (
await userGroupMembershipDAL.find({
groupId: group.id,
isPending: true
})
).map((pendingGroupAddition) => pendingGroupAddition.userId);
const allMembersUserIds = directMemberUserIds.concat(pendingGroupAdditionsUserIds);
const allMembersUserIdsSet = new Set(allMembersUserIds);
const toAddUserIds = members.filter((member) => !allMembersUserIdsSet.has(member.value));
const toRemoveUserIds = allMembersUserIds.filter((userId) => !membersIdsSet.has(userId));
if (toAddUserIds.length) {
await addUsersToGroupByUserIds({
group,
userIds: toAddUserIds.map((member) => member.value),
userDAL,
userGroupMembershipDAL,
orgDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
});
}
if (toRemoveUserIds.length) {
await removeUsersFromGroupByUserIds({
group,
userIds: toRemoveUserIds,
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
tx
});
}
}
return group;
});
return buildScimGroup({
groupId: group.id,
name: group.name,
members: []
groupId: updatedGroup.id,
name: updatedGroup.name,
members
});
};
// TODO: add support for add/remove op
const updateScimGroupNamePatch = async ({ groupId, orgId, operations }: TUpdateScimGroupNamePatchDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
throw new BadRequestError({
message: "Failed to update SCIM group due to plan restriction. Upgrade plan to update SCIM group."
});
const org = await orgDAL.findById(orgId);
if (!org) {
throw new ScimRequestError({
detail: "Organization Not Found",
status: 404
});
}
if (!org.scimEnabled)
throw new ScimRequestError({
detail: "SCIM is disabled for the organization",
@@ -635,6 +797,26 @@ export const scimServiceFactory = ({
};
const deleteScimGroup = async ({ groupId, orgId }: TDeleteScimGroupDTO) => {
const plan = await licenseService.getPlan(orgId);
if (!plan.groups)
throw new BadRequestError({
message: "Failed to delete SCIM group due to plan restriction. Upgrade plan to delete SCIM group."
});
const org = await orgDAL.findById(orgId);
if (!org) {
throw new ScimRequestError({
detail: "Organization Not Found",
status: 404
});
}
if (!org.scimEnabled)
throw new ScimRequestError({
detail: "SCIM is disabled for the organization",
status: 403
});
const [group] = await groupDAL.delete({
id: groupId,
orgId

View File

@@ -81,6 +81,11 @@ export type TListScimGroups = {
export type TCreateScimGroupDTO = {
displayName: string;
orgId: string;
members?: {
// TODO: account for members with value and display (is this optional?)
value: string;
display: string;
}[];
};
export type TGetScimGroupDTO = {
@@ -92,6 +97,10 @@ export type TUpdateScimGroupNamePutDTO = {
groupId: string;
orgId: string;
displayName: string;
members: {
value: string;
display: string;
}[];
};
export type TUpdateScimGroupNamePatchDTO = {

View File

@@ -495,7 +495,11 @@ export const secretApprovalRequestServiceFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, secretPath);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "GenSecretApproval" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "GenSecretApproval"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });

View File

@@ -282,6 +282,7 @@ export const RAW_SECRETS = {
},
CREATE: {
secretName: "The name of the secret to create.",
projectSlug: "The slug of the project to create the secret in.",
environment: "The slug of the environment to create the secret in.",
secretComment: "Attach a comment to the secret.",
secretPath: "The path to create the secret in.",
@@ -301,11 +302,13 @@ export const RAW_SECRETS = {
},
UPDATE: {
secretName: "The name of the secret to update.",
secretComment: "Update comment to the secret.",
environment: "The slug of the environment where the secret is located.",
secretPath: "The path of the secret to update",
secretValue: "The new value of the secret.",
skipMultilineEncoding: "Skip multiline encoding for the secret value.",
type: "The type of the secret to update.",
projectSlug: "The slug of the project to update the secret in.",
workspaceId: "The ID of the project to update the secret in."
},
DELETE: {
@@ -313,6 +316,7 @@ export const RAW_SECRETS = {
environment: "The slug of the environment where the secret is located.",
secretPath: "The path of the secret.",
type: "The type of the secret to delete.",
projectSlug: "The slug of the project to delete the secret in.",
workspaceId: "The ID of the project where the secret is located."
}
} as const;

View File

@@ -18,6 +18,7 @@ import { identityProjectAdditionalPrivilegeDALFactory } from "@app/ee/services/i
import { identityProjectAdditionalPrivilegeServiceFactory } from "@app/ee/services/identity-project-additional-privilege/identity-project-additional-privilege-service";
import { ldapConfigDALFactory } from "@app/ee/services/ldap-config/ldap-config-dal";
import { ldapConfigServiceFactory } from "@app/ee/services/ldap-config/ldap-config-service";
import { ldapGroupMapDALFactory } from "@app/ee/services/ldap-config/ldap-group-map-dal";
import { licenseDALFactory } from "@app/ee/services/license/license-dal";
import { licenseServiceFactory } from "@app/ee/services/license/license-service";
import { permissionDALFactory } from "@app/ee/services/permission/permission-dal";
@@ -200,6 +201,7 @@ export const registerRoutes = async (
const samlConfigDAL = samlConfigDALFactory(db);
const scimDAL = scimDALFactory(db);
const ldapConfigDAL = ldapConfigDALFactory(db);
const ldapGroupMapDAL = ldapGroupMapDALFactory(db);
const sapApproverDAL = secretApprovalPolicyApproverDALFactory(db);
const secretApprovalPolicyDAL = secretApprovalPolicyDALFactory(db);
const secretApprovalRequestDAL = secretApprovalRequestDALFactory(db);
@@ -290,14 +292,25 @@ export const registerRoutes = async (
projectDAL,
projectMembershipDAL,
groupDAL,
groupProjectDAL,
userGroupMembershipDAL,
projectKeyDAL,
projectBotDAL,
permissionService,
smtpService
});
const ldapService = ldapConfigServiceFactory({
ldapConfigDAL,
ldapGroupMapDAL,
orgDAL,
orgBotDAL,
groupDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
userGroupMembershipDAL,
userDAL,
userAliasDAL,
permissionService,
@@ -344,6 +357,11 @@ export const registerRoutes = async (
smtpService,
authDAL,
userDAL,
userGroupMembershipDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
groupProjectDAL,
orgDAL,
orgService,
licenseService

View File

@@ -1,3 +1,4 @@
import slugify from "@sindresorhus/slugify";
import { z } from "zod";
import { ProjectEnvironmentsSchema } from "@app/db/schemas";
@@ -26,7 +27,13 @@ export const registerProjectEnvRouter = async (server: FastifyZodProvider) => {
}),
body: z.object({
name: z.string().trim().describe(ENVIRONMENTS.CREATE.name),
slug: z.string().trim().describe(ENVIRONMENTS.CREATE.slug)
slug: z
.string()
.trim()
.refine((v) => slugify(v) === v, {
message: "Slug must be a valid slug"
})
.describe(ENVIRONMENTS.CREATE.slug)
}),
response: {
200: z.object({
@@ -84,7 +91,14 @@ export const registerProjectEnvRouter = async (server: FastifyZodProvider) => {
id: z.string().trim().describe(ENVIRONMENTS.UPDATE.id)
}),
body: z.object({
slug: z.string().trim().optional().describe(ENVIRONMENTS.UPDATE.slug),
slug: z
.string()
.trim()
.optional()
.refine((v) => !v || slugify(v) === v, {
message: "Slug must be a valid slug"
})
.describe(ENVIRONMENTS.UPDATE.slug),
name: z.string().trim().optional().describe(ENVIRONMENTS.UPDATE.name),
position: z.number().optional().describe(ENVIRONMENTS.UPDATE.position)
}),

View File

@@ -44,7 +44,6 @@ export const registerOrgRouter = async (server: FastifyZodProvider) => {
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
if (req.auth.actor !== ActorType.USER) return;
const users = await server.services.org.findAllOrgMembers(
req.permission.id,
req.params.organizationId,

View File

@@ -1656,4 +1656,263 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
return { secrets };
}
});
server.route({
method: "POST",
url: "/batch/raw",
config: {
rateLimit: secretsLimit
},
schema: {
description: "Create many secrets",
security: [
{
bearerAuth: []
}
],
body: z.object({
projectSlug: z.string().trim().describe(RAW_SECRETS.CREATE.projectSlug),
environment: z.string().trim().describe(RAW_SECRETS.CREATE.environment),
secretPath: z
.string()
.trim()
.default("/")
.transform(removeTrailingSlash)
.describe(RAW_SECRETS.CREATE.secretPath),
secrets: z
.object({
secretKey: z.string().trim().describe(RAW_SECRETS.CREATE.secretName),
secretValue: z
.string()
.transform((val) => (val.at(-1) === "\n" ? `${val.trim()}\n` : val.trim()))
.describe(RAW_SECRETS.CREATE.secretValue),
secretComment: z.string().trim().optional().default("").describe(RAW_SECRETS.CREATE.secretComment),
skipMultilineEncoding: z.boolean().optional().describe(RAW_SECRETS.CREATE.skipMultilineEncoding)
})
.array()
.min(1)
}),
response: {
200: z.object({
secrets: secretRawSchema.array()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { environment, projectSlug, secretPath, secrets: inputSecrets } = req.body;
const secrets = await server.services.secret.createManySecretsRaw({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
secretPath,
environment,
projectSlug,
secrets: inputSecrets
});
await server.services.auditLog.createAuditLog({
projectId: secrets[0].workspace,
...req.auditLogInfo,
event: {
type: EventType.CREATE_SECRETS,
metadata: {
environment: req.body.environment,
secretPath: req.body.secretPath,
secrets: secrets.map((secret, i) => ({
secretId: secret.id,
secretKey: inputSecrets[i].secretKey,
secretVersion: secret.version
}))
}
}
});
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req),
properties: {
numberOfSecrets: secrets.length,
workspaceId: secrets[0].workspace,
environment: req.body.environment,
secretPath: req.body.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
...req.auditLogInfo
}
});
return { secrets };
}
});
server.route({
method: "PATCH",
url: "/batch/raw",
config: {
rateLimit: secretsLimit
},
schema: {
description: "Update many secrets",
security: [
{
bearerAuth: []
}
],
body: z.object({
projectSlug: z.string().trim().describe(RAW_SECRETS.UPDATE.projectSlug),
environment: z.string().trim().describe(RAW_SECRETS.UPDATE.environment),
secretPath: z
.string()
.trim()
.default("/")
.transform(removeTrailingSlash)
.describe(RAW_SECRETS.UPDATE.secretPath),
secrets: z
.object({
secretKey: z.string().trim().describe(RAW_SECRETS.UPDATE.secretName),
secretValue: z
.string()
.transform((val) => (val.at(-1) === "\n" ? `${val.trim()}\n` : val.trim()))
.describe(RAW_SECRETS.UPDATE.secretValue),
secretComment: z.string().trim().optional().describe(RAW_SECRETS.UPDATE.secretComment),
skipMultilineEncoding: z.boolean().optional().describe(RAW_SECRETS.UPDATE.skipMultilineEncoding)
})
.array()
.min(1)
}),
response: {
200: z.object({
secrets: secretRawSchema.array()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { environment, projectSlug, secretPath, secrets: inputSecrets } = req.body;
const secrets = await server.services.secret.updateManySecretsRaw({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
secretPath,
environment,
projectSlug,
secrets: inputSecrets
});
await server.services.auditLog.createAuditLog({
projectId: secrets[0].workspace,
...req.auditLogInfo,
event: {
type: EventType.UPDATE_SECRETS,
metadata: {
environment: req.body.environment,
secretPath: req.body.secretPath,
secrets: secrets.map((secret, i) => ({
secretId: secret.id,
secretKey: inputSecrets[i].secretKey,
secretVersion: secret.version
}))
}
}
});
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req),
properties: {
numberOfSecrets: secrets.length,
workspaceId: secrets[0].workspace,
environment: req.body.environment,
secretPath: req.body.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
...req.auditLogInfo
}
});
return { secrets };
}
});
server.route({
method: "DELETE",
url: "/batch/raw",
config: {
rateLimit: secretsLimit
},
schema: {
description: "Delete many secrets",
security: [
{
bearerAuth: []
}
],
body: z.object({
projectSlug: z.string().trim().describe(RAW_SECRETS.DELETE.projectSlug),
environment: z.string().trim().describe(RAW_SECRETS.DELETE.environment),
secretPath: z
.string()
.trim()
.default("/")
.transform(removeTrailingSlash)
.describe(RAW_SECRETS.DELETE.secretPath),
secrets: z
.object({
secretKey: z.string().trim().describe(RAW_SECRETS.DELETE.secretName)
})
.array()
.min(1)
}),
response: {
200: z.object({
secrets: secretRawSchema.array()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const { environment, projectSlug, secretPath, secrets: inputSecrets } = req.body;
const secrets = await server.services.secret.deleteManySecretsRaw({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
environment,
projectSlug,
secretPath,
secrets: inputSecrets
});
await server.services.auditLog.createAuditLog({
projectId: secrets[0].workspace,
...req.auditLogInfo,
event: {
type: EventType.DELETE_SECRETS,
metadata: {
environment: req.body.environment,
secretPath: req.body.secretPath,
secrets: secrets.map((secret, i) => ({
secretId: secret.id,
secretKey: inputSecrets[i].secretKey,
secretVersion: secret.version
}))
}
}
});
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req),
properties: {
numberOfSecrets: secrets.length,
workspaceId: secrets[0].workspace,
environment: req.body.environment,
secretPath: req.body.secretPath,
channel: getUserAgentType(req.headers["user-agent"]),
...req.auditLogInfo
}
});
return { secrets };
}
});
};
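Not part of the diff: a minimal client-side sketch of calling the new bulk raw delete route defined above. The base URL, token, and payload values are placeholder assumptions; the path and body shape follow the route schema in this file.

// Illustrative sketch only; token, project slug, environment, and key are assumptions.
async function deleteManyRawSecrets(): Promise<unknown[]> {
  const res = await fetch("https://app.infisical.com/api/v3/secrets/batch/raw", {
    method: "DELETE",
    headers: {
      Authorization: "Bearer <access token>", // JWT, API key, service token, or identity access token
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      projectSlug: "example-project", // assumption
      environment: "dev", // assumption
      secretPath: "/", // defaults to "/" per the schema
      secrets: [{ secretKey: "TEST-SECRET-1" }]
    })
  });
  if (!res.ok) throw new Error(`bulk delete failed with status ${res.status}`);
  const { secrets } = await res.json();
  return secrets;
}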

View File

@@ -191,7 +191,7 @@ export const authLoginServiceFactory = ({
const decodedProviderToken = validateProviderAuthToken(providerAuthToken, email);
authMethod = decodedProviderToken.authMethod;
if (isAuthMethodSaml(authMethod) && decodedProviderToken.orgId) {
if ((isAuthMethodSaml(authMethod) || authMethod === AuthMethod.LDAP) && decodedProviderToken.orgId) {
organizationId = decodedProviderToken.orgId;
}
}

View File

@@ -1,10 +1,17 @@
import jwt from "jsonwebtoken";
import { OrgMembershipStatus } from "@app/db/schemas";
import { convertPendingGroupAdditionsToGroupMemberships } from "@app/ee/services/group/group-fns";
import { TUserGroupMembershipDALFactory } from "@app/ee/services/group/user-group-membership-dal";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { isAuthMethodSaml } from "@app/ee/services/permission/permission-fns";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors";
import { isDisposableEmail } from "@app/lib/validator";
import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { TProjectDALFactory } from "@app/services/project/project-dal";
import { TProjectBotDALFactory } from "@app/services/project-bot/project-bot-dal";
import { TProjectKeyDALFactory } from "@app/services/project-key/project-key-dal";
import { TAuthTokenServiceFactory } from "../auth-token/auth-token-service";
import { TokenType } from "../auth-token/auth-token-types";
@@ -20,6 +27,14 @@ import { AuthMethod, AuthTokenType } from "./auth-type";
type TAuthSignupDep = {
authDAL: TAuthDALFactory;
userDAL: TUserDALFactory;
userGroupMembershipDAL: Pick<
TUserGroupMembershipDALFactory,
"find" | "transaction" | "insertMany" | "deletePendingUserGroupMembershipsByUserIds"
>;
projectKeyDAL: Pick<TProjectKeyDALFactory, "find" | "findLatestProjectKey" | "insertMany">;
projectDAL: Pick<TProjectDALFactory, "findProjectGhostUser">;
projectBotDAL: Pick<TProjectBotDALFactory, "findOne">;
groupProjectDAL: Pick<TGroupProjectDALFactory, "find">;
orgService: Pick<TOrgServiceFactory, "createOrganization">;
orgDAL: TOrgDALFactory;
tokenService: TAuthTokenServiceFactory;
@@ -31,6 +46,11 @@ export type TAuthSignupFactory = ReturnType<typeof authSignupServiceFactory>;
export const authSignupServiceFactory = ({
authDAL,
userDAL,
userGroupMembershipDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
groupProjectDAL,
tokenService,
smtpService,
orgService,
@@ -120,9 +140,11 @@ export const authSignupServiceFactory = ({
throw new Error("Failed to complete account for complete user");
}
let organizationId;
let organizationId: string | null = null;
let authMethod: AuthMethod | null = null;
if (providerAuthToken) {
const { orgId } = validateProviderAuthToken(providerAuthToken, user.username);
const { orgId, authMethod: userAuthMethod } = validateProviderAuthToken(providerAuthToken, user.username);
authMethod = userAuthMethod;
organizationId = orgId;
} else {
validateSignUpAuthorization(authorization, user.id);
@@ -146,6 +168,26 @@ export const authSignupServiceFactory = ({
},
tx
);
// If it's SAML Auth and the organization ID is present, we should check if the user has a pending invite for this org, and accept it
if (isAuthMethodSaml(authMethod) && organizationId) {
const [pendingOrgMembership] = await orgDAL.findMembership({
inviteEmail: email,
userId: user.id,
status: OrgMembershipStatus.Invited,
orgId: organizationId
});
if (pendingOrgMembership) {
await orgDAL.updateMembershipById(
pendingOrgMembership.id,
{
status: OrgMembershipStatus.Accepted
},
tx
);
}
}
return { info: us, key: userEncKey };
});
@@ -168,6 +210,16 @@ export const authSignupServiceFactory = ({
const uniqueOrgId = [...new Set(updatedMembersips.map(({ orgId }) => orgId))];
await Promise.allSettled(uniqueOrgId.map((orgId) => licenseService.updateSubscriptionOrgMemberCount(orgId)));
await convertPendingGroupAdditionsToGroupMemberships({
userIds: [user.id],
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL
});
const tokenSession = await tokenService.getUserTokenSession({
userAgent,
ip,
@@ -270,6 +322,17 @@ export const authSignupServiceFactory = ({
const uniqueOrgId = [...new Set(updatedMembersips.map(({ orgId }) => orgId))];
await Promise.allSettled(uniqueOrgId.map((orgId) => licenseService.updateSubscriptionOrgMemberCount(orgId)));
await convertPendingGroupAdditionsToGroupMemberships({
userIds: [user.id],
userDAL,
userGroupMembershipDAL,
groupProjectDAL,
projectKeyDAL,
projectDAL,
projectBotDAL,
tx
});
return { info: us, key: userEncKey };
});

View File

@@ -32,7 +32,7 @@ type TGroupProjectServiceFactoryDep = {
TGroupProjectMembershipRoleDALFactory,
"create" | "transaction" | "insertMany" | "delete"
>;
userGroupMembershipDAL: TUserGroupMembershipDALFactory;
userGroupMembershipDAL: Pick<TUserGroupMembershipDALFactory, "findGroupMembersNotInProject">;
projectDAL: Pick<TProjectDALFactory, "findOne" | "findProjectGhostUser">;
projectKeyDAL: Pick<TProjectKeyDALFactory, "findLatestProjectKey" | "delete" | "insertMany" | "transaction">;
projectRoleDAL: Pick<TProjectRoleDALFactory, "find">;
@@ -116,68 +116,69 @@ export const groupProjectServiceFactory = ({
},
tx
);
// share project key with users in group that have not
// individually been added to the project and that are not part of
// other groups that are in the project
const groupMembers = await userGroupMembershipDAL.findGroupMembersNotInProject(group.id, project.id, tx);
if (groupMembers.length) {
const ghostUser = await projectDAL.findProjectGhostUser(project.id, tx);
if (!ghostUser) {
throw new BadRequestError({
message: "Failed to find sudo user"
});
}
const ghostUserLatestKey = await projectKeyDAL.findLatestProjectKey(ghostUser.id, project.id, tx);
if (!ghostUserLatestKey) {
throw new BadRequestError({
message: "Failed to find sudo user latest key"
});
}
const bot = await projectBotDAL.findOne({ projectId: project.id }, tx);
if (!bot) {
throw new BadRequestError({
message: "Failed to find bot"
});
}
const botPrivateKey = infisicalSymmetricDecrypt({
keyEncoding: bot.keyEncoding as SecretKeyEncoding,
iv: bot.iv,
tag: bot.tag,
ciphertext: bot.encryptedPrivateKey
});
const plaintextProjectKey = decryptAsymmetric({
ciphertext: ghostUserLatestKey.encryptedKey,
nonce: ghostUserLatestKey.nonce,
publicKey: ghostUserLatestKey.sender.publicKey,
privateKey: botPrivateKey
});
const projectKeyData = groupMembers.map(({ user: { publicKey, id } }) => {
const { ciphertext: encryptedKey, nonce } = encryptAsymmetric(plaintextProjectKey, publicKey, botPrivateKey);
return {
encryptedKey,
nonce,
senderId: ghostUser.id,
receiverId: id,
projectId: project.id
};
});
await projectKeyDAL.insertMany(projectKeyData, tx);
}
return groupProjectMembership;
});
// share project key with users in group that have not
// individually been added to the project and that are not part of
// other groups that are in the project
const groupMembers = await userGroupMembershipDAL.findGroupMembersNotInProject(group.id, project.id);
if (groupMembers.length) {
const ghostUser = await projectDAL.findProjectGhostUser(project.id);
if (!ghostUser) {
throw new BadRequestError({
message: "Failed to find sudo user"
});
}
const ghostUserLatestKey = await projectKeyDAL.findLatestProjectKey(ghostUser.id, project.id);
if (!ghostUserLatestKey) {
throw new BadRequestError({
message: "Failed to find sudo user latest key"
});
}
const bot = await projectBotDAL.findOne({ projectId: project.id });
if (!bot) {
throw new BadRequestError({
message: "Failed to find bot"
});
}
const botPrivateKey = infisicalSymmetricDecrypt({
keyEncoding: bot.keyEncoding as SecretKeyEncoding,
iv: bot.iv,
tag: bot.tag,
ciphertext: bot.encryptedPrivateKey
});
const plaintextProjectKey = decryptAsymmetric({
ciphertext: ghostUserLatestKey.encryptedKey,
nonce: ghostUserLatestKey.nonce,
publicKey: ghostUserLatestKey.sender.publicKey,
privateKey: botPrivateKey
});
const projectKeyData = groupMembers.map(({ user: { publicKey, id } }) => {
const { ciphertext: encryptedKey, nonce } = encryptAsymmetric(plaintextProjectKey, publicKey, botPrivateKey);
return {
encryptedKey,
nonce,
senderId: ghostUser.id,
receiverId: id,
projectId: project.id
};
});
await projectKeyDAL.insertMany(projectKeyData);
}
return projectGroup;
};
@@ -287,20 +288,26 @@ export const groupProjectServiceFactory = ({
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Delete, ProjectPermissionSub.Groups);
const groupMembers = await userGroupMembershipDAL.findGroupMembersNotInProject(group.id, project.id);
const deletedProjectGroup = await groupProjectDAL.transaction(async (tx) => {
const groupMembers = await userGroupMembershipDAL.findGroupMembersNotInProject(group.id, project.id, tx);
if (groupMembers.length) {
await projectKeyDAL.delete({
projectId: project.id,
$in: {
receiverId: groupMembers.map(({ user: { id } }) => id)
}
});
}
if (groupMembers.length) {
await projectKeyDAL.delete(
{
projectId: project.id,
$in: {
receiverId: groupMembers.map(({ user: { id } }) => id)
}
},
tx
);
}
const [deletedGroup] = await groupProjectDAL.delete({ groupId: group.id, projectId: project.id });
const [projectGroup] = await groupProjectDAL.delete({ groupId: group.id, projectId: project.id }, tx);
return projectGroup;
});
return deletedGroup;
return deletedProjectGroup;
};
const listGroupsInProject = async ({

View File

@@ -458,7 +458,7 @@ const syncSecretsAWSParameterStore = async ({
});
ssm.config.update(config);
const metadata = z.record(z.any()).parse(integration.metadata);
const metadata = z.record(z.any()).parse(integration.metadata || {});
const params = {
Path: integration.path as string,
@@ -544,7 +544,7 @@ const syncSecretsAWSSecretManager = async ({
}) => {
let secretsManager;
const secKeyVal = getSecretKeyValuePair(secrets);
const metadata = z.record(z.any()).parse(integration.metadata);
const metadata = z.record(z.any()).parse(integration.metadata || {});
try {
if (!accessId) return;

View File

@@ -89,6 +89,25 @@ export const orgDALFactory = (db: TDbClient) => {
}
};
const countAllOrgMembers = async (orgId: string) => {
try {
interface CountResult {
count: string;
}
const count = await db(TableName.OrgMembership)
.where(`${TableName.OrgMembership}.orgId`, orgId)
.count("*")
.join(TableName.Users, `${TableName.OrgMembership}.userId`, `${TableName.Users}.id`)
.where({ isGhost: false })
.first();
return parseInt((count as unknown as CountResult).count || "0", 10);
} catch (error) {
throw new DatabaseError({ error, name: "Count all org members" });
}
};
const findOrgMembersByUsername = async (orgId: string, usernames: string[]) => {
try {
const members = await db(TableName.OrgMembership)
@@ -269,6 +288,7 @@ export const orgDALFactory = (db: TDbClient) => {
...orgOrm,
findOrgByProjectId,
findAllOrgMembers,
countAllOrgMembers,
findOrgById,
findAllOrgsByUserId,
ghostUserExists,


View File

@@ -248,7 +248,7 @@ export const orgServiceFactory = ({
ForbiddenError.from(permission).throwUnlessCan(OrgPermissionActions.Edit, OrgPermissionSubjects.Scim);
}
if (authEnforced || scimEnabled) {
if (authEnforced) {
const samlCfg = await samlConfigDAL.findEnforceableSamlCfg(orgId);
if (!samlCfg)
throw new BadRequestError({

View File

@@ -81,9 +81,9 @@ export const projectDALFactory = (db: TDbClient) => {
}
};
const findProjectGhostUser = async (projectId: string) => {
const findProjectGhostUser = async (projectId: string, tx?: Knex) => {
try {
const ghostUser = await db(TableName.ProjectMembership)
const ghostUser = await (tx || db)(TableName.ProjectMembership)
.where({ projectId })
.join(TableName.Users, `${TableName.ProjectMembership}.userId`, `${TableName.Users}.id`)
.select(selectAllTableCols(TableName.Users))

View File

@@ -1,33 +1,66 @@
import { SecretType, TSecretImports } from "@app/db/schemas";
import { SecretType, TSecretImports, TSecrets } from "@app/db/schemas";
import { groupBy } from "@app/lib/fn";
import { TSecretDALFactory } from "../secret/secret-dal";
import { TSecretFolderDALFactory } from "../secret-folder/secret-folder-dal";
import { TSecretImportDALFactory } from "./secret-import-dal";
type TSecretImportSecrets = {
secretPath: string;
environment: string;
environmentInfo: {
id: string;
slug: string;
name: string;
};
folderId: string | undefined;
importFolderId: string;
secrets: (TSecrets & { workspace: string; environment: string; _id: string })[];
};
const LEVEL_BREAK = 10;
const getImportUniqKey = (envSlug: string, path: string) => `${envSlug}=${path}`;
export const fnSecretsFromImports = async ({
allowedImports,
allowedImports: possibleCyclicImports,
folderDAL,
secretDAL
secretDAL,
secretImportDAL,
depth = 0,
cyclicDetector = new Set()
}: {
allowedImports: (Omit<TSecretImports, "importEnv"> & {
importEnv: { id: string; slug: string; name: string };
})[];
folderDAL: Pick<TSecretFolderDALFactory, "findByManySecretPath">;
secretDAL: Pick<TSecretDALFactory, "find">;
secretImportDAL: Pick<TSecretImportDALFactory, "findByFolderIds">;
depth?: number;
cyclicDetector?: Set<string>;
}) => {
const importedFolders = await folderDAL.findByManySecretPath(
allowedImports.map(({ importEnv, importPath }) => ({
envId: importEnv.id,
secretPath: importPath
}))
// avoid recursing deeper than the LEVEL_BREAK limit
if (depth >= LEVEL_BREAK) return [];
const allowedImports = possibleCyclicImports.filter(
({ importPath, importEnv }) => !cyclicDetector.has(getImportUniqKey(importEnv.slug, importPath))
);
const folderIds = importedFolders.map((el) => el?.id).filter(Boolean) as string[];
if (!folderIds.length) {
const importedFolders = (
await folderDAL.findByManySecretPath(
allowedImports.map(({ importEnv, importPath }) => ({
envId: importEnv.id,
secretPath: importPath
}))
)
).filter(Boolean); // remove undefined ones
if (!importedFolders.length) {
return [];
}
const importedFolderIds = importedFolders.map((el) => el?.id) as string[];
const importedFolderGroupBySourceImport = groupBy(importedFolders, (i) => `${i?.envId}-${i?.path}`);
const importedSecrets = await secretDAL.find(
{
$in: { folderId: folderIds },
$in: { folderId: importedFolderIds },
type: SecretType.Shared
},
{
@@ -35,18 +68,50 @@ export const fnSecretsFromImports = async ({
}
);
const importedSecsGroupByFolderId = groupBy(importedSecrets, (i) => i.folderId);
return allowedImports.map(({ importPath, importEnv }, i) => ({
secretPath: importPath,
environment: importEnv.slug,
environmentInfo: importEnv,
folderId: importedFolders?.[i]?.id,
// this handles cases where secrets are empty, whether due to a missing folder for a path or to empty secrets inside a given path
secrets: (importedSecsGroupByFolderId?.[importedFolders?.[i]?.id as string] || []).map((item) => ({
...item,
const importedSecretsGroupByFolderId = groupBy(importedSecrets, (i) => i.folderId);
allowedImports.forEach(({ importPath, importEnv }) => {
cyclicDetector.add(getImportUniqKey(importEnv.slug, importPath));
});
// now we need to recursively check for deeper imports made inside other imports
// we go level by level: take all imports of a tree level, then descend into the deeper ones one level at a time
const deeperImports = await secretImportDAL.findByFolderIds(importedFolderIds);
let secretsFromDeeperImports: TSecretImportSecrets[] = [];
if (deeperImports.length) {
secretsFromDeeperImports = await fnSecretsFromImports({
allowedImports: deeperImports,
secretImportDAL,
folderDAL,
secretDAL,
depth: depth + 1,
cyclicDetector
});
}
const secretsFromdeeperImportGroupedByFolderId = groupBy(secretsFromDeeperImports, (i) => i.importFolderId);
const secrets = allowedImports.map(({ importPath, importEnv, id, folderId }, i) => {
const sourceImportFolder = importedFolderGroupBySourceImport[`${importEnv.id}-${importPath}`][0];
const folderDeeperImportSecrets =
secretsFromdeeperImportGroupedByFolderId?.[sourceImportFolder?.id || ""]?.[0]?.secrets || [];
return {
secretPath: importPath,
environment: importEnv.slug,
workspace: "", // This field should not be used, it's only here to keep the older Python SDK versions backwards compatible with the new Postgres backend.
_id: item.id // The old Python SDK depends on the _id field being returned. We return this to keep the older Python SDK versions backwards compatible with the new Postgres backend.
}))
}));
environmentInfo: importEnv,
folderId: importedFolders?.[i]?.id,
id,
importFolderId: folderId,
// this handles cases where secrets are empty, whether due to a missing folder for a path or to empty secrets inside a given path
secrets: (importedSecretsGroupByFolderId?.[importedFolders?.[i]?.id as string] || [])
.map((item) => ({
...item,
environment: importEnv.slug,
workspace: "", // This field should not be used, it's only here to keep the older Python SDK versions backwards compatible with the new Postgres backend.
_id: item.id // The old Python SDK depends on the _id field being returned. We return this to keep the older Python SDK versions backwards compatible with the new Postgres backend.
}))
.concat(folderDeeperImportSecrets)
};
});
return secrets;
};
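For readers following the recursive import change above, a minimal standalone sketch of the same pattern; the Import shape and the resolveChildren callback are assumptions standing in for the DAL lookups. A shared set of "envSlug=path" keys skips already-visited imports, and a depth cap of LEVEL_BREAK bounds the level-wise recursion.

// Simplified illustration of the cycle-guarded, depth-limited expansion above (not the service code itself).
type Import = { envSlug: string; path: string };

const LEVEL_BREAK = 10;
const keyOf = (imp: Import) => `${imp.envSlug}=${imp.path}`;

async function expandImports(
  imports: Import[],
  resolveChildren: (imp: Import) => Promise<Import[]>, // stands in for the secret-import DAL lookup
  depth = 0,
  seen: Set<string> = new Set()
): Promise<Import[]> {
  if (depth >= LEVEL_BREAK) return []; // hard stop so a cycle or very deep chain cannot run forever
  const fresh = imports.filter((imp) => !seen.has(keyOf(imp))); // drop anything already visited
  fresh.forEach((imp) => seen.add(keyOf(imp)));
  // level-wise: resolve all next-level imports first, then make a single deeper call
  const nextLevel = (await Promise.all(fresh.map((imp) => resolveChildren(imp)))).flat();
  const deeper = nextLevel.length ? await expandImports(nextLevel, resolveChildren, depth + 1, seen) : [];
  return [...fresh, ...deeper];
}
// Example: an a -> b -> a cycle terminates because the "envSlug=path" keys are remembered in `seen`.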

View File

@@ -290,7 +290,7 @@ export const secretImportServiceFactory = ({
})
)
);
return fnSecretsFromImports({ allowedImports, folderDAL, secretDAL });
return fnSecretsFromImports({ allowedImports, folderDAL, secretDAL, secretImportDAL });
};
return {

View File

@@ -575,7 +575,11 @@ export const createManySecretsRawFnFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, secretPath);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -680,7 +684,11 @@ export const updateManySecretsRawFnFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, secretPath);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Update secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Update secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });

View File

@@ -33,9 +33,11 @@ import { TSecretQueueFactory } from "./secret-queue";
import {
TAttachSecretTagsDTO,
TCreateBulkSecretDTO,
TCreateManySecretRawDTO,
TCreateSecretDTO,
TCreateSecretRawDTO,
TDeleteBulkSecretDTO,
TDeleteManySecretRawDTO,
TDeleteSecretDTO,
TDeleteSecretRawDTO,
TFnSecretBlindIndexCheckV2,
@@ -46,6 +48,7 @@ import {
TGetSecretsRawDTO,
TGetSecretVersionsDTO,
TUpdateBulkSecretDTO,
TUpdateManySecretRawDTO,
TUpdateSecretDTO,
TUpdateSecretRawDTO
} from "./secret-types";
@@ -179,7 +182,11 @@ export const secretServiceFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -275,7 +282,11 @@ export const secretServiceFactory = ({
}
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -391,7 +402,11 @@ export const secretServiceFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -510,7 +525,8 @@ export const secretServiceFactory = ({
const importedSecrets = await fnSecretsFromImports({
allowedImports,
secretDAL,
folderDAL
folderDAL,
secretImportDAL
});
return {
@@ -559,7 +575,11 @@ export const secretServiceFactory = ({
subject(ProjectPermissionSub.Secrets, { environment, secretPath: path })
);
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const secretBlindIndex = await interalGenSecBlindIndexByName(projectId, secretName);
@@ -611,7 +631,8 @@ export const secretServiceFactory = ({
const importedSecrets = await fnSecretsFromImports({
allowedImports,
secretDAL,
folderDAL
folderDAL,
secretImportDAL
});
for (let i = importedSecrets.length - 1; i >= 0; i -= 1) {
for (let j = 0; j < importedSecrets[i].secrets.length; j += 1) {
@@ -655,7 +676,11 @@ export const secretServiceFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -724,7 +749,11 @@ export const secretServiceFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Update secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Update secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -810,7 +839,11 @@ export const secretServiceFactory = ({
await projectDAL.checkProjectUpgradeStatus(projectId);
const folder = await folderDAL.findBySecretPath(projectId, environment, path);
if (!folder) throw new BadRequestError({ message: "Folder not found", name: "Create secret" });
if (!folder)
throw new BadRequestError({
message: "Folder not found for the given environment slug & secret path",
name: "Create secret"
});
const folderId = folder.id;
const blindIndexCfg = await secretBlindIndexDAL.findOne({ projectId });
@@ -1036,6 +1069,143 @@ export const secretServiceFactory = ({
return decryptSecretRaw(secret, botKey);
};
const createManySecretsRaw = async ({
actorId,
projectSlug,
environment,
actor,
actorOrgId,
actorAuthMethod,
secretPath,
secrets: inputSecrets = []
}: TCreateManySecretRawDTO) => {
const project = await projectDAL.findProjectBySlug(projectSlug, actorOrgId);
if (!project) throw new BadRequestError({ message: "Project not found" });
const projectId = project.id;
const botKey = await projectBotService.getBotKey(projectId);
if (!botKey) throw new BadRequestError({ message: "Project bot not found", name: "bot_not_found_error" });
const secrets = await createManySecret({
projectId,
environment,
path: secretPath,
actor,
actorId,
actorOrgId,
actorAuthMethod,
secrets: inputSecrets.map(({ secretComment, secretKey, secretValue, skipMultilineEncoding }) => {
const secretKeyEncrypted = encryptSymmetric128BitHexKeyUTF8(secretKey, botKey);
const secretValueEncrypted = encryptSymmetric128BitHexKeyUTF8(secretValue || "", botKey);
const secretCommentEncrypted = encryptSymmetric128BitHexKeyUTF8(secretComment || "", botKey);
return {
secretName: secretKey,
skipMultilineEncoding,
secretKeyCiphertext: secretKeyEncrypted.ciphertext,
secretKeyIV: secretKeyEncrypted.iv,
secretKeyTag: secretKeyEncrypted.tag,
secretValueCiphertext: secretValueEncrypted.ciphertext,
secretValueIV: secretValueEncrypted.iv,
secretValueTag: secretValueEncrypted.tag,
secretCommentCiphertext: secretCommentEncrypted.ciphertext,
secretCommentIV: secretCommentEncrypted.iv,
secretCommentTag: secretCommentEncrypted.tag
};
})
});
await snapshotService.performSnapshot(secrets[0].folderId);
await secretQueueService.syncSecrets({ secretPath, projectId, environment });
return secrets.map((secret) => decryptSecretRaw({ ...secret, workspace: projectId, environment }, botKey));
};
const updateManySecretsRaw = async ({
actorId,
projectSlug,
environment,
actor,
actorOrgId,
actorAuthMethod,
secretPath,
secrets: inputSecrets = []
}: TUpdateManySecretRawDTO) => {
const project = await projectDAL.findProjectBySlug(projectSlug, actorOrgId);
if (!project) throw new BadRequestError({ message: "Project not found" });
const projectId = project.id;
const botKey = await projectBotService.getBotKey(projectId);
if (!botKey) throw new BadRequestError({ message: "Project bot not found", name: "bot_not_found_error" });
const secrets = await updateManySecret({
projectId,
environment,
path: secretPath,
actor,
actorId,
actorOrgId,
actorAuthMethod,
secrets: inputSecrets.map(({ secretComment, secretKey, secretValue, skipMultilineEncoding }) => {
const secretKeyEncrypted = encryptSymmetric128BitHexKeyUTF8(secretKey, botKey);
const secretValueEncrypted = encryptSymmetric128BitHexKeyUTF8(secretValue || "", botKey);
const secretCommentEncrypted = encryptSymmetric128BitHexKeyUTF8(secretComment || "", botKey);
return {
secretName: secretKey,
type: SecretType.Shared,
skipMultilineEncoding,
secretKeyCiphertext: secretKeyEncrypted.ciphertext,
secretKeyIV: secretKeyEncrypted.iv,
secretKeyTag: secretKeyEncrypted.tag,
secretValueCiphertext: secretValueEncrypted.ciphertext,
secretValueIV: secretValueEncrypted.iv,
secretValueTag: secretValueEncrypted.tag,
secretCommentCiphertext: secretCommentEncrypted.ciphertext,
secretCommentIV: secretCommentEncrypted.iv,
secretCommentTag: secretCommentEncrypted.tag
};
})
});
await snapshotService.performSnapshot(secrets[0].folderId);
await secretQueueService.syncSecrets({ secretPath, projectId, environment });
return secrets.map((secret) => decryptSecretRaw({ ...secret, workspace: projectId, environment }, botKey));
};
const deleteManySecretsRaw = async ({
actorId,
projectSlug,
environment,
actor,
actorOrgId,
actorAuthMethod,
secretPath,
secrets: inputSecrets = []
}: TDeleteManySecretRawDTO) => {
const project = await projectDAL.findProjectBySlug(projectSlug, actorOrgId);
if (!project) throw new BadRequestError({ message: "Project not found" });
const projectId = project.id;
const botKey = await projectBotService.getBotKey(projectId);
if (!botKey) throw new BadRequestError({ message: "Project bot not found", name: "bot_not_found_error" });
const secrets = await deleteManySecret({
projectId,
environment,
path: secretPath,
actor,
actorId,
actorOrgId,
actorAuthMethod,
secrets: inputSecrets.map(({ secretKey }) => ({ secretName: secretKey, type: SecretType.Shared }))
});
await snapshotService.performSnapshot(secrets[0].folderId);
await secretQueueService.syncSecrets({ secretPath, projectId, environment });
return secrets.map((secret) => decryptSecretRaw({ ...secret, workspace: projectId, environment }, botKey));
};
const getSecretVersions = async ({
actorId,
actor,
@@ -1280,6 +1450,9 @@ export const secretServiceFactory = ({
createSecretRaw,
updateSecretRaw,
deleteSecretRaw,
createManySecretsRaw,
updateManySecretsRaw,
deleteManySecretsRaw,
getSecretVersions,
// external services function
fnSecretBulkDelete,

View File

@@ -181,6 +181,39 @@ export type TDeleteSecretRawDTO = TProjectPermission & {
type: SecretType;
};
export type TCreateManySecretRawDTO = Omit<TProjectPermission, "projectId"> & {
secretPath: string;
projectSlug: string;
environment: string;
secrets: {
secretKey: string;
secretValue: string;
secretComment?: string;
skipMultilineEncoding?: boolean;
}[];
};
export type TUpdateManySecretRawDTO = Omit<TProjectPermission, "projectId"> & {
secretPath: string;
projectSlug: string;
environment: string;
secrets: {
secretKey: string;
secretValue: string;
secretComment?: string;
skipMultilineEncoding?: boolean;
}[];
};
export type TDeleteManySecretRawDTO = Omit<TProjectPermission, "projectId"> & {
secretPath: string;
projectSlug: string;
environment: string;
secrets: {
secretKey: string;
}[];
};
export type TGetSecretVersionsDTO = Omit<TProjectPermission, "projectId"> & {
limit?: number;
offset?: number;

View File

@@ -34,6 +34,19 @@ export const userDALFactory = (db: TDbClient) => {
}
};
const findUserEncKeyByUserIdsBatch = async ({ userIds }: { userIds: string[] }, tx?: Knex) => {
try {
return await (tx || db)(TableName.Users)
.where({
isGhost: false
})
.whereIn(`${TableName.Users}.id`, userIds)
.join(TableName.UserEncryptionKey, `${TableName.Users}.id`, `${TableName.UserEncryptionKey}.userId`);
} catch (error) {
throw new DatabaseError({ error, name: "Find user enc by user ids batch" });
}
};
const findUserEncKeyByUserId = async (userId: string) => {
try {
const user = await db(TableName.Users)
@@ -123,6 +136,7 @@ export const userDALFactory = (db: TDbClient) => {
...userOrm,
findUserByUsername,
findUserEncKeyByUsername,
findUserEncKeyByUserIdsBatch,
findUserEncKeyByUserId,
updateUserEncryptionByUserId,
findUserByProjectMembershipId,

View File

@@ -29,6 +29,7 @@ require (
require (
github.com/alessio/shellescape v1.4.1 // indirect
github.com/asaskevich/govalidator v0.0.0-20200907205600-7a23bdc65eef // indirect
github.com/bradleyjkemp/cupaloy/v2 v2.8.0 // indirect
github.com/chzyer/readline v1.5.1 // indirect
github.com/danieljoos/wincred v1.2.0 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect

View File

@@ -51,6 +51,8 @@ github.com/asaskevich/govalidator v0.0.0-20200907205600-7a23bdc65eef h1:46PFijGL
github.com/asaskevich/govalidator v0.0.0-20200907205600-7a23bdc65eef/go.mod h1:WaHUgvxTVq04UNunO+XhnAqY/wQc+bxr74GqbsZ/Jqw=
github.com/bgentry/speakeasy v0.1.0/go.mod h1:+zsyZBPWlz7T6j88CTgSN5bM796AkVf0kBD4zp0CCIs=
github.com/bketelsen/crypt v0.0.4/go.mod h1:aI6NrJ0pMGgvZKL1iVgXLnfIFJtfV+bKCoqOes/6LfM=
github.com/bradleyjkemp/cupaloy/v2 v2.8.0 h1:any4BmKE+jGIaMpnU8YgH/I2LPiLBufr6oMMlVBbn9M=
github.com/bradleyjkemp/cupaloy/v2 v2.8.0/go.mod h1:bm7JXdkRd4BHJk9HpwqAI8BoAY1lps46Enkdqw6aRX0=
github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/charmbracelet/lipgloss v0.5.0 h1:lulQHuVeodSgDez+3rGiuxlPVXSnhth442DATR2/8t8=
github.com/charmbracelet/lipgloss v0.5.0/go.mod h1:EZLha/HbzEt7cYqdFPovlqy5FZPj0xFhg5SaqxScmgs=
@@ -324,6 +326,7 @@ github.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An
github.com/spf13/viper v1.8.1 h1:Kq1fyeebqsBfbjZj4EL7gj2IO0mMaiyjYUWcUsl2O44=
github.com/spf13/viper v1.8.1/go.mod h1:o0Pch8wJ9BVSWGQMbra6iw0oQ5oktSIBaujf1rJH9Ns=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
github.com/stretchr/objx v0.5.0 h1:1zr/of2m5FGMsad5YfcqgdqdWrIhu+EBEJRhR1U7z/c=
github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=

View File

@@ -512,16 +512,23 @@ func CallUniversalAuthRefreshAccessToken(httpClient *resty.Client, request Unive
func CallGetRawSecretsV3(httpClient *resty.Client, request GetRawSecretsV3Request) (GetRawSecretsV3Response, error) {
var getRawSecretsV3Response GetRawSecretsV3Response
response, err := httpClient.
req := httpClient.
R().
SetResult(&getRawSecretsV3Response).
SetHeader("User-Agent", USER_AGENT).
SetBody(request).
SetQueryParam("workspaceId", request.WorkspaceId).
SetQueryParam("environment", request.Environment).
SetQueryParam("secretPath", request.SecretPath).
SetQueryParam("include_imports", "false").
Get(fmt.Sprintf("%v/v3/secrets/raw", config.INFISICAL_URL))
SetQueryParam("secretPath", request.SecretPath)
if request.IncludeImport {
req.SetQueryParam("include_imports", "true")
}
if request.Recursive {
req.SetQueryParam("recursive", "true")
}
response, err := req.Get(fmt.Sprintf("%v/v3/secrets/raw", config.INFISICAL_URL))
if err != nil {
return GetRawSecretsV3Response{}, fmt.Errorf("CallGetRawSecretsV3: Unable to complete api request [err=%w]", err)

View File

@@ -371,6 +371,22 @@ type ImportedSecretV3 struct {
Secrets []EncryptedSecretV3 `json:"secrets"`
}
type ImportedRawSecretV3 struct {
SecretPath string `json:"secretPath"`
Environment string `json:"environment"`
FolderId string `json:"folderId"`
Secrets []struct {
ID string `json:"id"`
Workspace string `json:"workspace"`
Environment string `json:"environment"`
Version int `json:"version"`
Type string `json:"type"`
SecretKey string `json:"secretKey"`
SecretValue string `json:"secretValue"`
SecretComment string `json:"secretComment"`
} `json:"secrets"`
}
type GetEncryptedSecretsV3Response struct {
Secrets []EncryptedSecretV3 `json:"secrets"`
ImportedSecrets []ImportedSecretV3 `json:"imports,omitempty"`
@@ -542,6 +558,6 @@ type GetRawSecretsV3Response struct {
SecretValue string `json:"secretValue"`
SecretComment string `json:"secretComment"`
} `json:"secrets"`
Imports []any `json:"imports"`
Imports []ImportedRawSecretV3 `json:"imports"`
ETag string
}

View File

@@ -149,6 +149,8 @@ var exportCmd = &cobra.Command{
secrets = util.ExpandSecrets(secrets, authParams, "")
}
secrets = util.FilterSecretsByTag(secrets, tagSlugs)
secrets = util.SortSecretsByKeys(secrets)
output, err = formatEnvs(secrets, format)
if err != nil {
util.HandleError(err)

View File

@@ -22,10 +22,6 @@ var folderCmd = &cobra.Command{
var getCmd = &cobra.Command{
Use: "get",
Short: "Get folders in a directory",
PersistentPreRun: func(cmd *cobra.Command, args []string) {
util.RequireLocalWorkspaceFile()
util.RequireLogin()
},
Run: func(cmd *cobra.Command, args []string) {
environmentName, _ := cmd.Flags().GetString("env")

View File

@@ -7,6 +7,7 @@ import (
"crypto/sha256"
"encoding/base64"
"fmt"
"os"
"regexp"
"sort"
"strings"
@@ -116,6 +117,9 @@ var secretsCmd = &cobra.Command{
secrets = util.ExpandSecrets(secrets, authParams, "")
}
// Sort the secrets by key so we can create a consistent output
secrets = util.SortSecretsByKeys(secrets)
visualize.PrintAllSecretDetails(secrets)
Telemetry.CaptureEvent("cli-command:secrets", posthog.NewProperties().Set("secretCount", len(secrets)).Set("version", util.CLI_VERSION))
},
@@ -201,8 +205,10 @@ var secretsSetCmd = &cobra.Command{
// decrypt workspace key
plainTextEncryptionKey := crypto.DecryptAsymmetric(encryptedWorkspaceKey, encryptedWorkspaceKeyNonce, encryptedWorkspaceKeySenderPublicKey, currentUsersPrivateKey)
infisicalTokenEnv := os.Getenv(util.INFISICAL_TOKEN_NAME)
// pull current secrets
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, SecretsPath: secretsPath}, "")
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, SecretsPath: secretsPath, InfisicalToken: infisicalTokenEnv}, "")
if err != nil {
util.HandleError(err, "unable to retrieve secrets")
}

View File

@@ -2,7 +2,6 @@ package util
import (
"fmt"
"os"
"strings"
"github.com/Infisical/infisical-merge/packages/api"
@@ -13,13 +12,11 @@ import (
func GetAllFolders(params models.GetAllFoldersParameters) ([]models.SingleFolder, error) {
if params.InfisicalToken == "" {
params.InfisicalToken = os.Getenv(INFISICAL_TOKEN_NAME)
}
var foldersToReturn []models.SingleFolder
var folderErr error
if params.InfisicalToken == "" && params.UniversalAuthAccessToken == "" {
RequireLogin()
RequireLocalWorkspaceFile()
log.Debug().Msg("GetAllFolders: Trying to fetch folders using logged in details")

View File

@@ -8,6 +8,7 @@ import (
"os"
"os/exec"
"path"
"sort"
"strings"
"time"
@@ -53,6 +54,14 @@ func GetBase64DecodedSymmetricEncryptionDetails(key string, cipher string, IV st
}, nil
}
// Helper function to sort the secrets by key so we can create a consistent output
func SortSecretsByKeys(secrets []models.SingleEnvironmentVariable) []models.SingleEnvironmentVariable {
sort.Slice(secrets, func(i, j int) bool {
return secrets[i].Key < secrets[j].Key
})
return secrets
}
func IsSecretEnvironmentValid(env string) bool {
if env == "prod" || env == "dev" || env == "test" || env == "staging" {
return true

View File

@@ -186,12 +186,12 @@ func GetPlainTextSecretsViaMachineIdentity(accessToken string, workspaceId strin
plainTextSecrets = append(plainTextSecrets, models.SingleEnvironmentVariable{Key: secret.SecretKey, Value: secret.SecretValue, Type: secret.Type, WorkspaceId: secret.Workspace})
}
// if includeImports {
// plainTextSecrets, err = InjectImportedSecret(plainTextWorkspaceKey, plainTextSecrets, encryptedSecrets.ImportedSecrets)
// if err != nil {
// return nil, err
// }
// }
if includeImports {
plainTextSecrets, err = InjectRawImportedSecret(plainTextSecrets, rawSecrets.Imports)
if err != nil {
return models.PlaintextSecretResult{}, err
}
}
return models.PlaintextSecretResult{
Secrets: plainTextSecrets,
@@ -252,6 +252,36 @@ func InjectImportedSecret(plainTextWorkspaceKey []byte, secrets []models.SingleE
return secrets, nil
}
func InjectRawImportedSecret(secrets []models.SingleEnvironmentVariable, importedSecrets []api.ImportedRawSecretV3) ([]models.SingleEnvironmentVariable, error) {
if importedSecrets == nil {
return secrets, nil
}
hasOverriden := make(map[string]bool)
for _, sec := range secrets {
hasOverriden[sec.Key] = true
}
for i := len(importedSecrets) - 1; i >= 0; i-- {
importSec := importedSecrets[i]
plainTextImportedSecrets := importSec.Secrets
for _, sec := range plainTextImportedSecrets {
if _, ok := hasOverriden[sec.SecretKey]; !ok {
secrets = append(secrets, models.SingleEnvironmentVariable{
Key: sec.SecretKey,
WorkspaceId: sec.Workspace,
Value: sec.SecretValue,
Type: sec.Type,
ID: sec.ID,
})
hasOverriden[sec.SecretKey] = true
}
}
}
return secrets, nil
}
func FilterSecretsByTag(plainTextSecrets []models.SingleEnvironmentVariable, tagSlugs string) []models.SingleEnvironmentVariable {
if tagSlugs == "" {
return plainTextSecrets
@@ -277,10 +307,6 @@ func FilterSecretsByTag(plainTextSecrets []models.SingleEnvironmentVariable, tag
}
func GetAllEnvironmentVariables(params models.GetAllSecretsParameters, projectConfigFilePath string) ([]models.SingleEnvironmentVariable, error) {
if params.InfisicalToken == "" {
params.InfisicalToken = os.Getenv(INFISICAL_TOKEN_NAME)
}
isConnected := CheckIsConnectedToInternet()
var secretsToReturn []models.SingleEnvironmentVariable
// var serviceTokenDetails api.GetServiceTokenDetailsResponse

View File

@@ -0,0 +1,5 @@
STAGING-SECRET-1='staging-value-1'
STAGING-SECRET-2='staging-value-2'
TEST-SECRET-1='test-value-1'
TEST-SECRET-2='test-value-2'
TEST-SECRET-3='test-value-3'

View File

@@ -0,0 +1,3 @@
TEST-SECRET-1='test-value-1'
TEST-SECRET-2='test-value-2'
TEST-SECRET-3='test-value-3'

View File

@@ -0,0 +1,7 @@
┌─────────────────┬────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├─────────────────┼────────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
└─────────────────┴────────────────┴─────────────┘

View File

@@ -0,0 +1,7 @@
┌──────────────────┬─────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├──────────────────┼─────────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ STAGING-SECRET-2 │ staging-value-2 │ shared │
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
└──────────────────┴─────────────────┴─────────────┘

View File

@@ -0,0 +1,8 @@
┌─────────────────┬────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├─────────────────┼────────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
│ DOES-NOT-EXIST │ *not found* │ *not found* │
└─────────────────┴────────────────┴─────────────┘

View File

@@ -0,0 +1,2 @@
 Injecting 6 Infisical secrets into your application process
hello world

View File

@@ -0,0 +1,2 @@
 Injecting 5 Infisical secrets into your application process
hello world

View File

@@ -0,0 +1,2 @@
 Injecting 3 Infisical secrets into your application process
hello world

View File

@@ -0,0 +1,10 @@
┌──────────────────┬─────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├──────────────────┼─────────────────┼─────────────┤
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
│ STAGING-SECRET-1 │ staging-value-1 │ shared │
│ STAGING-SECRET-2 │ staging-value-2 │ shared │
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ TEST-SECRET-3 │ test-value-3 │ shared │
└──────────────────┴─────────────────┴─────────────┘

View File

@@ -0,0 +1,7 @@
┌───────────────┬──────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├───────────────┼──────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ TEST-SECRET-3 │ test-value-3 │ shared │
└───────────────┴──────────────┴─────────────┘

View File

@@ -0,0 +1,5 @@
STAGING-SECRET-1='staging-value-1'
STAGING-SECRET-2='staging-value-2'
TEST-SECRET-1='test-value-1'
TEST-SECRET-2='test-value-2'
TEST-SECRET-3='test-value-3'

View File

@@ -0,0 +1,3 @@
TEST-SECRET-1='test-value-1'
TEST-SECRET-2='test-value-2'
TEST-SECRET-3='test-value-3'

View File

@@ -0,0 +1,7 @@
┌─────────────────┬────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├─────────────────┼────────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
└─────────────────┴────────────────┴─────────────┘

View File

@@ -0,0 +1,7 @@
┌──────────────────┬─────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├──────────────────┼─────────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ STAGING-SECRET-2 │ staging-value-2 │ shared │
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
└──────────────────┴─────────────────┴─────────────┘

View File

@@ -0,0 +1,8 @@
┌─────────────────┬────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├─────────────────┼────────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
│ DOES-NOT-EXIST │ *not found* │ *not found* │
└─────────────────┴────────────────┴─────────────┘

View File

@@ -0,0 +1,2 @@
 Injecting 6 Infisical secrets into your application process
hello world

View File

@@ -0,0 +1,2 @@
 Injecting 5 Infisical secrets into your application process
hello world

View File

@@ -0,0 +1,2 @@
 Injecting 3 Infisical secrets into your application process
hello world

View File

@@ -0,0 +1,10 @@
┌──────────────────┬─────────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├──────────────────┼─────────────────┼─────────────┤
│ FOLDER-SECRET-1 │ folder-value-1 │ shared │
│ STAGING-SECRET-1 │ staging-value-1 │ shared │
│ STAGING-SECRET-2 │ staging-value-2 │ shared │
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ TEST-SECRET-3 │ test-value-3 │ shared │
└──────────────────┴─────────────────┴─────────────┘

View File

@@ -0,0 +1,7 @@
┌───────────────┬──────────────┬─────────────┐
│ SECRET NAME │ SECRET VALUE │ SECRET TYPE │
├───────────────┼──────────────┼─────────────┤
│ TEST-SECRET-1 │ test-value-1 │ shared │
│ TEST-SECRET-2 │ test-value-2 │ shared │
│ TEST-SECRET-3 │ test-value-3 │ shared │
└───────────────┴──────────────┴─────────────┘

View File

@@ -0,0 +1,4 @@
error: CallGetRawSecretsV3: Unsuccessful response [GET https://app.infisical.com/api/v3/secrets/raw?environment=invalid-env&include_imports=true&recursive=true&secretPath=%2F&workspaceId=bef697d4-849b-4a75-b284-0922f87f8ba2] [status-code=500] [response={"statusCode":500,"error":"Internal Server Error","message":"'invalid-env' environment not found in project with ID bef697d4-849b-4a75-b284-0922f87f8ba2"}]
If this issue continues, get support at https://infisical.com/slack

cli/test/export_test.go Normal file
View File

@@ -0,0 +1,73 @@
package tests
import (
"testing"
"github.com/bradleyjkemp/cupaloy/v2"
)
func TestUniversalAuth_ExportSecretsWithImports(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "export", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestServiceToken_ExportSecretsWithImports(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "export", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_ExportSecretsWithoutImports(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "export", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent", "--include-imports=false")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestServiceToken_ExportSecretsWithoutImports(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "export", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent", "--include-imports=false")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}

cli/test/helper.go Normal file
View File

@@ -0,0 +1,64 @@
package tests
import (
"fmt"
"os"
"os/exec"
"strings"
"testing"
)
const (
CLI_NAME = "infisical-merge"
)
var (
FORMATTED_CLI_NAME = fmt.Sprintf("./%s", CLI_NAME)
)
type Credentials struct {
ClientID string
ClientSecret string
UAAccessToken string
ServiceToken string
ProjectID string
EnvSlug string
}
var creds = Credentials{
UAAccessToken: "",
ClientID: os.Getenv("CLI_TESTS_UA_CLIENT_ID"),
ClientSecret: os.Getenv("CLI_TESTS_UA_CLIENT_SECRET"),
ServiceToken: os.Getenv("CLI_TESTS_SERVICE_TOKEN"),
ProjectID: os.Getenv("CLI_TESTS_PROJECT_ID"),
EnvSlug: os.Getenv("CLI_TESTS_ENV_SLUG"),
}
func ExecuteCliCommand(command string, args ...string) (string, error) {
cmd := exec.Command(command, args...)
output, err := cmd.CombinedOutput()
if err != nil {
return strings.TrimSpace(string(output)), err
}
return strings.TrimSpace(string(output)), nil
}
func SetupCli(t *testing.T) {
if creds.ClientID == "" || creds.ClientSecret == "" || creds.ServiceToken == "" || creds.ProjectID == "" || creds.EnvSlug == "" {
panic("Missing required environment variables")
}
// check if the CLI is already built; if not, build it
alreadyBuilt := false
if _, err := os.Stat(FORMATTED_CLI_NAME); err == nil {
alreadyBuilt = true
}
if !alreadyBuilt {
if err := exec.Command("go", "build", "../.").Run(); err != nil {
t.Fatal(err)
}
}
}

cli/test/login_test.go Normal file
View File

@@ -0,0 +1,29 @@
package tests
import (
"testing"
"github.com/stretchr/testify/assert"
)
func MachineIdentityLoginCmd(t *testing.T) {
SetupCli(t)
if creds.UAAccessToken != "" {
return
}
jwtPattern := `^[A-Za-z0-9-_]+\.[A-Za-z0-9-_]+\.[A-Za-z0-9-_]*$`
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "login", "--method=universal-auth", "--client-id", creds.ClientID, "--client-secret", creds.ClientSecret, "--plain", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
assert.Regexp(t, jwtPattern, output)
creds.UAAccessToken = output
// We can't use snapshot testing here because the output will be different every time
}

cli/test/run_test.go Normal file
View File

@@ -0,0 +1,120 @@
package tests
import (
"bytes"
"testing"
"github.com/bradleyjkemp/cupaloy/v2"
)
func TestServiceToken_RunCmdRecursiveAndImports(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "run", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent", "--", "echo", "hello world")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
output = string(bytes.Split([]byte(output), []byte("INF"))[1])
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestServiceToken_RunCmdWithImports(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "run", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent", "--", "echo", "hello world")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
output = string(bytes.Split([]byte(output), []byte("INF"))[1])
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_RunCmdRecursiveAndImports(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "run", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent", "--", "echo", "hello world")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
output = string(bytes.Split([]byte(output), []byte("INF"))[1])
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_RunCmdWithImports(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "run", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent", "--", "echo", "hello world")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// remove the first few characters from the output because we don't care about the time, and it will change every time
output = string(bytes.Split([]byte(output), []byte("INF"))[1])
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_RunCmdWithoutImports(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "run", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent", "--include-imports=false", "--", "echo", "hello world")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
output = string(bytes.Split([]byte(output), []byte("INF"))[1])
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestServiceToken_RunCmdWithoutImports(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "run", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--silent", "--include-imports=false", "--", "echo", "hello world")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Remove everything before "INF" because it's not relevant to the test
output = string(bytes.Split([]byte(output), []byte("INF"))[1])
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}

View File

@@ -0,0 +1,106 @@
package tests
import (
"testing"
"github.com/bradleyjkemp/cupaloy/v2"
)
func TestServiceToken_GetSecretsByNameRecursive(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "get", "TEST-SECRET-1", "TEST-SECRET-2", "FOLDER-SECRET-1", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestServiceToken_GetSecretsByNameWithNotFoundSecret(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "get", "TEST-SECRET-1", "TEST-SECRET-2", "FOLDER-SECRET-1", "DOES-NOT-EXIST", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestServiceToken_GetSecretsByNameWithImports(t *testing.T) {
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "get", "TEST-SECRET-1", "STAGING-SECRET-2", "FOLDER-SECRET-1", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_GetSecretsByNameRecursive(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "get", "TEST-SECRET-1", "TEST-SECRET-2", "FOLDER-SECRET-1", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_GetSecretsByNameWithNotFoundSecret(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "get", "TEST-SECRET-1", "TEST-SECRET-2", "FOLDER-SECRET-1", "DOES-NOT-EXIST", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}
func TestUniversalAuth_GetSecretsByNameWithImports(t *testing.T) {
MachineIdentityLoginCmd(t)
SetupCli(t)
output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "get", "TEST-SECRET-1", "STAGING-SECRET-2", "FOLDER-SECRET-1", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
if err != nil {
t.Fatalf("error running CLI command: %v", err)
}
// Use cupaloy to snapshot test the output
err = cupaloy.Snapshot(output)
if err != nil {
t.Fatalf("snapshot failed: %v", err)
}
}

cli/test/secrets_test.go Normal file
View File

@@ -0,0 +1,87 @@
package tests

import (
    "testing"

    "github.com/bradleyjkemp/cupaloy/v2"
)

func TestServiceToken_SecretsGetWithImportsAndRecursiveCmd(t *testing.T) {
    SetupCli(t)
    output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
    if err != nil {
        t.Fatalf("error running CLI command: %v", err)
    }

    // Use cupaloy to snapshot test the output
    err = cupaloy.Snapshot(output)
    if err != nil {
        t.Fatalf("snapshot failed: %v", err)
    }
}

func TestServiceToken_SecretsGetWithoutImportsAndWithoutRecursiveCmd(t *testing.T) {
    SetupCli(t)
    output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--token", creds.ServiceToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--include-imports=false", "--silent")
    if err != nil {
        t.Fatalf("error running CLI command: %v", err)
    }

    // Use cupaloy to snapshot test the output
    err = cupaloy.Snapshot(output)
    if err != nil {
        t.Fatalf("snapshot failed: %v", err)
    }
}

func TestUniversalAuth_SecretsGetWithImportsAndRecursiveCmd(t *testing.T) {
    SetupCli(t)
    MachineIdentityLoginCmd(t)
    output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--recursive", "--silent")
    if err != nil {
        t.Fatalf("error running CLI command: %v", err)
    }

    // Use cupaloy to snapshot test the output
    err = cupaloy.Snapshot(output)
    if err != nil {
        t.Fatalf("snapshot failed: %v", err)
    }
}

func TestUniversalAuth_SecretsGetWithoutImportsAndWithoutRecursiveCmd(t *testing.T) {
    SetupCli(t)
    MachineIdentityLoginCmd(t)
    output, err := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", creds.EnvSlug, "--include-imports=false", "--silent")
    if err != nil {
        t.Fatalf("error running CLI command: %v", err)
    }

    // Use cupaloy to snapshot test the output
    err = cupaloy.Snapshot(output)
    if err != nil {
        t.Fatalf("snapshot failed: %v", err)
    }
}

func TestUniversalAuth_SecretsGetWrongEnvironment(t *testing.T) {
    SetupCli(t)
    MachineIdentityLoginCmd(t)
    output, _ := ExecuteCliCommand(FORMATTED_CLI_NAME, "secrets", "--token", creds.UAAccessToken, "--projectId", creds.ProjectID, "--env", "invalid-env", "--recursive", "--silent")

    // Use cupaloy to snapshot test the output
    err := cupaloy.Snapshot(output)
    if err != nil {
        t.Fatalf("snapshot failed: %v", err)
    }
}

View File

@@ -0,0 +1,8 @@
---
title: "Bulk Create"
openapi: "POST /api/v3/secrets/batch/raw"
---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>
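As a rough illustration of calling the batch raw endpoint outside the CLI, here is a minimal Go sketch that POSTs a JSON payload to `/api/v3/secrets/batch/raw`. The host and the field names (`workspaceId`, `environment`, `secretPath`, `secrets`, `secretKey`, `secretValue`) are assumptions for illustration only; the OpenAPI schema rendered by this page is authoritative.

```go
// Hypothetical sketch of a bulk secret creation request.
// Field names and the host are assumptions; verify them against
// the OpenAPI reference on this page before use.
package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "net/http"
)

func main() {
    payload := map[string]any{
        "workspaceId": "<project-id>", // assumed field name
        "environment": "dev",          // assumed field name
        "secretPath":  "/",            // assumed field name
        "secrets": []map[string]string{
            {"secretKey": "TEST-SECRET-1", "secretValue": "value-1"},
            {"secretKey": "TEST-SECRET-2", "secretValue": "value-2"},
        },
    }
    body, _ := json.Marshal(payload)

    req, _ := http.NewRequest(http.MethodPost,
        "https://app.infisical.com/api/v3/secrets/batch/raw", bytes.NewReader(body))
    req.Header.Set("Authorization", "Bearer <token>")
    req.Header.Set("Content-Type", "application/json")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    fmt.Println("status:", resp.Status)
}
```

The Bulk Update and Bulk Delete pages below follow the same pattern; only the HTTP method (`PATCH` / `DELETE`) and the request body change.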

View File

@@ -0,0 +1,8 @@
---
title: "Bulk Delete"
openapi: "DELETE /api/v3/secrets/batch/raw"
---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -0,0 +1,8 @@
---
title: "Bulk Update"
openapi: "PATCH /api/v3/secrets/batch/raw"
---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -1,6 +1,6 @@
---
title: "Cassandra"
description: "How to dynamically generate Cassandra database users"
description: "How to dynamically generate Cassandra database users."
---
The Infisical Cassandra dynamic secret allows you to generate Cassandra database credentials on demand based on a configured role.

View File

@@ -1,6 +1,6 @@
---
title: "PostgreSQL"
description: "How to dynamically generate PostgreSQL database users"
description: "How to dynamically generate PostgreSQL database users."
---
The Infisical PostgreSQL dynamic secret allows you to generate PostgreSQL database credentials on demand based on a configured role.
@@ -115,4 +115,4 @@ To extend the life of the generated dynamic secret leases past its initial time
<Warning>
Lease renewals cannot exceed the maximum TTL set when configuring the dynamic secret
</Warning>

View File

@@ -1,36 +0,0 @@
---
title: "LDAP"
description: "Log in to Infisical with LDAP"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact sales@infisical.com to purchase an enterprise license to use it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol).
<Steps>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
Next, input your LDAP server settings.
![LDAP configuration](/images/platform/ldap/ldap-config.png)
Here's some guidance for each field:
- URL: The LDAP server to connect to such as `ldap://ldap.your-org.com`, `ldaps://ldap.myorg.com:636` (for connection over SSL/TLS), etc.
- Bind DN: The distinguished name of object to bind when performing the user search such as `cn=infisical,ou=Users,dc=acme,dc=com`.
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: Base DN under which to perform user search such as `ou=Users,dc=example,dc=com`
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate.
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>

View File

@@ -4,16 +4,17 @@ description: "Learn how to log in to Infisical with LDAP."
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact sales@infisical.com to purchase an enterprise license to use it.
LDAP is a paid feature. If you're using Infisical Cloud, then it is available
under the **Enterprise Tier**. If you're self-hosting Infisical, then you
should contact sales@infisical.com to purchase an enterprise license to use
it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol).
<Steps>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
In Infisical, head to your Organization Settings > Security > LDAP and select **Manage**.
Next, input your LDAP server settings.
@@ -24,11 +25,50 @@ You can configure your organization in Infisical to have members authenticate wi
- URL: The LDAP server to connect to such as `ldap://ldap.your-org.com`, `ldaps://ldap.myorg.com:636` (for connection over SSL/TLS), etc.
- Bind DN: The distinguished name of object to bind when performing the user search such as `cn=infisical,ou=Users,dc=acme,dc=com`.
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: Base DN under which to perform user search such as `ou=Users,dc=example,dc=com`
- User Search Base / User DN: Base DN under which to perform user search such as `ou=Users,dc=acme,dc=com`.
- User Search Filter (optional): Template used to construct the LDAP user search filter such as `(uid={{username}})`; use literal `{{username}}` to have the given username used in the search. The default is `(uid={{username}})` which is compatible with several common directory schemas.
- Group Search Base / Group DN (optional): LDAP search base to use for group membership search such as `ou=Groups,dc=acme,dc=com`.
- Group Filter (optional): Template used when constructing the group membership query such as `(&(objectClass=posixGroup)(memberUid={{.Username}}))`. The template can access the following context variables: [`UserDN`, `UserName`]. The default is `(|(memberUid={{.Username}})(member={{.UserDN}})(uniqueMember={{.UserDN}}))` which is compatible with several common directory schemas. A rendering sketch follows the note below.
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate.
<Note>
The **Group Search Base / Group DN** and **Group Filter** fields are both required if you wish to sync LDAP groups to Infisical.
</Note>
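To make the Group Filter templating concrete, here is a minimal Go sketch (standard library only) that renders the default filter shown above with sample values for `UserDN` and `Username`. It is purely illustrative and not Infisical's implementation.

```go
// Illustrative only: renders the default group membership filter template
// with sample context values to show how {{.Username}} and {{.UserDN}}
// are substituted. Not Infisical's actual implementation.
package main

import (
    "os"
    "text/template"
)

func main() {
    const defaultGroupFilter = "(|(memberUid={{.Username}})(member={{.UserDN}})(uniqueMember={{.UserDN}}))"

    tmpl := template.Must(template.New("groupFilter").Parse(defaultGroupFilter))

    ctx := struct {
        UserDN   string
        Username string
    }{
        UserDN:   "uid=jane,ou=Users,dc=acme,dc=com",
        Username: "jane",
    }

    // Prints:
    // (|(memberUid=jane)(member=uid=jane,ou=Users,dc=acme,dc=com)(uniqueMember=uid=jane,ou=Users,dc=acme,dc=com))
    _ = tmpl.Execute(os.Stdout, ctx)
}
```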
</Step>
<Step title="Test the LDAP connection">
Once you've filled out the LDAP configuration, you can test that part of the configuration is correct by pressing the **Test Connection** button.
Infisical will attempt to bind to the LDAP server using the provided **URL**, **Bind DN**, and **Bind Pass**. If the operation is successful, then Infisical will display a success message; if not, then Infisical will display an error message and provide a fuller error in the server logs.
![LDAP test connection](/images/platform/ldap/ldap-test-connection.png)
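Conceptually, the connection test boils down to an LDAP simple bind with the configured credentials. The sketch below uses the community `github.com/go-ldap/ldap/v3` package to show the idea; the package choice and the sample values are assumptions, and this is not Infisical's server-side code.

```go
// Rough illustration of an LDAP "test connection" check: dial the server
// and attempt a simple bind with the configured Bind DN / Bind Pass.
// Not Infisical's actual implementation.
package main

import (
    "fmt"
    "log"

    "github.com/go-ldap/ldap/v3"
)

func main() {
    conn, err := ldap.DialURL("ldaps://ldap.your-org.com:636")
    if err != nil {
        log.Fatalf("failed to reach LDAP server: %v", err)
    }
    defer conn.Close()

    // A successful bind indicates the URL, Bind DN, and Bind Pass are valid.
    if err := conn.Bind("cn=infisical,ou=Users,dc=acme,dc=com", "bind-password"); err != nil {
        log.Fatalf("bind failed: %v", err)
    }
    fmt.Println("LDAP connection test succeeded")
}
```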
</Step>
<Step title="Define mappings from LDAP groups to groups in Infisical">
In order to sync LDAP groups to Infisical, head to the **LDAP Group Mappings** section to define mappings from LDAP groups to groups in Infisical.
![LDAP group mappings section](/images/platform/ldap/ldap-group-mappings-section.png)
Group mappings ensure that users who log into Infisical via LDAP are added to or removed from the Infisical group(s) that correspond to the LDAP group(s) they are a member of.
![LDAP group mappings table](/images/platform/ldap/ldap-group-mappings-table.png)
Each group mapping consists of two parts:
- LDAP Group CN: The common name of the LDAP group to map.
- Infisical Group: The Infisical group to map the LDAP group to.
For example, suppose you want to automatically add a user who is part of the LDAP group with CN `Engineers` to the Infisical group `Engineers` when the user sets up their account with Infisical.
In this case, you would specify a mapping from the LDAP group with CN `Engineers` to the Infisical group `Engineers`.
Now when the user logs into Infisical via LDAP, Infisical will check the LDAP groups that the user is a part of whilst referencing the group mappings you created earlier. Since the user is a member of the LDAP group with CN `Engineers`, they will be added to the Infisical group `Engineers`.
In the future, if the user is no longer part of the LDAP group with CN `Engineers`, they will be removed from the Infisical group `Engineers` upon their next login.
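For intuition, the mapping step can be thought of as a lookup from the user's LDAP group CNs into the configured mappings. The following Go sketch is purely illustrative (the function name and shape are hypothetical, not Infisical's actual implementation).

```go
package main

import "fmt"

// resolveInfisicalGroups is illustrative only: given the LDAP group CNs a user
// belongs to and the configured LDAP CN -> Infisical group mappings, it returns
// the Infisical groups the user should be in after login. Memberships no longer
// backed by a mapping would be removed on the next login.
func resolveInfisicalGroups(userLdapGroupCNs []string, mappings map[string]string) []string {
    groups := make([]string, 0, len(userLdapGroupCNs))
    for _, cn := range userLdapGroupCNs {
        if infisicalGroup, ok := mappings[cn]; ok {
            groups = append(groups, infisicalGroup)
        }
    }
    return groups
}

func main() {
    mappings := map[string]string{"Engineers": "Engineers"}
    fmt.Println(resolveInfisicalGroups([]string{"Engineers", "Sales"}, mappings))
    // Output: [Engineers]
}
```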
<Note>
Prior to defining any group mappings, ensure that you've created the Infisical groups that you want to map the LDAP groups to.
You can read more about creating (user) groups in Infisical [here](/documentation/platform/groups).
</Note>
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>

View File

@@ -4,9 +4,10 @@ description: "Learn how to configure JumpCloud LDAP for authenticating into Infi
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact sales@infisical.com to purchase an enterprise license to use it.
LDAP is a paid feature. If you're using Infisical Cloud, then it is available
under the **Enterprise Tier**. If you're self-hosting Infisical, then you
should contact sales@infisical.com to purchase an enterprise license to use
it.
</Info>
<Steps>
@@ -17,13 +18,12 @@ description: "Learn how to configure JumpCloud LDAP for authenticating into Infi
When creating the user, input their **First Name**, **Last Name**, **Username** (required), **Company Email** (required), and **Description**.
Also, create a password for the user.
Next, under User Security Settings and Permissions > Permission Settings, check the box next to **Enable as LDAP Bind DN**.
![LDAP JumpCloud](/images/platform/ldap/jumpcloud/ldap-jumpcloud-enable-bind-dn.png)
</Step>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
In Infisical, head to your Organization Settings > Security > LDAP and select **Manage**.
Next, input your JumpCloud LDAP server settings.
@@ -34,21 +34,57 @@ description: "Learn how to configure JumpCloud LDAP for authenticating into Infi
- URL: The LDAP server to connect to (`ldaps://ldap.jumpcloud.com:636`).
- Bind DN: The distinguished name of object to bind when performing the user search (`uid=<ldap-user-username>,ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com`).
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: Base DN under which to perform user search (`ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com`).
- User Search Base / User DN: Base DN under which to perform user search (`ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com`).
- User Search Filter (optional): Template used to construct the LDAP user search filter (`(uid={{username}})`).
- Group Search Base / Group DN (optional): LDAP search base to use for group membership search (`ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com`).
- Group Filter (optional): Template used when constructing the group membership query (`(&(objectClass=groupOfNames)(member=uid={{.Username}},ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com))`)
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate (instructions to obtain the certificate for JumpCloud [here](https://jumpcloud.com/support/connect-to-ldap-with-tls-ssl)).
<Tip>
When filling out the **Bind DN** and **Bind Pass** fields, refer to the username and password of the user created in Step 1.
Also, for the **Bind DN** and **Search Base / User DN** fields, you'll want to use the organization ID that appears
in your LDAP instance **ORG DN**.
</Tip>
</Step>
<Step title="Test the LDAP connection">
Once you've filled out the LDAP configuration, you can test that part of the configuration is correct by pressing the **Test Connection** button.
Infisical will attempt to bind to the LDAP server using the provided **URL**, **Bind DN**, and **Bind Pass**. If the operation is successful, then Infisical will display a success message; if not, then Infisical will display an error message and provide a fuller error in the server logs.
![LDAP test connection](/images/platform/ldap/ldap-test-connection.png)
</Step>
<Step title="Define mappings from LDAP groups to groups in Infisical">
In order to sync LDAP groups to Infisical, head to the **LDAP Group Mappings** section to define mappings from LDAP groups to groups in Infisical.
![LDAP group mappings section](/images/platform/ldap/ldap-group-mappings-section.png)
Group mappings ensure that users who log into Infisical via LDAP are added to or removed from the Infisical group(s) that correspond to the LDAP group(s) they are a member of.
![LDAP group mappings table](/images/platform/ldap/ldap-group-mappings-table.png)
Each group mapping consists of two parts:
- LDAP Group CN: The common name of the LDAP group to map.
- Infisical Group: The Infisical group to map the LDAP group to.
For example, suppose you want to automatically add a user who is part of the LDAP group with CN `Engineers` to the Infisical group `Engineers` when the user sets up their account with Infisical.
In this case, you would specify a mapping from the LDAP group with CN `Engineers` to the Infisical group `Engineers`.
Now when the user logs into Infisical via LDAP, Infisical will check the LDAP groups that the user is a part of whilst referencing the group mappings you created earlier. Since the user is a member of the LDAP group with CN `Engineers`, they will be added to the Infisical group `Engineers`.
In the future, if the user is no longer part of the LDAP group with CN `Engineers`, they will be removed from the Infisical group `Engineers` upon their next login.
<Note>
Prior to defining any group mappings, ensure that you've created the Infisical groups that you want to map the LDAP groups to.
You can read more about creating (user) groups in Infisical [here](/documentation/platform/groups).
</Note>
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>
Resources:
- [JumpCloud Cloud LDAP Guide](https://jumpcloud.com/support/use-cloud-ldap)

Some files were not shown because too many files have changed in this diff.