Compare commits


207 Commits

Author SHA1 Message Date
95c914631a patch notify user on risk found 2023-07-17 21:52:24 -04:00
49ae61da08 remove border from risk selection 2023-07-17 21:49:58 -04:00
993abd0921 add secret scanning status to api 2023-07-17 21:28:47 -04:00
f37b497e48 Update overview.mdx 2023-07-17 21:11:27 -04:00
0d2e55a06f add telemetry for cloud secret scanning 2023-07-17 20:29:20 -04:00
040243d4f7 add telemetry for cloud secret scanning 2023-07-17 20:29:07 -04:00
c450b01763 update email for secret leak 2023-07-17 20:20:11 -04:00
4cd203c194 add ss-webhook to values file k8-infisical 2023-07-17 19:56:07 -04:00
178d444deb add web hook under api temporarily 2023-07-17 18:58:39 -04:00
139ca9022e Update build-staging-img.yml 2023-07-17 17:36:57 -04:00
34d3e80d17 Merge pull request #743 from Infisical/git-scanning-app
bring back secret engine for dev
2023-07-17 17:21:34 -04:00
deac5fe101 Merge branch 'main' into git-scanning-app 2023-07-17 17:20:04 -04:00
216f3a0d1b reload page after org link 2023-07-17 17:18:55 -04:00
43f4110c94 update risk status names 2023-07-17 16:46:15 -04:00
56d430afd6 update risk status and update email notifications 2023-07-17 16:41:33 -04:00
38b6a48bee Merge pull request #754 from JunedKhan101/docs-typo-fix
fixed typo
2023-07-17 10:49:46 -04:00
53abce5780 remove secret engine folder 2023-07-16 16:51:01 -04:00
8c844fb188 move secret scanning to main container 2023-07-16 16:48:36 -04:00
a9135cdbcd fixed typo 2023-07-16 14:47:35 +05:30
9b96daa185 Merge pull request #752 from afrieirham/feat/sort-integrations-alphabetically
feat: sort cloud and framework integrations alphabetically
2023-07-16 14:34:26 +05:30
9919d3ee6a feat: sort cloud and framework integrations alphabetically 2023-07-16 11:05:37 +08:00
dfcd6b1efd changed docs structure 2023-07-14 19:14:36 -07:00
07bc4c4a3a change docs structure 2023-07-14 19:11:39 -07:00
d69465517f Added styling 2023-07-14 16:26:18 -07:00
6d807c0c74 Merge pull request #749 from RezaRahemtola/fix/cli-vault-cmd-last-line-break
fix(cli): Missing trailing linebreak in vault commands
2023-07-14 18:38:23 -04:00
868cc80210 fix(cli): Missing trailing linebreak in vault commands 2023-07-14 23:09:25 +02:00
3d4a616147 remove secret scanning from prod docker compose 2023-07-14 15:21:04 -04:00
bd3f9130e4 Merge pull request #747 from unkletayo/adetayoreadme-youtubelink-fix
docs(readme):update broken YouTube  page link
2023-07-14 09:19:51 -07:00
f607841acf Update README.md with the correct youtube link 2023-07-14 17:15:09 +01:00
55d813043d Update README.md
This PR fixes broken link to the YouTube page in the Readme file
2023-07-14 08:15:51 +01:00
b2a3a3a0e6 added click-to-copy and changed the slack link 2023-07-13 19:09:00 -07:00
67d5f52aca extract correct params after git app install 2023-07-13 19:56:49 -04:00
a34047521c styled cli redirect 2023-07-13 16:37:05 -07:00
7ff806e8a6 fixed the signup orgId issue 2023-07-13 16:16:00 -07:00
9763353d59 Fixed routing issues 2023-07-13 16:09:33 -07:00
4382935cb5 Merge pull request #733 from akhilmhdh/feat/webhooks
Feat/webhooks
2023-07-13 18:47:47 -04:00
7e3646ddcd add docs on how to pin k8 operator to avoid breaking changes 2023-07-13 17:53:59 -04:00
f7766fc182 fix: resolved just space in a secret value and not changing save state 2023-07-13 23:53:24 +05:30
3176370ef6 feat(webhook): removed console.log 2023-07-13 23:22:20 +05:30
9bed1682fc feat(webhooks): updated docs 2023-07-13 23:22:20 +05:30
daf2e2036e feat(webhook): implemented ui for webhooks 2023-07-13 23:22:20 +05:30
0f81c78639 feat(webhook): implemented api for webhooks 2023-07-13 23:21:18 +05:30
8a19cfe0c6 removed secret scanning from the menu 2023-07-13 10:31:54 -07:00
a00fec9bca trigger standalone docker img too 2023-07-13 11:23:41 -04:00
209f224517 Merge pull request #745 from Infisical/docs-sdk
Remove individual SDK pages from docs
2023-07-13 17:10:26 +07:00
0b7f2b7d4b Remove individual SDK pages from docs in favor of each SDKs README on GitHub 2023-07-13 17:08:32 +07:00
eff15fc3d0 Merge pull request #744 from Infisical/usage-billing
Fix subscription context get organization from useOrganization
2023-07-13 17:07:42 +07:00
2614459772 Fix subscription context get organization from useOrganization 2023-07-13 17:01:53 +07:00
4e926746cf fixing the pro trial bug 2023-07-12 15:46:42 -07:00
f022f6d3ee update secret engine port 2023-07-12 16:39:45 -04:00
1133ae4ae9 bring back secret engine for dev 2023-07-12 16:10:09 -04:00
edd5afa13b remove secret engine from main 2023-07-12 15:50:36 -04:00
442f572acc Merge branch 'infisical-radar-app' into main 2023-07-12 12:12:24 -07:00
be58f3c429 removed the learning item from sidebar 2023-07-12 11:50:36 -07:00
3eea5d9322 Merge pull request #735 from Infisical/new-sidebars
fixing the bugs with sidebars
2023-07-12 11:23:26 -07:00
e4e87163e8 removed org member section 2023-07-12 11:19:56 -07:00
d3aeb729e0 fixing ui/ux bugs 2023-07-12 11:18:42 -07:00
2e7c7cf1da fix typo in folder docs 2023-07-12 01:41:14 -04:00
5d39416532 replace cli quick start 2023-07-12 01:38:59 -04:00
af95adb589 Update usage.mdx 2023-07-12 01:31:09 -04:00
0fc4f96773 Merge pull request #736 from Infisical/revamp-docs
Revamp core docs
2023-07-12 01:29:10 -04:00
0a9adf33c8 revamp core docs 2023-07-12 01:23:28 -04:00
f9110cedfa fixing the bug with switching orgs 2023-07-11 22:13:54 -07:00
88ec55fc49 Merge pull request #700 from Infisical/new-sidebars
new sidebars
2023-07-11 17:29:48 -07:00
98b2a2a5c1 adding trial to the sidebar 2023-07-11 17:26:36 -07:00
27eeafbf36 Merge pull request #730 from Infisical/main
Catching up the branch
2023-07-11 16:19:39 -07:00
0cf63028df fixing style and solving merge conflicts 2023-07-11 16:19:07 -07:00
0b52b3cf58 Update mint.json 2023-07-11 14:14:23 -07:00
e1764880a2 Update overview.mdx 2023-07-11 14:09:57 -07:00
d3a47ffcdd Update mint.json 2023-07-11 13:56:24 -07:00
9c1f88bb9c Update mint.json 2023-07-11 13:49:55 -07:00
ae2f3184e2 Merge pull request #711 from afrieirham/form-ux-enhancement
fix: enable users to press `Enter` in forms
2023-07-11 16:34:21 -04:00
3f1db47c30 Merge pull request #731 from Infisical/office-365-smtp
Add support for Office365 SMTP
2023-07-11 15:04:26 +07:00
3e3bbe298d Add support for Office365 SMTP 2023-07-11 14:50:41 +07:00
46dc357651 final changes to sidebars 2023-07-11 00:04:14 -07:00
07d25cb673 extract version from tag 2023-07-10 23:26:14 -04:00
264f75ce8e correct gha for k8 operator 2023-07-10 23:20:45 -04:00
9713a19405 add semvar to k8 images 2023-07-10 23:14:10 -04:00
ccfb8771f1 Merge pull request #728 from JunedKhan101/feature-723-remove-trailing-slash
Implemented feature to remove the trailing slash from the domain url
2023-07-10 10:26:53 -04:00
b36801652f Merge pull request #729 from Infisical/trial-revamp
Infisical Cloud Pro Free Trial Update
2023-07-10 15:13:28 +07:00
9e5b9cbdb5 Fix lint errors 2023-07-10 15:06:00 +07:00
bdf4ebd1bc second iteration of the new sidebar 2023-07-09 23:58:27 -07:00
e91e7f96c2 Update free plan logic 2023-07-10 13:48:46 +07:00
34fef4aaad Implemented feature to remove the trailing slash from the domain url 2023-07-10 12:16:51 +05:30
09330458e5 Merge pull request #721 from agoodman1999/main
add --path flag to docs for infisical secrets set
2023-07-10 00:09:09 -04:00
ed95b99ed1 Merge branch 'main' into main 2023-07-10 00:08:25 -04:00
dc1e1e8dcb Merge pull request #726 from RezaRahemtola/fix/docs
fix(docs): Wrong integration name and missing link
2023-07-10 00:05:47 -04:00
13a81c9222 add 401 error message for get secrets in cli 2023-07-09 23:25:35 -04:00
6354464859 update terraform docs with path and env 2023-07-09 22:40:00 -04:00
ec26404b94 Merge pull request #727 from Infisical/main
Catching up with main
2023-07-09 11:13:40 -07:00
5ef2508736 docs: Add missing pull request contribution link 2023-07-09 15:44:25 +02:00
93264fd2d0 docs: Fix wrong integration name 2023-07-09 15:40:59 +02:00
7020c7aeab fix: completing allow user to press Enter in forgot password flow 2023-07-09 15:08:25 +08:00
25b1673321 improve k8 operator docs 2023-07-08 21:48:06 -04:00
628bc711c2 update k8 docks for quick start 2023-07-08 21:12:05 -04:00
a3b4228685 add path to export command 2023-07-08 16:15:45 -04:00
374c8e4a1a Update ingress class values.yaml 2023-07-08 13:47:13 -04:00
5afcf2798f Update build-staging-img.yml 2023-07-08 13:32:35 -04:00
1657cf0a7e Update values.yaml 2023-07-08 13:16:10 -04:00
c9820d0071 Update values.yaml 2023-07-08 12:55:49 -04:00
b53c046eef Merge pull request #713 from akhilmhdh/feat/secret-reference
secret reference
2023-07-07 19:02:57 -04:00
fd10d7ed34 add docs for k8 secret refs 2023-07-07 18:59:23 -04:00
c5aae44249 add docs for k8 secret refs 2023-07-07 18:56:38 -04:00
83aa6127ec update k8 chart version 2023-07-07 15:56:47 -04:00
5a2299f758 update k8 operator crd for secret refs 2023-07-07 15:55:45 -04:00
57cdab0727 update k8 operator crd for secret refs 2023-07-07 15:55:22 -04:00
f82fa1b3b3 add secret reference support 2023-07-07 15:49:21 -04:00
e95eef2071 Merge branch 'main' of https://github.com/Infisical/infisical 2023-07-07 13:01:51 +07:00
53efdac0f0 Bring back catch TokenExpiredError in backend error-handling middleware 2023-07-07 13:01:38 +07:00
f5eafc39c5 Merge pull request #717 from atimapreandrew/add-laravel-forge-docs
Added docs for Laravel Forge Integration
2023-07-07 12:14:47 +07:00
0f72ccf82e Remove Laravel Forge from self-hosting docs, update image name 2023-07-07 12:13:47 +07:00
c191eb74fd Update README.md 2023-07-06 21:39:05 -07:00
f9fca42c5b fix incorrect leading slash in example 2023-07-06 13:36:15 -04:00
11a19eef07 add --path flag to docs for infisical secrets set 2023-07-06 13:20:48 -04:00
8a237af4ac feat(secret-ref): updated reference corner cases of trailing slashes 2023-07-06 22:15:10 +05:30
24413e1edd Added docs for Laravel Forge Integration 2023-07-06 15:43:43 +01:00
5aba0c60b8 feat(secret-ref): removed migration field unset op, refactored service token scope check to a utility fn 2023-07-06 20:01:46 +05:30
5599132efe fix(secret-ref): resolved service token unable to fetch secrets in cli 2023-07-06 18:58:48 +05:30
7f9e27e3d3 Update README.md 2023-07-05 15:41:38 -07:00
7d36360111 Updated AWS deploy image 2023-07-05 15:34:22 -07:00
d350297ce1 Deploy to AWS button updated 2023-07-05 15:28:17 -07:00
18d4e42d1f Update README.md 2023-07-05 15:19:54 -07:00
9faf5a3d5c add secret scanning to gamma values 2023-07-05 18:17:09 -04:00
da113612eb diable secret scan by default 2023-07-05 18:09:46 -04:00
e9e2eade89 update helm chart version 2023-07-05 17:56:30 -04:00
3cbc9c1b5c update helm chart to include git app 2023-07-05 17:54:29 -04:00
0772510e47 update gha for git app gamma deploy 2023-07-05 15:52:43 -04:00
f389aa07eb update docker file for prod build 2023-07-05 15:39:44 -04:00
27a110a93a build secret scanning 2023-07-05 15:22:29 -04:00
13eaa4e9a1 feat(secret-ref): updated doc 2023-07-05 23:00:17 +05:30
7ec7d05fb0 feat(secret-ref): implemented cli changes for secret reference 2023-07-05 23:00:17 +05:30
7fe4089bb0 feat(secret-ref): implemented ui for service token changes 2023-07-05 23:00:17 +05:30
0cee453202 feat(secret-ref): implemented backend changes for multi env and folder in service token 2023-07-05 23:00:17 +05:30
088d8097a9 Merge pull request #712 from atimapreandrew/laravel-forge-integration
Laravel forge integration
2023-07-05 23:43:43 +07:00
4e6fae03ff Patch sync Laravel Forge integration 2023-07-05 23:40:43 +07:00
732d0dfdca Added docs for Laravel Forge Integration 2023-07-05 13:45:10 +01:00
93e0232c21 fix: allow user to press Enter in forgot password page 2023-07-05 19:02:48 +08:00
37707c422a fix: allow user to press Enter in login page 2023-07-05 18:40:48 +08:00
2f1bd9ca61 fix: enable user to press Enter in signup flow 2023-07-05 18:32:03 +08:00
3d9ddbf9bc Merge branch 'main' of https://github.com/Infisical/infisical 2023-07-05 13:52:06 +07:00
7c9140dcec Update trial message 2023-07-05 13:51:50 +07:00
a63d179a0d add email notifications for risks 2023-07-04 22:06:29 -04:00
95dd8718bd Merge pull request #709 from raykeating/add-path-flag-to-infisical-run-docs
add --path flag to docs
2023-07-04 20:25:56 -04:00
ff2c9e98c0 add --path flag to docs 2023-07-04 19:48:36 -04:00
23f4a350e7 Added docs for Laravel Forge Integration 2023-07-04 21:08:15 +01:00
696225d8d2 laravel forge integration 2023-07-04 20:01:49 +01:00
6c1ccc17b3 laravel forge integration 2023-07-04 19:28:42 +01:00
aa60f3a664 Merge branch 'main' of github.com:atimapreandrew/infisical 2023-07-04 17:49:08 +01:00
f01fb2830a patch Eslint GetToken issue 2023-07-04 11:11:05 -04:00
9f6aa6b13e add v1 secret scanning 2023-07-04 10:54:44 -04:00
b2ee15a4ff Merge pull request #708 from Infisical/free-trial
Initialize users on Infisical Cloud to Pro (Trial) Tier
2023-07-04 16:26:05 +07:00
42de0fbe73 Fix lint errors 2023-07-04 16:22:06 +07:00
553c986aa8 Update free trial indicator in usage and billing page 2023-07-04 16:01:20 +07:00
9a1e2260a0 Merge pull request #701 from Infisical/main
Update branch
2023-06-30 16:54:26 -07:00
98f7ce2585 Merge branch 'main' of github.com:atimapreandrew/infisical 2023-06-30 17:55:22 +01:00
c30ec8cb5f Merge pull request #697 from Infisical/revamp-project-settings
Standardize styling of Project Settings Page
2023-06-30 16:44:02 +07:00
104c752f9a Finish preliminary standardization of project settings page 2023-06-30 16:38:54 +07:00
b66bea5671 Merge pull request #692 from akhilmhdh/feat/multi-line-secrets
multi line support for secrets
2023-06-29 17:35:25 -04:00
f9313204a7 add docs for k8 re sync interval 2023-06-29 16:08:43 -04:00
cb5c371a4f add re-sync interval 2023-06-29 15:02:53 -04:00
a32df58f46 Merge pull request #695 from Infisical/check-rbac
Rewire RBAC paywall to new mechanism
2023-06-29 18:53:07 +07:00
e2658cc8dd Rewire RBAC paywall to new mechanism 2023-06-29 18:47:35 +07:00
1fbec20c6f Merge pull request #694 from Infisical/clean-org-settings
Clean Personal Settings and Organization Settings Pages
2023-06-29 18:19:24 +07:00
ddff8be53c Fix build error 2023-06-29 18:15:59 +07:00
114d488345 Fix merge conflicts 2023-06-29 17:53:33 +07:00
c4da5a6ead Fix merge conflicts 2023-06-29 17:49:01 +07:00
056f5a4555 Finish preliminary making user settings, org settings styling similar to usage and billing page 2023-06-29 17:47:23 +07:00
dfc88d99f6 first draft new sidebar 2023-06-28 14:28:52 -07:00
033f41a7d5 Merge branch 'main' of github.com:atimapreandrew/infisical 2023-06-28 19:15:08 +01:00
5612a01039 fix(multi-line): resolved linting issues 2023-06-28 20:50:02 +05:30
f1d609cf40 fix: resolved secret version empty 2023-06-28 20:32:12 +05:30
0e9c71ae9f feat(multi-line): added support for multi-line in ui 2023-06-28 20:32:12 +05:30
d1af399489 Merge pull request #684 from akhilmhdh/feat/integrations-page-revamp
integrations page revamp
2023-06-27 17:50:49 -04:00
f445bac42f swap out for v3 secrets 2023-06-27 17:20:30 -04:00
798f091ff2 fix fetching secrets via service token 2023-06-27 15:00:03 -04:00
8381944bb2 feat(integrations-page): fixed id in delete modal 2023-06-27 23:56:43 +05:30
f9d0e0d971 Replace - with Unlimited in compare plans table 2023-06-27 22:00:13 +07:00
29d50f850b Correct current plan text in usage and billing 2023-06-27 19:01:31 +07:00
81c69d92b3 Restyle org name change section 2023-06-27 18:48:26 +07:00
5cd9f37fdf Merge pull request #687 from Infisical/paywalls
Add paywall for PIT and redirect paywall to contact sales in self-hosted
2023-06-27 17:49:42 +07:00
1cf65aca1b Remove print statement 2023-06-27 17:46:36 +07:00
470c429bd9 Merge remote-tracking branch 'origin' into paywalls 2023-06-27 17:46:18 +07:00
c8d081e818 Remove print statement 2023-06-27 17:45:20 +07:00
492c6a6f97 Fix lint errors 2023-06-27 17:30:37 +07:00
1dfd18e779 Add paywall for PIT and redirect paywall to contact sales in self-hosted 2023-06-27 17:19:33 +07:00
caed17152d Merge pull request #686 from Infisical/org-settings
Revamped organization usage and billing page for Infisical Cloud
2023-06-27 16:16:02 +07:00
825143f17c Adjust breadcrumb spacing 2023-06-27 16:12:18 +07:00
da144b4d02 Hide usage and billing from Navbar in self-hosted 2023-06-27 15:56:48 +07:00
f4c4545099 Merge remote-tracking branch 'origin' into org-settings 2023-06-27 15:39:51 +07:00
924a969307 Fix lint errors for revamped billing and usage page 2023-06-27 15:39:36 +07:00
072f6c737c UI update to inetgrations 2023-06-26 18:08:00 -07:00
5f683dd389 feat(integrations-page): updated current integrations width and fixed id in delete modal 2023-06-26 14:31:13 +05:30
2526cbe6ca Add padding Checkly integration page 2023-06-26 12:39:29 +07:00
6959fc52ac minor style updates 2023-06-25 21:49:28 -07:00
81bd684305 removed unnecessary variable declarations 2023-06-25 17:17:29 +01:00
68c8dad829 Merge pull request #682 from atimapreandrew/remove-unnecessary-backend-dependencies
removed await-to-js and builder-pattern dependencies from backend
2023-06-25 18:41:56 +07:00
ca3f7bac6c Remove catch error-handling in favor of error-handling middleware 2023-06-25 17:31:19 +07:00
a127d452bd Continue to make progress on usage and billing page revamp 2023-06-25 17:03:41 +07:00
7c77cc4ea4 fix(integrations-page): eslint fixes to the new upstream changes made 2023-06-24 23:44:52 +05:30
9c0e32a790 fix(integrations-page): added back cloudflare changes in main integrations page 2023-06-24 23:35:55 +05:30
611fae785a chore: updated to latested storybook v7 stable version 2023-06-24 23:31:37 +05:30
0ef4ac1cdc feat(integration-page): implemented new optimized integrations page 2023-06-24 23:31:37 +05:30
c04ea7e731 feat(integration-page): updated components and api hooks 2023-06-24 23:30:27 +05:30
9bdecaf02f removed await-to-js and builder-pattern dependencies from backend 2023-06-24 00:29:31 +01:00
6b222bad01 youtube link change 2023-06-22 19:49:21 -07:00
079d68c042 remove dummy file content 2023-06-22 22:28:39 -04:00
4b800202fb git app with probot 2023-06-22 22:26:23 -04:00
356 changed files with 24914 additions and 13971 deletions

BIN
.github/images/Deploy to AWS.png vendored Normal file (new binary, 2.3 KiB)

BIN
.github/images/deploy-to-aws.png vendored Normal file (new binary, 2.8 KiB)

11
.github/values.yaml vendored
View File

@@ -1,3 +1,10 @@
# secretScanningGitApp:
# enabled: false
# deploymentAnnotations:
# secrets.infisical.com/auto-reload: "true"
# image:
# repository: infisical/staging_deployment_secret-scanning-git-app
frontend:
enabled: true
name: frontend
@@ -51,8 +58,8 @@ mongodbConnection:
ingress:
enabled: true
annotations:
kubernetes.io/ingress.class: "nginx"
# annotations:
# kubernetes.io/ingress.class: "nginx"
# cert-manager.io/issuer: letsencrypt-nginx
hostName: gamma.infisical.com ## <- Replace with your own domain
frontend:

View File

@@ -135,7 +135,7 @@ jobs:
- name: Download helm values to file and upgrade gamma deploy
run: |
wget https://raw.githubusercontent.com/Infisical/infisical/main/.github/values.yaml
helm upgrade infisical infisical-helm-charts/infisical --values values.yaml --wait
helm upgrade infisical infisical-helm-charts/infisical --values values.yaml --wait --install
if [[ $(helm status infisical) == *"FAILED"* ]]; then
echo "Helm upgrade failed"
exit 1
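The failure check in the step above relies on bash's `[[ ... == *pattern* ]]` glob match against the `helm status` output. A standalone sketch of that substring test, using a stand-in string rather than a real helm release (the `status_output` value here is hypothetical):

```shell
# Stand-in for the output of `helm status infisical`.
status_output="STATUS: FAILED"

result="ok"
# [[ ... == *"FAILED"* ]] is a bash glob (substring) match, not a regex.
if [[ "$status_output" == *"FAILED"* ]]; then
  result="failed"
fi
echo "$result"   # prints failed
```

Note that `[[ ]]` is a bashism; the workflow's `run:` block would need bash (the GitHub Actions default shell on Linux runners) rather than plain `sh` for this to work.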

View File

@@ -1,11 +1,17 @@
name: Release standalone docker image
on: [workflow_dispatch]
on:
push:
tags:
- "infisical/v*.*.*"
jobs:
infisical-standalone:
name: Build infisical standalone image
runs-on: ubuntu-latest
steps:
- name: Extract version from tag
id: extract_version
run: echo "::set-output name=version::${GITHUB_REF_NAME#infisical/}"
- name: ☁️ Checkout source
uses: actions/checkout@v3
with:
@@ -64,5 +70,6 @@ jobs:
tags: |
infisical/infisical:latest
infisical/infisical:${{ steps.commit.outputs.short }}
infisical/infisical:${{ steps.extract_version.outputs.version }}
platforms: linux/amd64,linux/arm64
file: Dockerfile.standalone-infisical

View File

@@ -1,10 +1,16 @@
name: Release Docker image for K8 operator
on: [workflow_dispatch]
name: Release Docker image for K8 operator
on:
push:
tags:
- "infisical-k8-operator/v*.*.*"
jobs:
release:
runs-on: ubuntu-latest
steps:
- name: Extract version from tag
id: extract_version
run: echo "::set-output name=version::${GITHUB_REF_NAME#infisical-k8-operator/}"
- uses: actions/checkout@v2
- name: 🔧 Set up QEMU
@@ -26,4 +32,6 @@ jobs:
context: k8-operator
push: true
platforms: linux/amd64,linux/arm64
tags: infisical/kubernetes-operator:latest
tags: |
infisical/kubernetes-operator:latest
infisical/kubernetes-operator:${{ steps.extract_version.outputs.version }}
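Both release workflows above derive the image tag from the pushed git tag via bash parameter expansion. A minimal sketch of that prefix strip, simulating the `GITHUB_REF_NAME` variable that GitHub Actions would normally provide:

```shell
# Simulated; in a real workflow run, GitHub Actions sets GITHUB_REF_NAME
# to the short ref name of the pushed tag.
GITHUB_REF_NAME="infisical-k8-operator/v1.2.3"

# "${var#pattern}" removes the shortest match of the pattern from the
# front of the value, leaving just the bare version for the image tag.
version="${GITHUB_REF_NAME#infisical-k8-operator/}"
echo "$version"   # prints v1.2.3
```

The standalone-image workflow uses the same expansion with the `infisical/` prefix instead.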

View File

@@ -3,17 +3,26 @@
<img width="300" src="/img/logoname-white.svg#gh-dark-mode-only" alt="infisical">
</h1>
<p align="center">
<p align="center">Open-source, end-to-end encrypted platform to manage secrets and configs across your team and infrastructure.</p>
<p align="center"><b>Open-source, end-to-end encrypted secret management platform</b>: distribute secrets/configs across your team/infrastructure and prevent secret leaks.</p>
</p>
<h4 align="center">
<a href="https://join.slack.com/t/infisical-users/shared_invite/zt-1wehzfnzn-1aMo5JcGENJiNAC2SD8Jlg">Slack</a> |
<a href="https://infisical.com/slack">Slack</a> |
<a href="https://infisical.com/">Infisical Cloud</a> |
<a href="https://infisical.com/docs/self-hosting/overview">Self-Hosting</a> |
<a href="https://infisical.com/docs/documentation/getting-started/introduction">Docs</a> |
<a href="https://www.infisical.com">Website</a>
</h4>
<p align="center">
<a href="https://infisical.com/docs/self-hosting/deployment-options/aws-ec2">
<img src=".github/images/deploy-to-aws.png" width="137" />
</a>
<a href="https://infisical.com/docs/self-hosting/deployment-options/digital-ocean-marketplace" alt="Deploy to DigitalOcean">
<img width="200" alt="Deploy to DO" src="https://www.deploytodo.com/do-btn-blue.svg"/>
</a>
</p>
<h4 align="center">
<a href="https://github.com/Infisical/infisical/blob/main/LICENSE">
<img src="https://img.shields.io/badge/license-MIT-blue.svg" alt="Infisical is released under the MIT license." />
@@ -25,9 +34,9 @@
<img src="https://img.shields.io/github/commit-activity/m/infisical/infisical" alt="git commit activity" />
</a>
<a href="https://cloudsmith.io/~infisical/repos/">
<img src="https://img.shields.io/badge/Downloads-305.8k-orange" alt="Cloudsmith downloads" />
<img src="https://img.shields.io/badge/Downloads-395.8k-orange" alt="Cloudsmith downloads" />
</a>
<a href="https://join.slack.com/t/infisical-users/shared_invite/zt-1wehzfnzn-1aMo5JcGENJiNAC2SD8Jlg">
<a href="https://infisical.com/slack">
<img src="https://img.shields.io/badge/chat-on%20Slack-blueviolet" alt="Slack community channel" />
</a>
<a href="https://twitter.com/infisical">
@@ -54,24 +63,18 @@ We're on a mission to make secret management more accessible to everyone, not ju
- **[Secret versioning](https://infisical.com/docs/documentation/platform/secret-versioning)** and **[Point-in-Time Recovery]()** to version every secret and project state
- **[Audit logs](https://infisical.com/docs/documentation/platform/audit-logs)** to record every action taken in a project
- **Role-based Access Controls** per environment
- [**Simple on-premise deployments** to AWS and Digital Ocean](https://infisical.com/docs/self-hosting/overview)
- [**Secret Scanning**](https://infisical.com/docs/cli/scanning-overview)
- [**Simple on-premise deployments** to AWS, Digital Ocean, and more](https://infisical.com/docs/self-hosting/overview)
- [**Secret Scanning and Leak Prevention**](https://infisical.com/docs/cli/scanning-overview)
And much more.
## Getting started
Check out the [Quickstart](https://infisical.com/docs/getting-started/introduction) Guides
Check out the [Quickstart Guides](https://infisical.com/docs/getting-started/introduction)
### Use Infisical Cloud
The fastest and most reliable way to get started with Infisical is signing up for free to [Infisical Cloud](https://app.infisical.com/login).
### Deploy Infisical on premise
<a href="https://infisical.com/docs/self-hosting/deployment-options/digital-ocean-marketplace"><img src=".github/images/do-k8-install-btn.png" width="200"/></a> <a href="https://infisical.com/docs/self-hosting/deployment-options/aws-ec2"><img src=".github/images/deploy-aws-button.png" width="150" width="300" /></a>
View all [deployment options](https://infisical.com/docs/self-hosting/overview)
| Use Infisical Cloud | Deploy Infisical on premise |
| --- | ----------- |
| The fastest and most reliable way to <br> get started with Infisical is signing up <br> for free to [Infisical Cloud](https://app.infisical.com/login). | <a href="https://infisical.com/docs/self-hosting/deployment-options/aws-ec2"><img src=".github/images/deploy-to-aws.png" width="150" width="300" /></a> <a href="https://infisical.com/docs/self-hosting/deployment-options/digital-ocean-marketplace" alt="Deploy to DigitalOcean"> <img width="217" alt="Deploy to DO" src="https://www.deploytodo.com/do-btn-blue.svg"/> </a> <br> View all [deployment options](https://infisical.com/docs/self-hosting/overview) |
### Run Infisical locally
@@ -92,7 +95,7 @@ git clone https://github.com/Infisical/infisical && cd infisical && copy .env.ex
Create an account at `http://localhost:80`
### Scan and prevent secret leaks
On top managing secrets with Infisical, you can also scan for over 140+ secret types in your files, directories and git repositories.
On top managing secrets with Infisical, you can also [scan for over 140+ secret types]() in your files, directories and git repositories.
To scan your full git history, run:
@@ -111,7 +114,11 @@ Lean about Infisical's code scanning feature [here](https://infisical.com/docs/c
## Open-source vs. paid
This repo available under the [MIT expat license](https://github.com/Infisical/infisical/blob/main/LICENSE), with the exception of the `ee` directory which will contain premium enterprise features requiring a Infisical license in the future.
This repo available under the [MIT expat license](https://github.com/Infisical/infisical/blob/main/LICENSE), with the exception of the `ee` directory which will contain premium enterprise features requiring a Infisical license.
If you are interested in managed Infisical Cloud of self-hosted Enterprise Offering, take a look at [our webiste](https://infisical.com/) or [book a meeting with us](https://cal.com/vmatsiiako/infisical-demo):
<a href="https://cal.com/vmatsiiako/infisical-demo"><img alt="Schedule a meeting" src="https://cal.com/book-with-cal-dark.svg" /></a>
## Security
@@ -128,15 +135,15 @@ Whether it's big or small, we love contributions. Check out our guide to see how
Not sure where to get started? You can:
- [Book a free, non-pressure pairing session / code walkthrough with one of our teammates](https://cal.com/tony-infisical/30-min-meeting-contributing)!
- Join our <a href="https://join.slack.com/t/infisical-users/shared_invite/zt-1wehzfnzn-1aMo5JcGENJiNAC2SD8Jlg">Slack</a>, and ask us any questions there.
- Join our <a href="https://infisical.com/slack">Slack</a>, and ask us any questions there.
## Resources
- [Docs](https://infisical.com/docs/documentation/getting-started/introduction) for comprehensive documentation and guides
- [Slack](https://join.slack.com/t/infisical-users/shared_invite/zt-1wehzfnzn-1aMo5JcGENJiNAC2SD8Jlg) for discussion with the community and Infisical team.
- [Slack](https://infisical.com/slack) for discussion with the community and Infisical team.
- [GitHub](https://github.com/Infisical/infisical) for code, issues, and pull requests
- [Twitter](https://twitter.com/infisical) for fast news
- [YouTube](https://www.youtube.com/@infisical5306) for videos on secret management
- [YouTube](https://www.youtube.com/@infisical_os) for videos on secret management
- [Blog](https://infisical.com/blog) for secret management insights, articles, tutorials, and updates
- [Roadmap](https://www.notion.so/infisical/be2d2585a6694e40889b03aef96ea36b?v=5b19a8127d1a4060b54769567a8785fa) for planned features

View File

@@ -1,6 +1,9 @@
{
"parser": "@typescript-eslint/parser",
"plugins": ["@typescript-eslint", "unused-imports"],
"plugins": [
"@typescript-eslint",
"unused-imports"
],
"extends": [
"eslint:recommended",
"plugin:@typescript-eslint/eslint-recommended",
@@ -8,14 +11,29 @@
],
"rules": {
"no-console": 2,
"quotes": ["error", "double", { "avoidEscape": true }],
"comma-dangle": ["error", "only-multiline"],
"quotes": [
"error",
"double",
{
"avoidEscape": true
}
],
"comma-dangle": [
"error",
"only-multiline"
],
"@typescript-eslint/no-unused-vars": "off",
"unused-imports/no-unused-imports": "error",
"@typescript-eslint/no-empty-function": "off",
"unused-imports/no-unused-vars": [
"warn",
{ "vars": "all", "varsIgnorePattern": "^_", "args": "after-used", "argsIgnorePattern": "^_" }
{
"vars": "all",
"varsIgnorePattern": "^_",
"args": "after-used",
"argsIgnorePattern": "^_"
}
],
"sort-imports": ["error", { "ignoreDeclarationSort": true }]
"sort-imports": 1
}
}
}

View File

@@ -19,6 +19,10 @@ RUN npm ci --only-production
COPY --from=build /app .
RUN apk add --no-cache bash curl && curl -1sLf \
'https://dl.cloudsmith.io/public/infisical/infisical-cli/setup.alpine.sh' | bash \
&& apk add infisical=0.8.1
HEALTHCHECK --interval=10s --timeout=3s --start-period=10s \
CMD node healthcheck.js

6646
backend/package-lock.json generated

File diff suppressed because it is too large.

View File

@@ -8,13 +8,11 @@
"@types/crypto-js": "^4.1.1",
"@types/libsodium-wrappers": "^0.7.10",
"argon2": "^0.30.3",
"await-to-js": "^3.0.0",
"aws-sdk": "^2.1364.0",
"axios": "^1.3.5",
"axios-retry": "^3.4.0",
"bcrypt": "^5.1.0",
"bigint-conversion": "^2.4.0",
"builder-pattern": "^2.2.0",
"cookie-parser": "^1.4.6",
"cors": "^2.8.5",
"crypto-js": "^4.1.1",
@@ -38,6 +36,7 @@
"passport": "^0.6.0",
"passport-google-oauth20": "^2.0.0",
"posthog-node": "^2.6.0",
"probot": "^12.3.1",
"query-string": "^7.1.3",
"rate-limit-mongo": "^2.3.2",
"rimraf": "^3.0.2",
@@ -91,6 +90,7 @@
"@types/node": "^18.11.3",
"@types/nodemailer": "^6.4.6",
"@types/passport": "^1.0.12",
"@types/picomatch": "^2.3.0",
"@types/supertest": "^2.0.12",
"@types/swagger-jsdoc": "^6.0.1",
"@types/swagger-ui-express": "^4.1.3",
@@ -104,6 +104,7 @@
"jest-junit": "^15.0.0",
"nodemon": "^2.0.19",
"npm": "^8.19.3",
"smee-client": "^1.2.3",
"supertest": "^6.3.3",
"ts-jest": "^29.0.3",
"ts-node": "^10.9.1"

View File

@@ -10,7 +10,7 @@ export const getEncryptionKey = async () => {
return secretValue === "" ? undefined : secretValue;
}
export const getRootEncryptionKey = async () => {
const secretValue = (await client.getSecret("ROOT_ENCRYPTION_KEY")).secretValue;
const secretValue = (await client.getSecret("ROOT_ENCRYPTION_KEY")).secretValue;
return secretValue === "" ? undefined : secretValue;
}
export const getInviteOnlySignup = async () => (await client.getSecret("INVITE_ONLY_SIGNUP")).secretValue === "true"
@@ -57,6 +57,11 @@ export const getSmtpPassword = async () => (await client.getSecret("SMTP_PASSWOR
export const getSmtpFromAddress = async () => (await client.getSecret("SMTP_FROM_ADDRESS")).secretValue;
export const getSmtpFromName = async () => (await client.getSecret("SMTP_FROM_NAME")).secretValue || "Infisical";
export const getSecretScanningWebhookProxy = async () => (await client.getSecret("SECRET_SCANNING_WEBHOOK_PROXY")).secretValue;
export const getSecretScanningWebhookSecret = async () => (await client.getSecret("SECRET_SCANNING_WEBHOOK_SECRET")).secretValue;
export const getSecretScanningGitAppId = async () => (await client.getSecret("SECRET_SCANNING_GIT_APP_ID")).secretValue;
export const getSecretScanningPrivateKey = async () => (await client.getSecret("SECRET_SCANNING_PRIVATE_KEY")).secretValue;
export const getLicenseKey = async () => {
const secretValue = (await client.getSecret("LICENSE_KEY")).secretValue;
return secretValue === "" ? undefined : secretValue;
@@ -82,4 +87,4 @@ export const getHttpsEnabled = async () => {
}
return (await client.getSecret("HTTPS_ENABLED")).secretValue === "true" && true
}
}

View File

@@ -13,21 +13,25 @@ import * as signupController from "./signupController";
import * as userActionController from "./userActionController";
import * as userController from "./userController";
import * as workspaceController from "./workspaceController";
import * as secretScanningController from "./secretScanningController";
import * as webhookController from "./webhookController";
export {
authController,
botController,
integrationAuthController,
integrationController,
keyController,
membershipController,
membershipOrgController,
organizationController,
passwordController,
secretController,
serviceTokenController,
signupController,
userActionController,
userController,
workspaceController,
authController,
botController,
integrationAuthController,
integrationController,
keyController,
membershipController,
membershipOrgController,
organizationController,
passwordController,
secretController,
serviceTokenController,
signupController,
userActionController,
userController,
workspaceController,
secretScanningController,
webhookController
};

View File

@@ -2,7 +2,7 @@ import { Request, Response } from "express";
import { Types } from "mongoose";
import { Integration } from "../../models";
import { EventService } from "../../services";
import { eventPushSecrets } from "../../events";
import { eventPushSecrets, eventStartIntegration } from "../../events";
import Folder from "../../models/folder";
import { getFolderByPath } from "../../services/FolderService";
import { BadRequestError } from "../../utils/errors";
@@ -27,19 +27,19 @@ export const createIntegration = async (req: Request, res: Response) => {
owner,
path,
region,
secretPath,
secretPath
} = req.body;
const folders = await Folder.findOne({
workspace: req.integrationAuth.workspace._id,
environment: sourceEnvironment,
environment: sourceEnvironment
});
if (folders) {
const folder = getFolderByPath(folders.nodes, secretPath);
if (!folder) {
throw BadRequestError({
message: "Path for service token does not exist",
message: "Path for service token does not exist"
});
}
}
@@ -62,21 +62,21 @@ export const createIntegration = async (req: Request, res: Response) => {
region,
secretPath,
integration: req.integrationAuth.integration,
integrationAuth: new Types.ObjectId(integrationAuthId),
integrationAuth: new Types.ObjectId(integrationAuthId)
}).save();
if (integration) {
// trigger event - push secrets
EventService.handleEvent({
event: eventPushSecrets({
event: eventStartIntegration({
workspaceId: integration.workspace,
environment: sourceEnvironment,
}),
environment: sourceEnvironment
})
});
}
return res.status(200).send({
integration,
integration
});
};
@@ -97,26 +97,26 @@ export const updateIntegration = async (req: Request, res: Response) => {
appId,
targetEnvironment,
owner, // github-specific integration param
secretPath,
secretPath
} = req.body;
const folders = await Folder.findOne({
workspace: req.integration.workspace,
environment,
environment
});
if (folders) {
const folder = getFolderByPath(folders.nodes, secretPath);
if (!folder) {
throw BadRequestError({
message: "Path for service token does not exist",
message: "Path for service token does not exist"
});
}
}
const integration = await Integration.findOneAndUpdate(
{
_id: req.integration._id,
_id: req.integration._id
},
{
environment,
@@ -125,25 +125,25 @@
appId,
targetEnvironment,
owner,
secretPath,
secretPath
},
{
new: true,
new: true
}
);
if (integration) {
// trigger event - push secrets
EventService.handleEvent({
event: eventPushSecrets({
event: eventStartIntegration({
workspaceId: integration.workspace,
environment,
}),
environment
})
});
}
return res.status(200).send({
integration,
integration
});
};
@@ -158,12 +158,12 @@ export const deleteIntegration = async (req: Request, res: Response) => {
const { integrationId } = req.params;
const integration = await Integration.findOneAndDelete({
_id: integrationId,
_id: integrationId
});
if (!integration) throw new Error("Failed to find integration");
return res.status(200).send({
integration,
integration
});
};

View File

@@ -80,7 +80,8 @@ export const pushSecrets = async (req: Request, res: Response) => {
EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment
environment,
secretPath: "/"
})
});

View File

@@ -0,0 +1,91 @@
import { Request, Response } from "express";
import GitAppInstallationSession from "../../models/gitAppInstallationSession";
import crypto from "crypto";
import { Types } from "mongoose";
import { UnauthorizedRequestError } from "../../utils/errors";
import GitAppOrganizationInstallation from "../../models/gitAppOrganizationInstallation";
import { MembershipOrg } from "../../models";
import GitRisks, { STATUS_RESOLVED_FALSE_POSITIVE, STATUS_RESOLVED_NOT_REVOKED, STATUS_RESOLVED_REVOKED } from "../../models/gitRisks";
export const createInstallationSession = async (req: Request, res: Response) => {
const sessionId = crypto.randomBytes(16).toString("hex");
await GitAppInstallationSession.findByIdAndUpdate(
req.organization,
{
organization: new Types.ObjectId(req.organization),
sessionId: sessionId,
user: new Types.ObjectId(req.user._id)
},
{ upsert: true }
).lean();
res.send({
sessionId: sessionId
})
}
export const linkInstallationToOrganization = async (req: Request, res: Response) => {
const { installationId, sessionId } = req.body
const installationSession = await GitAppInstallationSession.findOneAndDelete({ sessionId: sessionId })
if (!installationSession) {
throw UnauthorizedRequestError()
}
const userMembership = await MembershipOrg.findOne({ user: req.user._id, organization: installationSession.organization })
if (!userMembership) {
throw UnauthorizedRequestError()
}
const installationLink = await GitAppOrganizationInstallation.findOneAndUpdate({
organizationId: installationSession.organization,
}, {
installationId: installationId,
organizationId: installationSession.organization,
user: installationSession.user
}, {
upsert: true
}).lean()
res.json(installationLink)
}
export const getCurrentOrganizationInstallationStatus = async (req: Request, res: Response) => {
const { organizationId } = req.params
try {
const appInstallation = await GitAppOrganizationInstallation.findOne({ organizationId: organizationId }).lean()
if (!appInstallation) {
return res.json({
appInstallationComplete: false
})
}
return res.json({
appInstallationComplete: true
})
} catch {
res.json({
appInstallationComplete: false
})
}
}
export const getRisksForOrganization = async (req: Request, res: Response) => {
const { organizationId } = req.params
const risks = await GitRisks.find({ organization: organizationId }).sort({ createdAt: -1 }).lean()
res.json({
risks: risks
})
}
export const updateRisksStatus = async (req: Request, res: Response) => {
const { riskId } = req.params
const { status } = req.body
const isRiskResolved = status === STATUS_RESOLVED_FALSE_POSITIVE || status === STATUS_RESOLVED_REVOKED || status === STATUS_RESOLVED_NOT_REVOKED
const risk = await GitRisks.findByIdAndUpdate(riskId, {
status: status,
isResolved: isRiskResolved
}).lean()
res.json(risk)
}
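The resolved-status check in `updateRisksStatus` above can also be written as a set-membership test, which scales better as statuses are added. A sketch with placeholder status strings (the real constants come from `../../models/gitRisks`):

```javascript
// Placeholder status values standing in for the gitRisks model's constants.
const RESOLVED_STATUSES = new Set([
  "RESOLVED_FALSE_POSITIVE",
  "RESOLVED_REVOKED",
  "RESOLVED_NOT_REVOKED"
]);

// A risk counts as resolved when its status is any of the resolved variants.
const isRiskResolved = (status) => RESOLVED_STATUSES.has(status);

console.log(isRiskResolved("RESOLVED_REVOKED")); // true
console.log(isRiskResolved("UNRESOLVED"));       // false
```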

View File

@@ -0,0 +1,140 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { client, getRootEncryptionKey } from "../../config";
import { validateMembership } from "../../helpers";
import Webhook from "../../models/webhooks";
import { getWebhookPayload, triggerWebhookRequest } from "../../services/WebhookService";
import { BadRequestError } from "../../utils/errors";
import { ADMIN, ALGORITHM_AES_256_GCM, ENCODING_SCHEME_BASE64, MEMBER } from "../../variables";
export const createWebhook = async (req: Request, res: Response) => {
const { webhookUrl, webhookSecretKey, environment, workspaceId, secretPath } = req.body;
const webhook = new Webhook({
workspace: workspaceId,
environment,
secretPath,
url: webhookUrl,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_BASE64
});
if (webhookSecretKey) {
const rootEncryptionKey = await getRootEncryptionKey();
const { ciphertext, iv, tag } = client.encryptSymmetric(webhookSecretKey, rootEncryptionKey);
webhook.iv = iv;
webhook.tag = tag;
webhook.encryptedSecretKey = ciphertext;
}
await webhook.save();
return res.status(200).send({
webhook,
message: "successfully created webhook"
});
};
export const updateWebhook = async (req: Request, res: Response) => {
const { webhookId } = req.params;
const { isDisabled } = req.body;
const webhook = await Webhook.findById(webhookId);
if (!webhook) {
throw BadRequestError({ message: "Webhook not found!!" });
}
// check that user is a member of the workspace
await validateMembership({
userId: req.user._id.toString(),
workspaceId: webhook.workspace,
acceptedRoles: [ADMIN, MEMBER]
});
if (isDisabled !== undefined) {
webhook.isDisabled = isDisabled;
}
await webhook.save();
return res.status(200).send({
webhook,
message: "successfully updated webhook"
});
};
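One detail in `updateWebhook` above is easy to get wrong in JavaScript/TypeScript: `typeof` always returns a string, so a guard written as `typeof x !== undefined` compares that string against the value `undefined` and is true for every `x`. A minimal Node.js sketch of the pitfall:

```javascript
// `typeof` yields a string such as "undefined", never the value undefined.
const isDisabled = undefined;

console.log(typeof isDisabled);                 // "undefined" (a string)
console.log(typeof isDisabled !== undefined);   // true for any value
console.log(typeof isDisabled !== "undefined"); // false — the comparison usually intended
```

The value comparison `isDisabled !== undefined` (or the string comparison `typeof isDisabled !== "undefined"`) expresses the intended check.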
export const deleteWebhook = async (req: Request, res: Response) => {
const { webhookId } = req.params;
const webhook = await Webhook.findById(webhookId);
if (!webhook) {
throw BadRequestError({ message: "Webhook not found!!" });
}
await validateMembership({
userId: req.user._id.toString(),
workspaceId: webhook.workspace,
acceptedRoles: [ADMIN, MEMBER]
});
await webhook.remove();
return res.status(200).send({
message: "successfully removed webhook"
});
};
export const testWebhook = async (req: Request, res: Response) => {
const { webhookId } = req.params;
const webhook = await Webhook.findById(webhookId);
if (!webhook) {
throw BadRequestError({ message: "Webhook not found!!" });
}
await validateMembership({
userId: req.user._id.toString(),
workspaceId: webhook.workspace,
acceptedRoles: [ADMIN, MEMBER]
});
try {
await triggerWebhookRequest(
webhook,
getWebhookPayload(
"test",
webhook.workspace.toString(),
webhook.environment,
webhook.secretPath
)
);
await Webhook.findByIdAndUpdate(webhookId, {
lastStatus: "success",
lastRunErrorMessage: null
});
} catch (err) {
await Webhook.findByIdAndUpdate(webhookId, {
lastStatus: "failed",
lastRunErrorMessage: (err as Error).message
});
return res.status(400).send({
message: "Failed to receive response",
error: (err as Error).message
});
}
return res.status(200).send({
message: "Successfully received response"
});
};
export const listWebhooks = async (req: Request, res: Response) => {
const { environment, workspaceId, secretPath } = req.query;
const optionalFilters: Record<string, string> = {};
if (environment) optionalFilters.environment = environment as string;
if (secretPath) optionalFilters.secretPath = secretPath as string;
const webhooks = await Webhook.find({
workspace: new Types.ObjectId(workspaceId as string),
...optionalFilters
});
return res.status(200).send({
webhooks
});
};

View File

@@ -1,72 +0,0 @@
import { Request, Response } from "express";
import crypto from "crypto";
import bcrypt from "bcrypt";
import { APIKeyData } from "../../models";
import { getSaltRounds } from "../../config";
/**
* Return API key data for user with id [req.user_id]
* @param req
* @param res
* @returns
*/
export const getAPIKeyData = async (req: Request, res: Response) => {
const apiKeyData = await APIKeyData.find({
user: req.user._id,
});
return res.status(200).send({
apiKeyData,
});
};
/**
* Create new API key data for user with id [req.user._id]
* @param req
* @param res
*/
export const createAPIKeyData = async (req: Request, res: Response) => {
const { name, expiresIn } = req.body;
const secret = crypto.randomBytes(16).toString("hex");
const secretHash = await bcrypt.hash(secret, await getSaltRounds());
const expiresAt = new Date();
expiresAt.setSeconds(expiresAt.getSeconds() + expiresIn);
let apiKeyData = await new APIKeyData({
name,
lastUsed: new Date(),
expiresAt,
user: req.user._id,
secretHash,
}).save();
// return api key data without sensitive data
// FIX: fix this any
apiKeyData = (await APIKeyData.findById(apiKeyData._id)) as any;
if (!apiKeyData) throw new Error("Failed to find API key data");
const apiKey = `ak.${apiKeyData._id.toString()}.${secret}`;
return res.status(200).send({
apiKey,
apiKeyData,
});
};
/**
* Delete API key data with id [apiKeyDataId].
* @param req
* @param res
* @returns
*/
export const deleteAPIKeyData = async (req: Request, res: Response) => {
const { apiKeyDataId } = req.params;
const apiKeyData = await APIKeyData.findByIdAndDelete(apiKeyDataId);
return res.status(200).send({
apiKeyData,
});
};

View File

@@ -4,7 +4,6 @@ import * as usersController from "./usersController";
import * as organizationsController from "./organizationsController";
import * as workspaceController from "./workspaceController";
import * as serviceTokenDataController from "./serviceTokenDataController";
import * as apiKeyDataController from "./apiKeyDataController";
import * as secretController from "./secretController";
import * as secretsController from "./secretsController";
import * as serviceAccountsController from "./serviceAccountsController";
@@ -18,7 +17,6 @@
organizationsController,
workspaceController,
serviceTokenDataController,
apiKeyDataController,
secretController,
secretsController,
serviceAccountsController,

View File

@@ -1,4 +1,3 @@
import to from "await-to-js";
import { Request, Response } from "express";
import mongoose, { Types } from "mongoose";
import Secret, { ISecret } from "../../models/secret";
@@ -56,10 +55,7 @@ export const createSecret = async (req: Request, res: Response) => {
keyEncoding: ENCODING_SCHEME_UTF8
};
const [error, secret] = await to(Secret.create(sanitizedSecret).then());
if (error instanceof ValidationError) {
throw RouteValidationError({ message: error.message, stack: error.stack });
}
const secret = await new Secret(sanitizedSecret).save();
if (postHogClient) {
postHogClient.capture({
@@ -81,7 +77,7 @@
};
/**
* Create many secrets for workspace wiht id [workspaceId] and environment [environment]
* Create many secrets for workspace with id [workspaceId] and environment [environment]
* @param req
* @param res
*/
@@ -116,20 +112,7 @@ export const createSecrets = async (req: Request, res: Response) => {
sanitizedSecretesToCreate.push(safeUpdateFields);
});
const [bulkCreateError, secrets] = await to(Secret.insertMany(sanitizedSecretesToCreate).then());
if (bulkCreateError) {
if (bulkCreateError instanceof ValidationError) {
throw RouteValidationError({
message: bulkCreateError.message,
stack: bulkCreateError.stack
});
}
throw InternalServerError({
message: "Unable to process your batch create request. Please try again",
stack: bulkCreateError.stack
});
}
const secrets = await Secret.insertMany(sanitizedSecretesToCreate);
if (postHogClient) {
postHogClient.capture({
@@ -160,14 +143,7 @@ export const deleteSecrets = async (req: Request, res: Response) => {
const { workspaceId, environmentName } = req.params;
const secretIdsToDelete: string[] = req.body.secretIds;
const [secretIdsUserCanDeleteError, secretIdsUserCanDelete] = await to(
Secret.find({ workspace: workspaceId, environment: environmentName }, { _id: 1 }).then()
);
if (secretIdsUserCanDeleteError) {
throw InternalServerError({
message: `Unable to fetch secrets you own: [error=${secretIdsUserCanDeleteError.message}]`
});
}
const secretIdsUserCanDelete = await Secret.find({ workspace: workspaceId, environment: environmentName }, { _id: 1 });
const secretsUserCanDeleteSet: Set<string> = new Set(
secretIdsUserCanDelete.map((objectId) => objectId._id.toString())
@@ -189,16 +165,7 @@
}
});
const [bulkDeleteError] = await to(Secret.bulkWrite(deleteOperationsToPerform).then());
if (bulkDeleteError) {
if (bulkDeleteError instanceof ValidationError) {
throw RouteValidationError({
message: "Unable to apply modifications, please try again",
stack: bulkDeleteError.stack
});
}
throw InternalServerError();
}
await Secret.bulkWrite(deleteOperationsToPerform);
if (postHogClient) {
postHogClient.capture({
@@ -255,12 +222,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
const postHogClient = await TelemetryService.getPostHogClient();
const { workspaceId, environmentName } = req.params;
const secretsModificationsRequested: ModifySecretRequestBody[] = req.body.secrets;
const [secretIdsUserCanModifyError, secretIdsUserCanModify] = await to(
Secret.find({ workspace: workspaceId, environment: environmentName }, { _id: 1 }).then()
);
if (secretIdsUserCanModifyError) {
throw InternalServerError({ message: "Unable to fetch secrets you own" });
}
const secretIdsUserCanModify = await Secret.find({ workspace: workspaceId, environment: environmentName }, { _id: 1 });
const secretsUserCanModifySet: Set<string> = new Set(
secretIdsUserCanModify.map((objectId) => objectId._id.toString())
@@ -298,19 +260,7 @@
}
});
const [bulkModificationInfoError, bulkModificationInfo] = await to(
Secret.bulkWrite(updateOperationsToPerform).then()
);
if (bulkModificationInfoError) {
if (bulkModificationInfoError instanceof ValidationError) {
throw RouteValidationError({
message: "Unable to apply modifications, please try again",
stack: bulkModificationInfoError.stack
});
}
throw InternalServerError();
}
await Secret.bulkWrite(updateOperationsToPerform);
if (postHogClient) {
postHogClient.capture({
@@ -340,12 +290,7 @@ export const updateSecret = async (req: Request, res: Response) => {
const { workspaceId, environmentName } = req.params;
const secretModificationsRequested: ModifySecretRequestBody = req.body.secret;
const [secretIdUserCanModifyError, secretIdUserCanModify] = await to(
Secret.findOne({ workspace: workspaceId, environment: environmentName }, { _id: 1 }).then()
);
if (secretIdUserCanModifyError && !secretIdUserCanModify) {
throw BadRequestError();
}
await Secret.findOne({ workspace: workspaceId, environment: environmentName }, { _id: 1 });
const sanitizedSecret: SanitizedSecretModify = {
secretKeyCiphertext: secretModificationsRequested.secretKeyCiphertext,
@@ -362,18 +307,20 @@
secretCommentHash: secretModificationsRequested.secretCommentHash
};
const [error, singleModificationUpdate] = await to(
Secret.updateOne(
{ _id: secretModificationsRequested._id, workspace: workspaceId },
{ $inc: { version: 1 }, $set: sanitizedSecret }
).then()
);
if (error instanceof ValidationError) {
throw RouteValidationError({
message: "Unable to apply modifications, please try again",
stack: error.stack
});
}
const singleModificationUpdate = await Secret.updateOne(
{ _id: secretModificationsRequested._id, workspace: workspaceId },
{ $inc: { version: 1 }, $set: sanitizedSecret }
)
.catch((error) => {
if (error instanceof ValidationError) {
throw RouteValidationError({
message: "Unable to apply modifications, please try again",
stack: error.stack
});
}
throw error;
});
if (postHogClient) {
postHogClient.capture({
@@ -419,21 +366,18 @@ export const getSecrets = async (req: Request, res: Response) => {
userEmail = user.email;
}
const [err, secrets] = await to(
Secret.find({
workspace: workspaceId,
environment,
$or: [{ user: userId }, { user: { $exists: false } }],
type: { $in: [SECRET_SHARED, SECRET_PERSONAL] }
}).then()
);
if (err) {
const secrets = await Secret.find({
workspace: workspaceId,
environment,
$or: [{ user: userId }, { user: { $exists: false } }],
type: { $in: [SECRET_SHARED, SECRET_PERSONAL] }
})
.catch((err) => {
throw RouteValidationError({
message: "Failed to get secrets, please try again",
stack: err.stack
});
}
})
if (postHogClient) {
postHogClient.capture({

View File

@@ -9,7 +9,7 @@ import {
ACTION_UPDATE_SECRETS,
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_UTF8,
SECRET_PERSONAL,
SECRET_PERSONAL
} from "../../variables";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
import { EventService } from "../../services";
@@ -21,7 +21,7 @@ import { PERMISSION_WRITE_SECRETS } from "../../variables";
import {
userHasNoAbility,
userHasWorkspaceAccess,
userHasWriteOnlyAbility,
userHasWriteOnlyAbility
} from "../../ee/helpers/checkMembershipPermissions";
import Tag from "../../models/tag";
import _ from "lodash";
@@ -31,7 +31,10 @@ import {
getFolderByPath,
getFolderIdFromServiceToken,
searchByFolderId,
searchByFolderIdWithDir
} from "../../services/FolderService";
import { isValidScope } from "../../helpers/secrets";
import path from "path";
/**
* Perform a batch of any specified CUD secret operations
@@ -46,14 +49,13 @@ export const batchSecrets = async (req: Request, res: Response) => {
const {
workspaceId,
environment,
requests,
secretPath,
requests
}: {
workspaceId: string;
environment: string;
requests: BatchSecretRequest[];
secretPath: string;
} = req.body;
let secretPath = req.body.secretPath as string;
let folderId = req.body.folderId as string;
const createSecrets: BatchSecret[] = [];
@@ -63,31 +65,31 @@
// get secret blind index salt
const salt = await SecretService.getSecretBlindIndexSalt({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
const folders = await Folder.findOne({ workspace: workspaceId, environment });
if (folders && folderId !== "root") {
const folder = searchByFolderId(folders.nodes, folderId as string);
if (!folder) throw BadRequestError({ message: "Folder not found" });
}
if (req.authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = req.authData.authPayload;
const isValidScopeAccess = isValidScope(req.authData.authPayload, environment, secretPath);
// with a service token and no secretPath, folderId must be "root"
// this avoids accepting arbitrary folder ids from service tokens
if (
(!secretPath && folderId !== "root") ||
(secretPath && secretPath !== serviceTkScopedSecretPath)
) {
if ((!secretPath && folderId !== "root") || (secretPath && !isValidScopeAccess)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
if (secretPath) {
folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
}
if (folders && folderId !== "root") {
const folder = searchByFolderIdWithDir(folders.nodes, folderId as string);
if (!folder?.folder) throw BadRequestError({ message: "Folder not found" });
secretPath = path.join(
"/",
...folder.dir.map(({ name }) => name).filter((name) => name !== "root")
);
}
@@ -97,12 +99,10 @@ export const batchSecrets = async (req: Request, res: Response) => {
let secretBlindIndex = "";
switch (request.method) {
case "POST":
secretBlindIndex = await SecretService.generateSecretBlindIndexWithSalt(
{
secretName: request.secret.secretName,
salt,
}
);
secretBlindIndex = await SecretService.generateSecretBlindIndexWithSalt({
secretName: request.secret.secretName,
salt
});
createSecrets.push({
...request.secret,
@@ -113,16 +113,14 @@
folder: folderId,
secretBlindIndex,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
});
break;
case "PATCH":
secretBlindIndex = await SecretService.generateSecretBlindIndexWithSalt(
{
secretName: request.secret.secretName,
salt,
}
);
secretBlindIndex = await SecretService.generateSecretBlindIndexWithSalt({
secretName: request.secret.secretName,
salt
});
updateSecrets.push({
...request.secret,
@@ -130,7 +128,7 @@
secretBlindIndex,
folder: folderId,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
});
break;
case "DELETE":
@@ -150,9 +148,9 @@
...n._doc,
_id: new Types.ObjectId(),
secret: n._id,
isDeleted: false,
isDeleted: false
};
}),
})
});
const addAction = (await EELogService.createAction({
@@ -161,7 +159,7 @@
serviceAccountId: req.serviceAccount?._id,
serviceTokenDataId: req.serviceTokenData?._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: createdSecrets.map((n) => n._id),
secretIds: createdSecrets.map((n) => n._id)
})) as IAction;
actions.push(addAction);
@@ -175,8 +173,8 @@
workspaceId,
folderId,
channel,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
}
@@ -195,7 +193,7 @@
listedSecretsObj = req.secrets.reduce(
(obj: any, secret: ISecret) => ({
...obj,
[secret._id.toString()]: secret,
[secret._id.toString()]: secret
}),
{}
);
@@ -204,16 +202,16 @@
updateOne: {
filter: {
_id: new Types.ObjectId(u._id),
workspace: new Types.ObjectId(workspaceId),
workspace: new Types.ObjectId(workspaceId)
},
update: {
$inc: {
version: 1,
version: 1
},
...u,
_id: new Types.ObjectId(u._id),
},
},
_id: new Types.ObjectId(u._id)
}
}
}));
await Secret.bulkWrite(updateOperations);
@@ -240,25 +238,25 @@
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
tags: u.tags,
folder: u.folder,
folder: u.folder
})
);
await EESecretService.addSecretVersions({
secretVersions,
secretVersions
});
updatedSecrets = await Secret.find({
_id: {
$in: updateSecrets.map((u) => new Types.ObjectId(u._id)),
},
$in: updateSecrets.map((u) => new Types.ObjectId(u._id))
}
});
const updateAction = (await EELogService.createAction({
name: ACTION_UPDATE_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: updatedSecrets.map((u) => u._id),
secretIds: updatedSecrets.map((u) => u._id)
})) as IAction;
actions.push(updateAction);
@@ -272,8 +270,8 @@
workspaceId,
folderId,
channel,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
}
@@ -282,19 +280,19 @@
if (deleteSecrets.length > 0) {
await Secret.deleteMany({
_id: {
$in: deleteSecrets,
},
$in: deleteSecrets
}
});
await EESecretService.markDeletedSecretVersions({
secretIds: deleteSecrets,
secretIds: deleteSecrets
});
const deleteAction = (await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: deleteSecrets,
secretIds: deleteSecrets
})) as IAction;
actions.push(deleteAction);
@@ -307,8 +305,8 @@
environment,
workspaceId,
channel: channel,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
}
@@ -320,7 +318,7 @@
workspaceId: new Types.ObjectId(workspaceId),
actions,
channel,
ipAddress: req.realIP,
ipAddress: req.realIP
});
}
@@ -328,14 +326,17 @@
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
}),
environment,
// defaults to the root path; otherwise derived from the given path or folder id
secretPath: secretPath || "/"
})
});
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId: new Types.ObjectId(workspaceId),
environment,
folderId,
folderId
});
const resObj: { [key: string]: ISecret[] | string[] } = {};
@@ -418,7 +419,7 @@ export const createSecrets = async (req: Request, res: Response) => {
const {
workspaceId,
environment,
secretPath,
secretPath
}: {
workspaceId: string;
environment: string;
@@ -435,8 +436,7 @@
);
if (!hasAccess) {
throw UnauthorizedRequestError({
message:
"You do not have the necessary permission(s) perform this action",
message: "You do not have the necessary permission(s) perform this action"
});
}
}
@@ -449,28 +449,27 @@
// case: create 1 secret
listOfSecretsToCreate = [req.body.secrets];
}
if (req.authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = req.authData.authPayload;
const isValidScopeAccess = isValidScope(
req.authData.authPayload,
environment,
secretPath || "/"
);
// with a service token and no secretPath, folderId must be "root"
// this avoids accepting arbitrary folder ids from service tokens
if (
(!secretPath && folderId !== "root") ||
(secretPath && secretPath !== serviceTkScopedSecretPath)
) {
if ((!secretPath && folderId !== "root") || (secretPath && !isValidScopeAccess)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
if (secretPath) {
folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
);
folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
}
// get secret blind index salt
const salt = await SecretService.getSecretBlindIndexSalt({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
type secretsToCreateType = {
@@ -502,15 +501,14 @@
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags,
tags
}: secretsToCreateType) => {
let secretBlindIndex;
if (secretName) {
secretBlindIndex =
await SecretService.generateSecretBlindIndexWithSalt({
secretName,
salt,
});
secretBlindIndex = await SecretService.generateSecretBlindIndexWithSalt({
secretName,
salt
});
}
return {
@@ -532,22 +530,24 @@
secretCommentTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
tags,
tags
};
}
)
);
const newlyCreatedSecrets: ISecret[] = (
await Secret.insertMany(secretsToInsert)
).map((insertedSecret) => insertedSecret.toObject());
const newlyCreatedSecrets: ISecret[] = (await Secret.insertMany(secretsToInsert)).map(
(insertedSecret) => insertedSecret.toObject()
);
setTimeout(async () => {
// trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
}),
environment,
secretPath: secretPath || "/"
})
});
}, 5000);
@@ -567,7 +567,7 @@
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueTag
}) =>
new SecretVersion({
secret: _id,
@@ -586,9 +586,9 @@
secretValueTag,
folder: folderId,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
})
),
)
});
const addAction = await EELogService.createAction({
@ -597,7 +597,7 @@ export const createSecrets = async (req: Request, res: Response) => {
serviceAccountId: req.serviceAccount?._id,
serviceTokenDataId: req.serviceTokenData?._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: newlyCreatedSecrets.map((n) => n._id),
secretIds: newlyCreatedSecrets.map((n) => n._id)
});
// (EE) create (audit) log
@@ -609,14 +609,14 @@
workspaceId: new Types.ObjectId(workspaceId),
actions: [addAction],
channel,
ipAddress: req.realIP,
ipAddress: req.realIP
}));
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId: new Types.ObjectId(workspaceId),
environment,
folderId,
folderId
});
const postHogClient = await TelemetryService.getPostHogClient();
@@ -624,7 +624,7 @@
postHogClient.capture({
event: "secrets added",
distinctId: await TelemetryService.getDistinctId({
authData: req.authData,
authData: req.authData
}),
properties: {
numberOfSecrets: listOfSecretsToCreate.length,
@ -632,13 +632,13 @@ export const createSecrets = async (req: Request, res: Response) => {
workspaceId,
channel: channel,
folderId,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
return res.status(200).send({
secrets: newlyCreatedSecrets,
secrets: newlyCreatedSecrets
});
};
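The handlers in this file repeatedly derive the telemetry `channel` from the request's `User-Agent` header (browsers identify themselves with "Mozilla"; the CLI does not). A minimal sketch of that inline pattern; `detectChannel` is a hypothetical helper name, the handlers embed the expression directly:

```typescript
// Hypothetical helper mirroring the inline expression in these handlers:
// a User-Agent containing "mozilla" is treated as a browser ("web"),
// anything else (including a missing header) as the CLI.
function detectChannel(userAgent?: string): "web" | "cli" {
  return userAgent?.toLowerCase().includes("mozilla") ? "web" : "cli";
}
```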
@ -696,10 +696,7 @@ export const getSecrets = async (req: Request, res: Response) => {
const environment = req.query.environment as string;
const folders = await Folder.findOne({ workspace: workspaceId, environment });
if (
(!folders && folderId && folderId !== "root") ||
(!folders && secretPath)
) {
if ((!folders && folderId && folderId !== "root") || (!folders && secretPath)) {
res.send({ secrets: [] });
return;
}
@ -712,13 +709,15 @@ export const getSecrets = async (req: Request, res: Response) => {
}
if (req.authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = req.authData.authPayload;
const isValidScopeAccess = isValidScope(
req.authData.authPayload,
environment,
(secretPath as string) || "/"
);
// in service token when not giving secretpath folderid must be root
// this is to avoid giving folderid when service tokens are used
if (
(!secretPath && folderId !== "root") ||
(secretPath && secretPath !== serviceTkScopedSecretPath)
) {
if ((!secretPath && folderId !== "root") || (secretPath && !isValidScopeAccess)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
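The change above replaces an exact-path comparison with an `isValidScope` check against the service token's scopes. A rough sketch of what such a check could look like, assuming scopes are exact environment/path pairs; the helper name and shape here are hypothetical, and the real implementation may support nesting or globs:

```typescript
// Assumed shape of one service-token scope (sketch only).
interface TokenScope {
  environment: string;
  secretPath: string;
}

// Returns true when some scope grants access to the requested
// environment and secret path (exact match assumed).
function isScopeAllowed(scopes: TokenScope[], environment: string, secretPath: string): boolean {
  return scopes.some((s) => s.environment === environment && s.secretPath === secretPath);
}
```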
@ -738,8 +737,7 @@ export const getSecrets = async (req: Request, res: Response) => {
// query tags table to get all tags ids for the tag names for the given workspace
let tagIds = [];
const tagNamesList =
typeof tagSlugs === "string" && tagSlugs !== "" ? tagSlugs.split(",") : [];
const tagNamesList = typeof tagSlugs === "string" && tagSlugs !== "" ? tagSlugs.split(",") : [];
if (tagNamesList != undefined && tagNamesList.length != 0) {
const workspaceFromDB = await Tag.find({ workspace: workspaceId });
tagIds = _.map(tagNamesList, (tagName: string) => {
@ -762,8 +760,7 @@ export const getSecrets = async (req: Request, res: Response) => {
);
if (hasNoAccess) {
throw UnauthorizedRequestError({
message:
"You do not have the necessary permission(s) perform this action",
message: "You do not have the necessary permission(s) perform this action"
});
}
@ -773,8 +770,8 @@ export const getSecrets = async (req: Request, res: Response) => {
folder: folderId,
$or: [
{ user: req.user._id }, // personal secrets for this user
{ user: { $exists: false } }, // shared secrets from workspace
],
{ user: { $exists: false } } // shared secrets from workspace
]
};
if (tagIds.length > 0) {
@ -801,8 +798,8 @@ export const getSecrets = async (req: Request, res: Response) => {
environment,
$or: [
{ user: userId }, // personal secrets for this user
{ user: { $exists: false } }, // shared secrets from workspace
],
{ user: { $exists: false } } // shared secrets from workspace
]
};
if (tagIds.length > 0) {
@ -820,7 +817,7 @@ export const getSecrets = async (req: Request, res: Response) => {
workspace: workspaceId,
environment,
folder: folderId,
user: { $exists: false }, // shared secrets only from workspace
user: { $exists: false } // shared secrets only from workspace
};
if (tagIds.length > 0) {
@ -838,7 +835,7 @@ export const getSecrets = async (req: Request, res: Response) => {
serviceAccountId: req.serviceAccount?._id,
serviceTokenDataId: req.serviceTokenData?._id,
workspaceId: new Types.ObjectId(workspaceId as string),
secretIds: secrets.map((n: any) => n._id),
secretIds: secrets.map((n: any) => n._id)
});
readAction &&
@ -849,7 +846,7 @@ export const getSecrets = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(workspaceId as string),
actions: [readAction],
channel,
ipAddress: req.realIP,
ipAddress: req.realIP
}));
const postHogClient = await TelemetryService.getPostHogClient();
@ -857,7 +854,7 @@ export const getSecrets = async (req: Request, res: Response) => {
postHogClient.capture({
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({
authData: req.authData,
authData: req.authData
}),
properties: {
numberOfSecrets: secrets.length,
@ -865,13 +862,13 @@ export const getSecrets = async (req: Request, res: Response) => {
workspaceId,
channel,
folderId,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
return res.status(200).send({
secrets,
secrets
});
};
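The collapsed `tagNamesList` expression above filters secrets by a comma-separated `tagSlugs` query parameter. The same guard as a standalone sketch (hypothetical helper name): only a non-empty string is split, any other input yields an empty list.

```typescript
// Mirrors the inline guard above: a non-empty string is split on commas;
// undefined, arrays, or "" all fall through to an empty list.
function parseTagSlugs(tagSlugs: unknown): string[] {
  return typeof tagSlugs === "string" && tagSlugs !== "" ? tagSlugs.split(",") : [];
}
```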
@ -925,9 +922,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
}
}
*/
const channel = req.headers?.["user-agent"]?.toLowerCase().includes("mozilla")
? "web"
: "cli";
const channel = req.headers?.["user-agent"]?.toLowerCase().includes("mozilla") ? "web" : "cli";
interface PatchSecret {
id: string;
@ -943,51 +938,47 @@ export const updateSecrets = async (req: Request, res: Response) => {
tags: string[];
}
const updateOperationsToPerform = req.body.secrets.map(
(secret: PatchSecret) => {
const {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags,
} = secret;
const updateOperationsToPerform = req.body.secrets.map((secret: PatchSecret) => {
const {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags
} = secret;
return {
updateOne: {
filter: { _id: new Types.ObjectId(secret.id) },
update: {
$inc: {
version: 1,
},
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
tags,
...(secretCommentCiphertext !== undefined &&
secretCommentIV &&
secretCommentTag
? {
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
}
: {}),
return {
updateOne: {
filter: { _id: new Types.ObjectId(secret.id) },
update: {
$inc: {
version: 1
},
},
};
}
);
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
tags,
...(secretCommentCiphertext !== undefined && secretCommentIV && secretCommentTag
? {
secretCommentCiphertext,
secretCommentIV,
secretCommentTag
}
: {})
}
}
};
});
await Secret.bulkWrite(updateOperationsToPerform);
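Each entry produced by the map above is a MongoDB `bulkWrite` `updateOne` operation that bumps `version` atomically via `$inc` while applying the new ciphertext fields as a plain update (Mongoose accepts bare fields alongside operators). A minimal sketch of that shape, with a hypothetical `buildUpdateOp` helper and the patch fields abbreviated:

```typescript
// Builds one bulkWrite entry: $inc bumps the version atomically,
// the remaining fields are applied as a plain update (Mongoose-style).
function buildUpdateOp(id: string, patch: Record<string, unknown>) {
  return {
    updateOne: {
      filter: { _id: id },
      update: { $inc: { version: 1 }, ...patch }
    }
  };
}
```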
@ -1009,7 +1000,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags,
tags
} = secretModificationsBySecretId[secret._id.toString()];
return {
@ -1018,9 +1009,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
workspace: secret.workspace,
type: secret.type,
environment: secret.environment,
secretKeyCiphertext: secretKeyCiphertext
? secretKeyCiphertext
: secret.secretKeyCiphertext,
secretKeyCiphertext: secretKeyCiphertext ? secretKeyCiphertext : secret.secretKeyCiphertext,
secretKeyIV: secretKeyIV ? secretKeyIV : secret.secretKeyIV,
secretKeyTag: secretKeyTag ? secretKeyTag : secret.secretKeyTag,
secretValueCiphertext: secretValueCiphertext
@ -1031,17 +1020,13 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretCommentCiphertext: secretCommentCiphertext
? secretCommentCiphertext
: secret.secretCommentCiphertext,
secretCommentIV: secretCommentIV
? secretCommentIV
: secret.secretCommentIV,
secretCommentTag: secretCommentTag
? secretCommentTag
: secret.secretCommentTag,
secretCommentIV: secretCommentIV ? secretCommentIV : secret.secretCommentIV,
secretCommentTag: secretCommentTag ? secretCommentTag : secret.secretCommentTag,
tags: tags ? tags : secret.tags,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
};
}),
})
};
await EESecretService.addSecretVersions(secretVersions);
@ -1059,13 +1044,16 @@ export const updateSecrets = async (req: Request, res: Response) => {
Object.keys(workspaceSecretObj).forEach(async (key) => {
// trigger event - push secrets
setTimeout(async () => {
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(key),
}),
});
}, 10000);
// This route is not used anymore, so keep it commented out: it does not expose environment
// and would end up creating a lot of requests from the server
// setTimeout(async () => {
// await EventService.handleEvent({
// event: eventPushSecrets({
// workspaceId: new Types.ObjectId(key),
// environment,
// })
// });
// }, 10000);
const updateAction = await EELogService.createAction({
name: ACTION_UPDATE_SECRETS,
@ -1073,7 +1061,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
serviceAccountId: req.serviceAccount?._id,
serviceTokenDataId: req.serviceTokenData?._id,
workspaceId: new Types.ObjectId(key),
secretIds: workspaceSecretObj[key].map((secret: ISecret) => secret._id),
secretIds: workspaceSecretObj[key].map((secret: ISecret) => secret._id)
});
// (EE) create (audit) log
@ -1085,7 +1073,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(key),
actions: [updateAction],
channel,
ipAddress: req.realIP,
ipAddress: req.realIP
}));
// (EE) take a secret snapshot
@ -1101,15 +1089,15 @@ export const updateSecrets = async (req: Request, res: Response) => {
postHogClient.capture({
event: "secrets modified",
distinctId: await TelemetryService.getDistinctId({
authData: req.authData,
authData: req.authData
}),
properties: {
numberOfSecrets: workspaceSecretObj[key].length,
environment: workspaceSecretObj[key][0].environment,
workspaceId: key,
channel: channel,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
});
@ -1117,9 +1105,9 @@ export const updateSecrets = async (req: Request, res: Response) => {
return res.status(200).send({
secrets: await Secret.find({
_id: {
$in: req.secrets.map((secret: ISecret) => secret._id),
},
}),
$in: req.secrets.map((secret: ISecret) => secret._id)
}
})
});
};
@ -1179,12 +1167,12 @@ export const deleteSecrets = async (req: Request, res: Response) => {
await Secret.deleteMany({
_id: {
$in: toDelete,
},
$in: toDelete
}
});
await EESecretService.markDeletedSecretVersions({
secretIds: toDelete,
secretIds: toDelete
});
// group secrets into workspaces so deleted secrets can
@ -1200,18 +1188,20 @@ export const deleteSecrets = async (req: Request, res: Response) => {
Object.keys(workspaceSecretObj).forEach(async (key) => {
// trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(key),
}),
});
// DEPRECATED(akhilmhdh): this would cause the server to send too many requests,
// and this route is not used anymore, so like the snapshot it is kept commented out
// await EventService.handleEvent({
// event: eventPushSecrets({
// workspaceId: new Types.ObjectId(key)
// })
// });
const deleteAction = await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
userId: req.user?._id,
serviceAccountId: req.serviceAccount?._id,
serviceTokenDataId: req.serviceTokenData?._id,
workspaceId: new Types.ObjectId(key),
secretIds: workspaceSecretObj[key].map((secret: ISecret) => secret._id),
secretIds: workspaceSecretObj[key].map((secret: ISecret) => secret._id)
});
// (EE) create (audit) log
@ -1223,7 +1213,7 @@ export const deleteSecrets = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(key),
actions: [deleteAction],
channel,
ipAddress: req.realIP,
ipAddress: req.realIP
}));
// (EE) take a secret snapshot
@ -1237,20 +1227,20 @@ export const deleteSecrets = async (req: Request, res: Response) => {
postHogClient.capture({
event: "secrets deleted",
distinctId: await TelemetryService.getDistinctId({
authData: req.authData,
authData: req.authData
}),
properties: {
numberOfSecrets: workspaceSecretObj[key].length,
environment: workspaceSecretObj[key][0].environment,
workspaceId: key,
channel: channel,
userAgent: req.headers?.["user-agent"],
},
userAgent: req.headers?.["user-agent"]
}
});
}
});
return res.status(200).send({
secrets: req.secrets,
secrets: req.secrets
});
};

@ -2,10 +2,7 @@ import { Request, Response } from "express";
import crypto from "crypto";
import bcrypt from "bcrypt";
import { ServiceAccount, ServiceTokenData, User } from "../../models";
import {
AUTH_MODE_JWT,
AUTH_MODE_SERVICE_ACCOUNT,
} from "../../variables";
import { AUTH_MODE_JWT, AUTH_MODE_SERVICE_ACCOUNT } from "../../variables";
import { getSaltRounds } from "../../config";
import { BadRequestError } from "../../utils/errors";
import Folder from "../../models/folder";
@ -46,14 +43,13 @@ export const getServiceTokenData = async (req: Request, res: Response) => {
if (!(req.authData.authPayload instanceof ServiceTokenData))
throw BadRequestError({
message: "Failed accepted client validation for service token data",
message: "Failed accepted client validation for service token data"
});
const serviceTokenData = await ServiceTokenData.findById(
req.authData.authPayload._id
)
const serviceTokenData = await ServiceTokenData.findById(req.authData.authPayload._id)
.select("+encryptedKey +iv +tag")
.populate("user");
.populate("user")
.lean();
return res.status(200).json(serviceTokenData);
};
@ -68,29 +64,7 @@ export const getServiceTokenData = async (req: Request, res: Response) => {
export const createServiceTokenData = async (req: Request, res: Response) => {
let serviceTokenData;
const {
name,
workspaceId,
environment,
encryptedKey,
iv,
tag,
expiresIn,
secretPath,
permissions,
} = req.body;
const folders = await Folder.findOne({
workspace: workspaceId,
environment,
});
if (folders) {
const folder = getFolderByPath(folders.nodes, secretPath);
if (folder == undefined) {
throw BadRequestError({ message: "Path for service token does not exist" })
}
}
const { name, workspaceId, encryptedKey, iv, tag, expiresIn, permissions, scopes } = req.body;
const secret = crypto.randomBytes(16).toString("hex");
const secretHash = await bcrypt.hash(secret, await getSaltRounds());
@ -103,10 +77,7 @@ export const createServiceTokenData = async (req: Request, res: Response) => {
let user, serviceAccount;
if (
req.authData.authMode === AUTH_MODE_JWT &&
req.authData.authPayload instanceof User
) {
if (req.authData.authMode === AUTH_MODE_JWT && req.authData.authPayload instanceof User) {
user = req.authData.authPayload._id;
}
@ -120,17 +91,16 @@ export const createServiceTokenData = async (req: Request, res: Response) => {
serviceTokenData = await new ServiceTokenData({
name,
workspace: workspaceId,
environment,
user,
serviceAccount,
scopes,
lastUsed: new Date(),
expiresAt,
secretHash,
encryptedKey,
iv,
tag,
secretPath,
permissions,
permissions
}).save();
// return service token data without sensitive data
@ -142,7 +112,7 @@ export const createServiceTokenData = async (req: Request, res: Response) => {
return res.status(200).send({
serviceToken,
serviceTokenData,
serviceTokenData
});
};
@ -155,11 +125,9 @@ export const createServiceTokenData = async (req: Request, res: Response) => {
export const deleteServiceTokenData = async (req: Request, res: Response) => {
const { serviceTokenDataId } = req.params;
const serviceTokenData = await ServiceTokenData.findByIdAndDelete(
serviceTokenDataId
);
const serviceTokenData = await ServiceTokenData.findByIdAndDelete(serviceTokenDataId);
return res.status(200).send({
serviceTokenData,
serviceTokenData
});
};
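Both the service tokens above and the API keys elsewhere in these controllers derive `expiresAt` by adding an `expiresIn` TTL in seconds to the current time. A sketch of that computation (hypothetical helper; the handlers inline it):

```typescript
// expiresIn is a TTL in seconds; Date.setSeconds rolls overflow
// into minutes/hours/days automatically, so this adds exactly
// expiresIn seconds regardless of the starting time.
function computeExpiry(expiresIn: number, now: Date = new Date()): Date {
  const expiresAt = new Date(now);
  expiresAt.setSeconds(expiresAt.getSeconds() + expiresIn);
  return expiresAt;
}
```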

@ -1,32 +1,22 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { Membership, Secret } from "../../models";
import Tag, { ITag } from "../../models/tag";
import { Builder } from "builder-pattern";
import to from "await-to-js";
import Tag from "../../models/tag";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
import { MongoError } from "mongodb";
export const createWorkspaceTag = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const { name, slug } = req.body;
const sanitizedTagToCreate = Builder<ITag>()
.name(name)
.workspace(new Types.ObjectId(workspaceId))
.slug(slug)
.user(new Types.ObjectId(req.user._id))
.build();
const [err, createdTag] = await to(Tag.create(sanitizedTagToCreate));
if (err) {
if ((err as MongoError).code === 11000) {
throw BadRequestError({ message: "Tags must be unique in a workspace" });
}
throw err;
}
const tagToCreate = {
name,
workspace: new Types.ObjectId(workspaceId),
slug,
user: new Types.ObjectId(req.user._id),
};
const createdTag = await new Tag(tagToCreate).save();
res.json(createdTag);
};
@ -58,7 +48,11 @@ export const deleteWorkspaceTag = async (req: Request, res: Response) => {
export const getWorkspaceTags = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const workspaceTags = await Tag.find({ workspace: workspaceId });
const workspaceTags = await Tag.find({
workspace: new Types.ObjectId(workspaceId)
});
return res.json({
workspaceTags
});

@ -1,8 +1,14 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import crypto from "crypto";
import bcrypt from "bcrypt";
import {
MembershipOrg,
User,
APIKeyData,
TokenVersion
} from "../../models";
import { getSaltRounds } from "../../config";
/**
* Return the current user.
@ -117,3 +123,106 @@ export const getMyOrganizations = async (req: Request, res: Response) => {
organizations,
});
}
/**
* Return API keys belonging to current user.
* @param req
* @param res
* @returns
*/
export const getMyAPIKeys = async (req: Request, res: Response) => {
const apiKeyData = await APIKeyData.find({
user: req.user._id,
});
return res.status(200).send(apiKeyData);
}
/**
* Create new API key for current user.
* @param req
* @param res
* @returns
*/
export const createAPIKey = async (req: Request, res: Response) => {
const { name, expiresIn } = req.body;
const secret = crypto.randomBytes(16).toString("hex");
const secretHash = await bcrypt.hash(secret, await getSaltRounds());
const expiresAt = new Date();
expiresAt.setSeconds(expiresAt.getSeconds() + expiresIn);
let apiKeyData = await new APIKeyData({
name,
lastUsed: new Date(),
expiresAt,
user: req.user._id,
secretHash,
}).save();
// return api key data without sensitive data
apiKeyData = (await APIKeyData.findById(apiKeyData._id)) as any;
if (!apiKeyData) throw new Error("Failed to find API key data");
const apiKey = `ak.${apiKeyData._id.toString()}.${secret}`;
return res.status(200).send({
apiKey,
apiKeyData,
});
}
/**
* Delete API key with id [apiKeyDataId] belonging to current user
* @param req
* @param res
*/
export const deleteAPIKey = async (req: Request, res: Response) => {
const { apiKeyDataId } = req.params;
const apiKeyData = await APIKeyData.findOneAndDelete({
_id: new Types.ObjectId(apiKeyDataId),
user: req.user._id
});
return res.status(200).send({
apiKeyData
});
}
/**
* Return active sessions (TokenVersion) belonging to user
* @param req
* @param res
* @returns
*/
export const getMySessions = async (req: Request, res: Response) => {
const tokenVersions = await TokenVersion.find({
user: req.user._id
});
return res.status(200).send(tokenVersions);
}
/**
* Revoke all active sessions belong to user
* @param req
* @param res
* @returns
*/
export const deleteMySessions = async (req: Request, res: Response) => {
await TokenVersion.updateMany({
user: req.user._id,
}, {
$inc: {
refreshVersion: 1,
accessVersion: 1,
},
});
return res.status(200).send({
message: "Successfully revoked all sessions"
});
}
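`createAPIKey` above issues keys of the form `ak.<apiKeyDataId>.<secret>`, storing only a bcrypt hash of the secret. A sketch of parsing that format on the verification side; the helper is hypothetical, and the real middleware (not shown here) would also look up the record by id and compare the secret against `secretHash`:

```typescript
// Splits an "ak.<id>.<secret>" API key into its parts;
// returns null when the shape does not match.
function parseAPIKey(apiKey: string): { id: string; secret: string } | null {
  const parts = apiKey.split(".");
  if (parts.length !== 3 || parts[0] !== "ak") return null;
  const [, id, secret] = parts;
  if (!id || !secret) return null;
  return { id, secret };
}
```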

@ -1,34 +1,29 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { Key, Membership, ServiceTokenData, Workspace } from "../../models";
import {
Key,
Membership,
ServiceTokenData,
Workspace,
} from "../../models";
import {
pullSecrets as pull,
v2PushSecrets as push,
reformatPullSecrets,
pullSecrets as pull,
v2PushSecrets as push,
reformatPullSecrets
} from "../../helpers/secret";
import { pushKeys } from "../../helpers/key";
import { EventService, TelemetryService } from "../../services";
import { eventPushSecrets } from "../../events";
interface V2PushSecret {
type: string; // personal or shared
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretKeyHash: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretValueHash: string;
secretCommentCiphertext?: string;
secretCommentIV?: string;
secretCommentTag?: string;
secretCommentHash?: string;
type: string; // personal or shared
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretKeyHash: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretValueHash: string;
secretCommentCiphertext?: string;
secretCommentIV?: string;
secretCommentTag?: string;
secretCommentHash?: string;
}
/**
@ -39,7 +34,7 @@ interface V2PushSecret {
* @returns
*/
export const pushWorkspaceSecrets = async (req: Request, res: Response) => {
// upload (encrypted) secrets to workspace with id [workspaceId]
// upload (encrypted) secrets to workspace with id [workspaceId]
const postHogClient = await TelemetryService.getPostHogClient();
let { secrets }: { secrets: V2PushSecret[] } = req.body;
const { keys, environment, channel } = req.body;
@ -62,13 +57,13 @@ export const pushWorkspaceSecrets = async (req: Request, res: Response) => {
environment,
secrets,
channel: channel ? channel : "cli",
ipAddress: req.realIP,
ipAddress: req.realIP
});
await pushKeys({
userId: req.user._id,
workspaceId,
keys,
keys
});
if (postHogClient) {
@ -79,8 +74,8 @@ export const pushWorkspaceSecrets = async (req: Request, res: Response) => {
numberOfSecrets: secrets.length,
environment,
workspaceId,
channel: channel ? channel : "cli",
},
channel: channel ? channel : "cli"
}
});
}
@ -89,12 +84,13 @@ export const pushWorkspaceSecrets = async (req: Request, res: Response) => {
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath: "/"
})
});
return res.status(200).send({
message: "Successfully uploaded workspace secrets",
});
return res.status(200).send({
message: "Successfully uploaded workspace secrets"
});
};
/**
@ -105,7 +101,7 @@ export const pushWorkspaceSecrets = async (req: Request, res: Response) => {
* @returns
*/
export const pullSecrets = async (req: Request, res: Response) => {
let secrets;
let secrets;
const postHogClient = await TelemetryService.getPostHogClient();
const environment: string = req.query.environment as string;
const channel: string = req.query.channel as string;
@ -128,7 +124,7 @@ export const pullSecrets = async (req: Request, res: Response) => {
workspaceId,
environment,
channel: channel ? channel : "cli",
ipAddress: req.realIP,
ipAddress: req.realIP
});
if (channel !== "cli") {
@ -144,18 +140,18 @@ export const pullSecrets = async (req: Request, res: Response) => {
numberOfSecrets: secrets.length,
environment,
workspaceId,
channel: channel ? channel : "cli",
},
channel: channel ? channel : "cli"
}
});
}
return res.status(200).send({
secrets,
});
return res.status(200).send({
secrets
});
};
export const getWorkspaceKey = async (req: Request, res: Response) => {
/*
/*
#swagger.summary = 'Return encrypted project key'
#swagger.description = 'Return encrypted project key'
@ -183,43 +179,38 @@ export const getWorkspaceKey = async (req: Request, res: Response) => {
}
}
*/
let key;
let key;
const { workspaceId } = req.params;
key = await Key.findOne({
workspace: workspaceId,
receiver: req.user._id,
receiver: req.user._id
}).populate("sender", "+publicKey");
if (!key) throw new Error("Failed to find workspace key");
return res.status(200).json(key);
}
export const getWorkspaceServiceTokenData = async (
req: Request,
res: Response
) => {
return res.status(200).json(key);
};
export const getWorkspaceServiceTokenData = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const serviceTokenData = await ServiceTokenData
.find({
workspace: workspaceId,
})
.select("+encryptedKey +iv +tag");
const serviceTokenData = await ServiceTokenData.find({
workspace: workspaceId
}).select("+encryptedKey +iv +tag");
return res.status(200).send({
serviceTokenData,
});
}
return res.status(200).send({
serviceTokenData
});
};
/**
* Return memberships for workspace with id [workspaceId]
* @param req
* @param res
* @returns
* @param req
* @param res
* @returns
*/
export const getWorkspaceMemberships = async (req: Request, res: Response) => {
/*
/*
#swagger.summary = 'Return project memberships'
#swagger.description = 'Return project memberships'
@ -255,22 +246,22 @@ export const getWorkspaceMemberships = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const memberships = await Membership.find({
workspace: workspaceId,
workspace: workspaceId
}).populate("user", "+publicKey");
return res.status(200).send({
memberships,
});
}
return res.status(200).send({
memberships
});
};
/**
* Update role of membership with id [membershipId] to role [role]
* @param req
* @param res
* @returns
* @param req
* @param res
* @returns
*/
export const updateWorkspaceMembership = async (req: Request, res: Response) => {
/*
/*
#swagger.summary = 'Update project membership'
#swagger.description = 'Update project membership'
@ -323,33 +314,32 @@ export const updateWorkspaceMembership = async (req: Request, res: Response) =>
}
}
*/
const {
membershipId,
} = req.params;
const { membershipId } = req.params;
const { role } = req.body;
const membership = await Membership.findByIdAndUpdate(
membershipId,
{
role,
}, {
new: true,
role
},
{
new: true
}
);
return res.status(200).send({
membership,
});
}
return res.status(200).send({
membership
});
};
/**
* Delete workspace membership with id [membershipId]
* @param req
* @param res
* @returns
* @param req
* @param res
* @returns
*/
export const deleteWorkspaceMembership = async (req: Request, res: Response) => {
/*
/*
#swagger.summary = 'Delete project membership'
#swagger.description = 'Delete project membership'
@ -385,23 +375,21 @@ export const deleteWorkspaceMembership = async (req: Request, res: Response) =>
}
}
*/
const {
membershipId,
} = req.params;
const { membershipId } = req.params;
const membership = await Membership.findByIdAndDelete(membershipId);
if (!membership) throw new Error("Failed to delete workspace membership");
await Key.deleteMany({
receiver: membership.user,
workspace: membership.workspace,
workspace: membership.workspace
});
return res.status(200).send({
membership,
});
}
return res.status(200).send({
membership
});
};
/**
 * Change autoCapitalization rule of workspace
@ -415,18 +403,18 @@ export const toggleAutoCapitalization = async (req: Request, res: Response) => {
const workspace = await Workspace.findOneAndUpdate(
{
_id: workspaceId,
_id: workspaceId
},
{
autoCapitalization,
autoCapitalization
},
{
new: true,
new: true
}
);
return res.status(200).send({
message: "Successfully changed autoCapitalization setting",
workspace,
});
return res.status(200).send({
message: "Successfully changed autoCapitalization setting",
workspace
});
};

@ -21,22 +21,22 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(workspaceId),
environment,
secretPath,
authData: req.authData,
authData: req.authData
});
const key = await BotService.getWorkspaceKeyWithBot({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
return res.status(200).send({
secrets: secrets.map((secret) => {
const rep = repackageSecretToRaw({
secret,
key,
key
});
return rep;
}),
})
});
};
@ -58,54 +58,47 @@ export const getSecretByNameRaw = async (req: Request, res: Response) => {
environment,
type,
secretPath,
authData: req.authData,
authData: req.authData
});
const key = await BotService.getWorkspaceKeyWithBot({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
return res.status(200).send({
secret: repackageSecretToRaw({
secret,
key,
}),
key
})
});
};
/**
* Create secret with name [secretName] in plaintext
* @param req
* @param res
* @param res
*/
export const createSecretRaw = async (req: Request, res: Response) => {
const { secretName } = req.params;
const {
workspaceId,
environment,
type,
secretValue,
secretComment,
secretPath = "/",
} = req.body;
const { workspaceId, environment, type, secretValue, secretComment, secretPath = "/" } = req.body;
const key = await BotService.getWorkspaceKeyWithBot({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
const secretKeyEncrypted = encryptSymmetric128BitHexKeyUTF8({
plaintext: secretName,
key,
key
});
const secretValueEncrypted = encryptSymmetric128BitHexKeyUTF8({
plaintext: secretValue,
key,
key
});
const secretCommentEncrypted = encryptSymmetric128BitHexKeyUTF8({
plaintext: secretComment,
key,
key
});
const secret = await SecretService.createSecret({
@ -123,14 +116,15 @@ export const createSecretRaw = async (req: Request, res: Response) => {
secretPath,
secretCommentCiphertext: secretCommentEncrypted.ciphertext,
secretCommentIV: secretCommentEncrypted.iv,
secretCommentTag: secretCommentEncrypted.tag,
secretCommentTag: secretCommentEncrypted.tag
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath
})
});
const secretWithoutBlindIndex = secret.toObject();
@ -139,10 +133,10 @@ export const createSecretRaw = async (req: Request, res: Response) => {
return res.status(200).send({
secret: repackageSecretToRaw({
secret: secretWithoutBlindIndex,
key,
}),
key
})
});
}
};
/**
* Update secret with name [secretName]
@ -151,21 +145,15 @@ export const createSecretRaw = async (req: Request, res: Response) => {
*/
export const updateSecretByNameRaw = async (req: Request, res: Response) => {
const { secretName } = req.params;
const {
workspaceId,
environment,
type,
secretValue,
secretPath = "/",
} = req.body;
const { workspaceId, environment, type, secretValue, secretPath = "/" } = req.body;
const key = await BotService.getWorkspaceKeyWithBot({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
const secretValueEncrypted = encryptSymmetric128BitHexKeyUTF8({
plaintext: secretValue,
key,
key
});
const secret = await SecretService.updateSecret({
@ -177,21 +165,22 @@ export const updateSecretByNameRaw = async (req: Request, res: Response) => {
secretValueCiphertext: secretValueEncrypted.ciphertext,
secretValueIV: secretValueEncrypted.iv,
secretValueTag: secretValueEncrypted.tag,
secretPath,
secretPath
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath
})
});
return res.status(200).send({
secret: repackageSecretToRaw({
secret,
key,
}),
key
})
});
};
@ -202,12 +191,7 @@ export const updateSecretByNameRaw = async (req: Request, res: Response) => {
*/
export const deleteSecretByNameRaw = async (req: Request, res: Response) => {
const { secretName } = req.params;
const {
workspaceId,
environment,
type,
secretPath = "/",
} = req.body;
const { workspaceId, environment, type, secretPath = "/" } = req.body;
const { secret } = await SecretService.deleteSecret({
secretName,
@ -215,25 +199,26 @@ export const deleteSecretByNameRaw = async (req: Request, res: Response) => {
environment,
type,
authData: req.authData,
secretPath,
secretPath
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath
})
});
const key = await BotService.getWorkspaceKeyWithBot({
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
return res.status(200).send({
secret: repackageSecretToRaw({
secret,
key,
}),
key
})
});
};
@ -252,11 +237,11 @@ export const getSecrets = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(workspaceId),
environment,
secretPath,
authData: req.authData,
authData: req.authData
});
return res.status(200).send({
secrets,
secrets
});
};
@@ -278,11 +263,11 @@ export const getSecretByName = async (req: Request, res: Response) => {
environment,
type,
secretPath,
authData: req.authData,
authData: req.authData
});
return res.status(200).send({
secret,
secret
});
};
@@ -306,7 +291,7 @@ export const createSecret = async (req: Request, res: Response) => {
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
secretPath = "/",
secretPath = "/"
} = req.body;
const secret = await SecretService.createSecret({
@@ -324,25 +309,25 @@ export const createSecret = async (req: Request, res: Response) => {
secretPath,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
secretCommentTag
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath
})
});
const secretWithoutBlindIndex = secret.toObject();
delete secretWithoutBlindIndex.secretBlindIndex;
return res.status(200).send({
secret: secretWithoutBlindIndex,
secret: secretWithoutBlindIndex
});
};
/**
* Update secret with name [secretName]
* @param req
@@ -357,7 +342,7 @@ export const updateSecretByName = async (req: Request, res: Response) => {
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretPath = "/",
secretPath = "/"
} = req.body;
const secret = await SecretService.updateSecret({
@@ -369,18 +354,19 @@ export const updateSecretByName = async (req: Request, res: Response) => {
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretPath,
secretPath
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath
})
});
return res.status(200).send({
secret,
secret
});
};
@@ -391,12 +377,7 @@ export const updateSecretByName = async (req: Request, res: Response) => {
*/
export const deleteSecretByName = async (req: Request, res: Response) => {
const { secretName } = req.params;
const {
workspaceId,
environment,
type,
secretPath = "/",
} = req.body;
const { workspaceId, environment, type, secretPath = "/" } = req.body;
const { secret } = await SecretService.deleteSecret({
secretName,
@@ -404,17 +385,18 @@ export const deleteSecretByName = async (req: Request, res: Response) => {
environment,
type,
authData: req.authData,
secretPath,
secretPath
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
}),
secretPath
})
});
return res.status(200).send({
secret,
secret
});
};

View File

@@ -4,7 +4,6 @@ import { IMembershipPermission } from "../../../models/membership";
import { BadRequestError, UnauthorizedRequestError } from "../../../utils/errors";
import { ADMIN, MEMBER } from "../../../variables/organization";
import { PERMISSION_READ_SECRETS, PERMISSION_WRITE_SECRETS } from "../../../variables";
import { Builder } from "builder-pattern"
import _ from "lodash";
export const denyMembershipPermissions = async (req: Request, res: Response) => {
@@ -15,10 +14,10 @@ export const denyMembershipPermissions = async (req: Request, res: Response) =>
throw BadRequestError({ message: "One or more required fields are missing from the request or have incorrect type" })
}
return Builder<IMembershipPermission>()
.environmentSlug(permission.environmentSlug)
.ability(permission.ability)
.build();
return {
environmentSlug: permission.environmentSlug,
ability: permission.ability
}
})
const sanitizedMembershipPermissionsUnique = _.uniqWith(sanitizedMembershipPermissions, _.isEqual)
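The hunk above swaps the builder-pattern construction for plain object literals, then deduplicates the sanitized permissions with lodash's `uniqWith(..., isEqual)`. A minimal stdlib-only sketch of that dedup step, assuming the flat two-field permission shape shown above (so a serialized key is a sufficient equality check):

```typescript
interface MembershipPermission {
  environmentSlug: string;
  ability: string;
}

// Deduplicate permissions by value. For flat objects like these, a
// fixed-order serialized key behaves like lodash's deep isEqual.
function uniquePermissions(perms: MembershipPermission[]): MembershipPermission[] {
  const seen = new Set<string>();
  const out: MembershipPermission[] = [];
  for (const p of perms) {
    const key = JSON.stringify([p.environmentSlug, p.ability]);
    if (!seen.has(key)) {
      seen.add(key);
      out.push(p);
    }
  }
  return out;
}
```

For nested permission objects a real deep-equality check (as lodash provides) would be needed; the serialized-key shortcut only holds for flat shapes.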

View File

@@ -3,8 +3,18 @@ import { getLicenseServerUrl } from "../../../config";
import { licenseServerKeyRequest } from "../../../config/request";
import { EELicenseService } from "../../services";
export const getOrganizationPlansTable = async (req: Request, res: Response) => {
const billingCycle = req.query.billingCycle as string;
const { data } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/cloud-products?billing-cycle=${billingCycle}`
);
return res.status(200).send(data);
}
/**
* Return the organization's current plan and allowed feature set
* Return the organization current plan's feature set
*/
export const getOrganizationPlan = async (req: Request, res: Response) => {
const { organizationId } = req.params;
@@ -18,26 +28,82 @@ export const getOrganizationPlan = async (req: Request, res: Response) => {
}
/**
* Update the organization plan to product with id [productId]
* Return checkout url for pro trial
* @param req
* @param res
* @returns
*/
export const updateOrganizationPlan = async (req: Request, res: Response) => {
const {
productId,
} = req.body;
export const startOrganizationTrial = async (req: Request, res: Response) => {
const { organizationId } = req.params;
const { success_url } = req.body;
const { data } = await licenseServerKeyRequest.patch(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/cloud-plan`,
const { data: { url } } = await licenseServerKeyRequest.post(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/session/trial`,
{
productId,
success_url
}
);
EELicenseService.delPlan(organizationId);
return res.status(200).send({
url
});
}
/**
* Return the organization's current plan's billing info
* @param req
* @param res
* @returns
*/
export const getOrganizationPlanBillingInfo = async (req: Request, res: Response) => {
const { data } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/cloud-plan/billing`
);
return res.status(200).send(data);
}
/**
* Return the organization's current plan's feature table
* @param req
* @param res
* @returns
*/
export const getOrganizationPlanTable = async (req: Request, res: Response) => {
const { data } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/cloud-plan/table`
);
return res.status(200).send(data);
}
export const getOrganizationBillingDetails = async (req: Request, res: Response) => {
const { data } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details`
);
return res.status(200).send(data);
}
export const updateOrganizationBillingDetails = async (req: Request, res: Response) => {
const {
name,
email
} = req.body;
const { data } = await licenseServerKeyRequest.patch(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details`,
{
...(name ? { name } : {}),
...(email ? { email } : {})
}
);
return res.status(200).send(data);
}
/**
* Return the organization's payment methods on file
*/
@@ -46,9 +112,7 @@ export const getOrganizationPmtMethods = async (req: Request, res: Response) =>
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/payment-methods`
);
return res.status(200).send({
pmtMethods,
});
return res.status(200).send(pmtMethods);
}
/**
@@ -81,4 +145,53 @@ export const deleteOrganizationPmtMethod = async (req: Request, res: Response) =
);
return res.status(200).send(data);
}
/**
* Return the organization's tax ids on file
*/
export const getOrganizationTaxIds = async (req: Request, res: Response) => {
const { data: { tax_ids } } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/tax-ids`
);
return res.status(200).send(tax_ids);
}
/**
* Add tax id to organization
*/
export const addOrganizationTaxId = async (req: Request, res: Response) => {
const {
type,
value
} = req.body;
const { data } = await licenseServerKeyRequest.post(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/tax-ids`,
{
type,
value
}
);
return res.status(200).send(data);
}
export const deleteOrganizationTaxId = async (req: Request, res: Response) => {
const { taxId } = req.params;
const { data } = await licenseServerKeyRequest.delete(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/tax-ids/${taxId}`,
);
return res.status(200).send(data);
}
export const getOrganizationInvoices = async (req: Request, res: Response) => {
const { data: { invoices } } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/invoices`
);
return res.status(200).send(invoices);
}

View File

@@ -54,23 +54,20 @@ export const getSecretVersions = async (req: Request, res: Response) => {
}
}
*/
const { secretId, workspaceId, environment, folderId } = req.params;
const { secretId } = req.params;
const offset: number = parseInt(req.query.offset as string);
const limit: number = parseInt(req.query.limit as string);
const secretVersions = await SecretVersion.find({
secret: secretId,
workspace: workspaceId,
environment,
folder: folderId,
secret: secretId
})
.sort({ createdAt: -1 })
.skip(offset)
.limit(limit);
return res.status(200).send({
secretVersions,
secretVersions
});
};
@@ -135,7 +132,7 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
// validate secret version
const oldSecretVersion = await SecretVersion.findOne({
secret: secretId,
version,
version
}).select("+secretBlindIndex");
if (!oldSecretVersion) throw new Error("Failed to find secret version");
@@ -154,7 +151,7 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretValueTag,
algorithm,
folder,
keyEncoding,
keyEncoding
} = oldSecretVersion;
// update secret
@@ -162,7 +159,7 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretId,
{
$inc: {
version: 1,
version: 1
},
workspace,
type,
@@ -177,10 +174,10 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretValueTag,
folderId: folder,
algorithm,
keyEncoding,
keyEncoding
},
{
new: true,
new: true
}
);
@@ -204,17 +201,17 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretValueTag,
folder,
algorithm,
keyEncoding,
keyEncoding
}).save();
// take secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId: secret.workspace,
environment,
folderId: folder,
folderId: folder
});
return res.status(200).send({
secret,
secret
});
};

View File

@@ -11,10 +11,25 @@ import {
ACCEPTED, ADMIN, MEMBER, OWNER,
} from "../../../variables";
router.get(
"/:organizationId/plans/table",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
query("billingCycle").exists().isString().isIn(["monthly", "yearly"]),
validateRequest,
organizationsController.getOrganizationPlansTable
);
router.get(
"/:organizationId/plan",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
@@ -26,25 +41,85 @@ router.get(
organizationsController.getOrganizationPlan
);
router.patch(
"/:organizationId/plan",
router.post(
"/:organizationId/session/trial",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
body("productId").exists().isString(),
body("success_url").exists().trim(),
validateRequest,
organizationsController.updateOrganizationPlan
organizationsController.startOrganizationTrial
);
router.get(
"/:organizationId/plan/billing",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
query("workspaceId").optional().isString(),
validateRequest,
organizationsController.getOrganizationPlanBillingInfo
);
router.get(
"/:organizationId/plan/table",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
query("workspaceId").optional().isString(),
validateRequest,
organizationsController.getOrganizationPlanTable
);
router.get(
"/:organizationId/billing-details",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
validateRequest,
organizationsController.getOrganizationBillingDetails
);
router.patch(
"/:organizationId/billing-details",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
body("email").optional().isString().trim(),
body("name").optional().isString().trim(),
validateRequest,
organizationsController.updateOrganizationBillingDetails
);
router.get(
"/:organizationId/billing-details/payment-methods",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
@@ -58,7 +133,7 @@ router.get(
router.post(
"/:organizationId/billing-details/payment-methods",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
@@ -74,7 +149,22 @@ router.post(
router.delete(
"/:organizationId/billing-details/payment-methods/:pmtMethodId",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
param("pmtMethodId").exists().trim(),
validateRequest,
organizationsController.deleteOrganizationPmtMethod
);
router.get(
"/:organizationId/billing-details/tax-ids",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
@@ -82,7 +172,52 @@ router.delete(
}),
param("organizationId").exists().trim(),
validateRequest,
organizationsController.deleteOrganizationPmtMethod
organizationsController.getOrganizationTaxIds
);
router.post(
"/:organizationId/billing-details/tax-ids",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
body("type").exists().isString(),
body("value").exists().isString(),
validateRequest,
organizationsController.addOrganizationTaxId
);
router.delete(
"/:organizationId/billing-details/tax-ids/:taxId",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
param("taxId").exists().trim(),
validateRequest,
organizationsController.deleteOrganizationTaxId
);
router.get(
"/:organizationId/invoices",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
param("organizationId").exists().trim(),
validateRequest,
organizationsController.getOrganizationInvoices
);
export default router;

View File

@@ -30,6 +30,9 @@ interface FeatureSet {
customRateLimits: boolean;
customAlerts: boolean;
auditLogs: boolean;
status: 'incomplete' | 'incomplete_expired' | 'trialing' | 'active' | 'past_due' | 'canceled' | 'unpaid' | null;
trial_end: number | null;
has_used_trial: boolean;
}
/**
@@ -55,11 +58,14 @@ class EELicenseService {
environmentLimit: null,
environmentsUsed: 0,
secretVersioning: true,
pitRecovery: true,
pitRecovery: false,
rbac: true,
customRateLimits: true,
customAlerts: true,
auditLogs: false,
status: null,
trial_end: null,
has_used_trial: true
}
public localFeatureSet: NodeCache;
@@ -67,7 +73,7 @@ class EELicenseService {
constructor() {
this._isLicenseValid = true;
this.localFeatureSet = new NodeCache({
stdTTL: 300,
stdTTL: 60,
});
}
@@ -108,6 +114,12 @@ class EELicenseService {
await this.getPlan(organizationId, workspaceId);
}
}
public async delPlan(organizationId: string) {
if (this.instanceType === "cloud") {
this.localFeatureSet.del(`${organizationId}-`);
}
}
public async initGlobalFeatureSet() {
const licenseServerKey = await getLicenseServerKey();

View File

@@ -1,5 +1,4 @@
import { eventPushSecrets } from "./secret"
import { eventPushSecrets } from "./secret";
import { eventStartIntegration } from "./integration";
export {
eventPushSecrets,
}
export { eventPushSecrets, eventStartIntegration };

View File

@@ -0,0 +1,23 @@
import { Types } from "mongoose";
import { EVENT_START_INTEGRATION } from "../variables";
/*
* Return event for starting integrations
* @param {Object} obj
* @param {String} obj.workspaceId - id of workspace to push secrets to
* @returns
*/
export const eventStartIntegration = ({
workspaceId,
environment
}: {
workspaceId: Types.ObjectId;
environment: string;
}) => {
return {
name: EVENT_START_INTEGRATION,
workspaceId,
environment,
payload: {}
};
};

View File

@@ -1,64 +1,54 @@
import { Types } from "mongoose";
import {
EVENT_PULL_SECRETS,
EVENT_PUSH_SECRETS,
} from "../variables";
import { EVENT_PULL_SECRETS, EVENT_PUSH_SECRETS } from "../variables";
interface PushSecret {
ciphertextKey: string;
ivKey: string;
tagKey: string;
hashKey: string;
ciphertextValue: string;
ivValue: string;
tagValue: string;
hashValue: string;
type: "shared" | "personal";
ciphertextKey: string;
ivKey: string;
tagKey: string;
hashKey: string;
ciphertextValue: string;
ivValue: string;
tagValue: string;
hashValue: string;
type: "shared" | "personal";
}
/**
* Return event for pushing secrets
* @param {Object} obj
* @param {String} obj.workspaceId - id of workspace to push secrets to
* @returns
* @returns
*/
const eventPushSecrets = ({
workspaceId,
environment,
secretPath
}: {
workspaceId: Types.ObjectId;
environment: string;
secretPath: string;
}) => {
return {
name: EVENT_PUSH_SECRETS,
workspaceId,
environment,
}: {
workspaceId: Types.ObjectId;
environment?: string;
}) => {
return ({
name: EVENT_PUSH_SECRETS,
workspaceId,
environment,
payload: {
},
});
}
secretPath,
payload: {}
};
};
/**
* Return event for pulling secrets
* @param {Object} obj
* @param {String} obj.workspaceId - id of workspace to pull secrets from
* @returns
* @returns
*/
const eventPullSecrets = ({
const eventPullSecrets = ({ workspaceId }: { workspaceId: string }) => {
return {
name: EVENT_PULL_SECRETS,
workspaceId,
}: {
workspaceId: string;
}) => {
return ({
name: EVENT_PULL_SECRETS,
workspaceId,
payload: {
payload: {}
};
};
},
});
}
export {
eventPushSecrets,
}
export { eventPushSecrets };

View File

@@ -1,12 +1,14 @@
import { Types } from "mongoose";
import { Bot } from "../models";
import { EVENT_PUSH_SECRETS } from "../variables";
import { EVENT_PUSH_SECRETS, EVENT_START_INTEGRATION } from "../variables";
import { IntegrationService } from "../services";
import { triggerWebhook } from "../services/WebhookService";
interface Event {
name: string;
workspaceId: Types.ObjectId;
environment?: string;
secretPath?: string;
payload: any;
}
@@ -19,22 +21,31 @@ interface Event {
* @param {Object} obj.event.payload - payload of event (depends on event)
*/
export const handleEventHelper = async ({ event }: { event: Event }) => {
const { workspaceId, environment } = event;
const { workspaceId, environment, secretPath } = event;
// TODO: modularize bot check into separate function
const bot = await Bot.findOne({
workspace: workspaceId,
isActive: true,
isActive: true
});
if (!bot) return;
switch (event.name) {
case EVENT_PUSH_SECRETS:
IntegrationService.syncIntegrations({
workspaceId,
environment,
});
if (bot) {
await IntegrationService.syncIntegrations({
workspaceId,
environment
});
}
triggerWebhook(workspaceId.toString(), environment || "", secretPath || "");
break;
case EVENT_START_INTEGRATION:
if (bot) {
IntegrationService.syncIntegrations({
workspaceId,
environment
});
}
break;
}
};
};
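In the handler above, `EVENT_PUSH_SECRETS` now triggers the webhook even when no active bot exists, while integration syncs (for both event types) still require a bot. A self-contained sketch of that dispatch logic, with the service calls replaced by hypothetical stubs and the event-name literals simplified:

```typescript
type SecretEvent = {
  name: "pushSecrets" | "startIntegration"; // simplified stand-ins for the EVENT_* constants
  workspaceId: string;
  environment?: string;
  secretPath?: string;
};

// Recording stubs standing in for IntegrationService and WebhookService.
const calls: string[] = [];
const syncIntegrations = (w: string, e?: string) => calls.push(`sync:${w}:${e ?? ""}`);
const triggerWebhook = (w: string, e: string, p: string) => calls.push(`hook:${w}:${e}:${p}`);

function handleEvent(event: SecretEvent, botActive: boolean): void {
  switch (event.name) {
    case "pushSecrets":
      // Integration sync needs an active bot; the webhook fires regardless.
      if (botActive) syncIntegrations(event.workspaceId, event.environment);
      triggerWebhook(event.workspaceId, event.environment ?? "", event.secretPath ?? "");
      break;
    case "startIntegration":
      if (botActive) syncIntegrations(event.workspaceId, event.environment);
      break;
  }
}
```

Decoupling webhooks from the bot check is the behavioral change in this hunk: previously the early `if (!bot) return;` suppressed everything.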

View File

@@ -45,6 +45,7 @@ export const createOrganization = async ({
name,
customerId
}).save();
} else {
organization = await new Organization({
name,

View File

@@ -4,13 +4,14 @@ import {
DeleteSecretParams,
GetSecretParams,
GetSecretsParams,
UpdateSecretParams,
UpdateSecretParams
} from "../interfaces/services/SecretService";
import {
ISecret,
IServiceTokenData,
Secret,
SecretBlindIndexData,
ServiceTokenData,
ServiceTokenData
} from "../models";
import { SecretVersion } from "../ee/models";
import {
@@ -18,7 +19,7 @@ import {
InternalServerError,
SecretBlindIndexDataNotFoundError,
SecretNotFoundError,
UnauthorizedRequestError,
UnauthorizedRequestError
} from "../utils/errors";
import {
ACTION_ADD_SECRETS,
@@ -29,51 +30,57 @@ import {
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8,
SECRET_PERSONAL,
SECRET_SHARED,
SECRET_SHARED
} from "../variables";
import crypto from "crypto";
import * as argon2 from "argon2";
import {
decryptSymmetric128BitHexKeyUTF8,
encryptSymmetric128BitHexKeyUTF8,
encryptSymmetric128BitHexKeyUTF8
} from "../utils/crypto";
import { TelemetryService } from "../services";
import { client, getEncryptionKey, getRootEncryptionKey } from "../config";
import { EELogService, EESecretService } from "../ee/services";
import {
getAuthDataPayloadIdObj,
getAuthDataPayloadUserObj,
} from "../utils/auth";
import { getAuthDataPayloadIdObj, getAuthDataPayloadUserObj } from "../utils/auth";
import { getFolderIdFromServiceToken } from "../services/FolderService";
import picomatch from "picomatch";
export const isValidScope = (
authPayload: IServiceTokenData,
environment: string,
secretPath: string
) => {
const { scopes: tkScopes } = authPayload;
const validScope = tkScopes.find(
(scope) =>
picomatch.isMatch(secretPath, scope.secretPath, { strictSlashes: false }) &&
scope.environment === environment
);
return Boolean(validScope);
};
/**
* Returns an object containing secret [secret] but with its value, key, comment decrypted.
*
*
* Precondition: the workspace for secret [secret] must have E2EE disabled
* @param {ISecret} secret - secret to repackage to raw
* @param {String} key - symmetric key to use to decrypt secret
* @returns
* @returns
*/
export const repackageSecretToRaw = ({
secret,
key,
}: {
secret: ISecret;
key: string;
}) => {
export const repackageSecretToRaw = ({ secret, key }: { secret: ISecret; key: string }) => {
const secretKey = decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretKeyCiphertext,
iv: secret.secretKeyIV,
tag: secret.secretKeyTag,
key,
key
});
const secretValue = decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretValueCiphertext,
iv: secret.secretValueIV,
tag: secret.secretValueTag,
key,
key
});
let secretComment = "";
@@ -83,11 +90,11 @@ export const repackageSecretToRaw = ({
ciphertext: secret.secretCommentCiphertext,
iv: secret.secretCommentIV,
tag: secret.secretCommentTag,
key,
key
});
}
return ({
return {
_id: secret._id,
version: secret.version,
workspace: secret.workspace,
@@ -96,9 +103,9 @@ export const repackageSecretToRaw = ({
user: secret.user,
secretKey,
secretValue,
secretComment,
});
}
secretComment
};
};
/**
* Create secret blind index data containing encrypted blind index [salt]
@@ -107,7 +114,7 @@ export const repackageSecretToRaw = ({
* @param {Types.ObjectId} obj.workspaceId
*/
export const createSecretBlindIndexDataHelper = async ({
workspaceId,
workspaceId
}: {
workspaceId: Types.ObjectId;
}) => {
@@ -121,7 +128,7 @@ export const createSecretBlindIndexDataHelper = async ({
const {
ciphertext: encryptedSaltCiphertext,
iv: saltIV,
tag: saltTag,
tag: saltTag
} = client.encryptSymmetric(salt, rootEncryptionKey);
return await new SecretBlindIndexData({
@@ -130,16 +137,16 @@ export const createSecretBlindIndexDataHelper = async ({
saltIV,
saltTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_BASE64,
keyEncoding: ENCODING_SCHEME_BASE64
}).save();
} else {
const {
ciphertext: encryptedSaltCiphertext,
iv: saltIV,
tag: saltTag,
tag: saltTag
} = encryptSymmetric128BitHexKeyUTF8({
plaintext: salt,
key: encryptionKey,
key: encryptionKey
});
return await new SecretBlindIndexData({
@@ -148,7 +155,7 @@ export const createSecretBlindIndexDataHelper = async ({
saltIV,
saltTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
}).save();
}
};
@@ -160,7 +167,7 @@ export const createSecretBlindIndexDataHelper = async ({
* @returns
*/
export const getSecretBlindIndexSaltHelper = async ({
workspaceId,
workspaceId
}: {
workspaceId: Types.ObjectId;
}) => {
@@ -168,36 +175,30 @@ export const getSecretBlindIndexSaltHelper = async ({
const rootEncryptionKey = await getRootEncryptionKey();
const secretBlindIndexData = await SecretBlindIndexData.findOne({
workspace: workspaceId,
workspace: workspaceId
}).select("+algorithm +keyEncoding");
if (!secretBlindIndexData) throw SecretBlindIndexDataNotFoundError();
if (
rootEncryptionKey &&
secretBlindIndexData.keyEncoding === ENCODING_SCHEME_BASE64
) {
if (rootEncryptionKey && secretBlindIndexData.keyEncoding === ENCODING_SCHEME_BASE64) {
return client.decryptSymmetric(
secretBlindIndexData.encryptedSaltCiphertext,
rootEncryptionKey,
secretBlindIndexData.saltIV,
secretBlindIndexData.saltTag
);
} else if (
encryptionKey &&
secretBlindIndexData.keyEncoding === ENCODING_SCHEME_UTF8
) {
} else if (encryptionKey && secretBlindIndexData.keyEncoding === ENCODING_SCHEME_UTF8) {
// decrypt workspace salt
return decryptSymmetric128BitHexKeyUTF8({
ciphertext: secretBlindIndexData.encryptedSaltCiphertext,
iv: secretBlindIndexData.saltIV,
tag: secretBlindIndexData.saltTag,
key: encryptionKey,
key: encryptionKey
});
}
throw InternalServerError({
message: "Failed to obtain workspace salt needed for secret blind indexing",
message: "Failed to obtain workspace salt needed for secret blind indexing"
});
};
@@ -210,7 +211,7 @@ export const getSecretBlindIndexSaltHelper = async ({
*/
export const generateSecretBlindIndexWithSaltHelper = async ({
secretName,
salt,
salt
}: {
secretName: string;
salt: string;
@@ -224,7 +225,7 @@ export const generateSecretBlindIndexWithSaltHelper = async ({
memoryCost: 65536, // default pool of 64 MiB per thread.
hashLength: 32,
parallelism: 1,
raw: true,
raw: true
})
).toString("base64");
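The blind index computed above is a deterministic argon2id hash of the secret name with a per-workspace salt, so secrets can be looked up by name without storing names in plaintext. A sketch of the same idea using Node's built-in scrypt as a stand-in for argon2id (the real helper uses the argon2 package with the parameters shown above; scrypt is substituted here only so the sketch is dependency-free):

```typescript
import { scryptSync } from "crypto";

// Deterministic keyed hash: the same (name, salt) pair always yields the
// same index, so it can serve as a database lookup key, while different
// workspace salts make indexes across workspaces unrelated.
function generateBlindIndex(secretName: string, salt: string): string {
  return scryptSync(secretName, salt, 32).toString("base64"); // 32-byte digest, like hashLength: 32
}
```

Determinism is the essential property: unlike the randomized-IV value encryption, the blind index must come out identical every time for `Secret.exists`/`find` queries on `secretBlindIndex` to work.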
@@ -240,7 +241,7 @@ export const generateSecretBlindIndexHelper = async ({
*/
export const generateSecretBlindIndexHelper = async ({
secretName,
workspaceId,
workspaceId
}: {
secretName: string;
workspaceId: Types.ObjectId;
@@ -250,16 +251,13 @@ export const generateSecretBlindIndexHelper = async ({
const rootEncryptionKey = await getRootEncryptionKey();
const secretBlindIndexData = await SecretBlindIndexData.findOne({
workspace: workspaceId,
workspace: workspaceId
}).select("+algorithm +keyEncoding");
if (!secretBlindIndexData) throw SecretBlindIndexDataNotFoundError();
let salt;
if (
rootEncryptionKey &&
secretBlindIndexData.keyEncoding === ENCODING_SCHEME_BASE64
) {
if (rootEncryptionKey && secretBlindIndexData.keyEncoding === ENCODING_SCHEME_BASE64) {
salt = client.decryptSymmetric(
secretBlindIndexData.encryptedSaltCiphertext,
rootEncryptionKey,
@@ -269,32 +267,29 @@ export const generateSecretBlindIndexHelper = async ({
const secretBlindIndex = await generateSecretBlindIndexWithSaltHelper({
secretName,
salt,
salt
});
return secretBlindIndex;
} else if (
encryptionKey &&
secretBlindIndexData.keyEncoding === ENCODING_SCHEME_UTF8
) {
} else if (encryptionKey && secretBlindIndexData.keyEncoding === ENCODING_SCHEME_UTF8) {
// decrypt workspace salt
salt = decryptSymmetric128BitHexKeyUTF8({
ciphertext: secretBlindIndexData.encryptedSaltCiphertext,
iv: secretBlindIndexData.saltIV,
tag: secretBlindIndexData.saltTag,
key: encryptionKey,
key: encryptionKey
});
const secretBlindIndex = await generateSecretBlindIndexWithSaltHelper({
secretName,
salt,
salt
});
return secretBlindIndex;
}
throw InternalServerError({
message: "Failed to generate secret blind index",
message: "Failed to generate secret blind index"
});
};
@@ -323,38 +318,32 @@ export const createSecretHelper = async ({
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
secretPath = "/",
secretPath = "/"
}: CreateSecretParams) => {
const secretBlindIndex = await generateSecretBlindIndexHelper({
secretName,
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
// if using service token filter towards the folderId by secretpath
if (authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = authData.authPayload;
if (secretPath !== serviceTkScopedSecretPath) {
if (!isValidScope(authData.authPayload, environment, secretPath)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
const folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
);
const folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
const exists = await Secret.exists({
secretBlindIndex,
workspace: new Types.ObjectId(workspaceId),
folder: folderId,
type,
...(type === SECRET_PERSONAL ? getAuthDataPayloadUserObj(authData) : {}),
...(type === SECRET_PERSONAL ? getAuthDataPayloadUserObj(authData) : {})
});
if (exists)
throw BadRequestError({
message: "Failed to create secret that already exists",
message: "Failed to create secret that already exists"
});
if (type === SECRET_PERSONAL) {
@@ -365,13 +354,12 @@ export const createSecretHelper = async ({
secretBlindIndex,
folder: folderId,
workspace: new Types.ObjectId(workspaceId),
type: SECRET_SHARED,
type: SECRET_SHARED
});
if (!exists)
throw BadRequestError({
message:
"Failed to create personal secret override for no corresponding shared secret",
message: "Failed to create personal secret override for no corresponding shared secret"
});
}
@@ -394,7 +382,7 @@ export const createSecretHelper = async ({
secretCommentTag,
folder: folderId,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
}).save();
const secretVersion = new SecretVersion({
@@ -414,12 +402,12 @@ export const createSecretHelper = async ({
secretValueIV,
secretValueTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
});
// (EE) add version for new secret
await EESecretService.addSecretVersions({
secretVersions: [secretVersion],
secretVersions: [secretVersion]
});
// (EE) create (audit) log
@@ -427,7 +415,7 @@ export const createSecretHelper = async ({
name: ACTION_ADD_SECRETS,
...getAuthDataPayloadIdObj(authData),
workspaceId,
secretIds: [secret._id],
secretIds: [secret._id]
});
action &&
@@ -436,14 +424,14 @@ export const createSecretHelper = async ({
workspaceId,
actions: [action],
channel: authData.authChannel,
ipAddress: authData.authIP,
ipAddress: authData.authIP
}));
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId,
environment,
folderId,
folderId
});
const postHogClient = await TelemetryService.getPostHogClient();
@@ -452,7 +440,7 @@ export const createSecretHelper = async ({
postHogClient.capture({
event: "secrets added",
distinctId: await TelemetryService.getDistinctId({
authData,
authData
}),
properties: {
numberOfSecrets: 1,
@@ -460,8 +448,8 @@ export const createSecretHelper = async ({
workspaceId,
folderId,
channel: authData.authChannel,
userAgent: authData.authUserAgent,
},
userAgent: authData.authUserAgent
}
});
}
@@ -480,21 +468,16 @@ export const getSecretsHelper = async ({
workspaceId,
environment,
authData,
secretPath = "/",
secretPath = "/"
}: GetSecretsParams) => {
let secrets: ISecret[] = [];
// if using service token filter towards the folderId by secretpath
if (authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = authData.authPayload;
if (secretPath !== serviceTkScopedSecretPath) {
if (!isValidScope(authData.authPayload, environment, secretPath)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
const folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
);
const folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
// get personal secrets first
secrets = await Secret.find({
@@ -502,8 +485,10 @@ export const getSecretsHelper = async ({
environment,
folder: folderId,
type: SECRET_PERSONAL,
...getAuthDataPayloadUserObj(authData),
}).populate("tags").lean();
...getAuthDataPayloadUserObj(authData)
})
.populate("tags")
.lean();
// concat with shared secrets
secrets = secrets.concat(
@@ -513,9 +498,11 @@ export const getSecretsHelper = async ({
folder: folderId,
type: SECRET_SHARED,
secretBlindIndex: {
$nin: secrets.map((secret) => secret.secretBlindIndex),
},
}).populate("tags").lean()
$nin: secrets.map((secret) => secret.secretBlindIndex)
}
})
.populate("tags")
.lean()
);
// (EE) create (audit) log
@@ -523,7 +510,7 @@ export const getSecretsHelper = async ({
name: ACTION_READ_SECRETS,
...getAuthDataPayloadIdObj(authData),
workspaceId,
secretIds: secrets.map((secret) => secret._id),
secretIds: secrets.map((secret) => secret._id)
});
action &&
@@ -532,7 +519,7 @@ export const getSecretsHelper = async ({
workspaceId,
actions: [action],
channel: authData.authChannel,
ipAddress: authData.authIP,
ipAddress: authData.authIP
}));
const postHogClient = await TelemetryService.getPostHogClient();
@@ -541,7 +528,7 @@ export const getSecretsHelper = async ({
postHogClient.capture({
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({
authData,
authData
}),
properties: {
numberOfSecrets: secrets.length,
@@ -549,8 +536,8 @@ export const getSecretsHelper = async ({
workspaceId,
folderId,
channel: authData.authChannel,
userAgent: authData.authUserAgent,
},
userAgent: authData.authUserAgent
}
});
}
@@ -573,25 +560,20 @@ export const getSecretHelper = async ({
environment,
type,
authData,
secretPath = "/",
secretPath = "/"
}: GetSecretParams) => {
const secretBlindIndex = await generateSecretBlindIndexHelper({
secretName,
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
let secret: ISecret | null = null;
// if using service token filter towards the folderId by secretpath
if (authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = authData.authPayload;
if (secretPath !== serviceTkScopedSecretPath) {
if (!isValidScope(authData.authPayload, environment, secretPath)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
const folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
);
const folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
// try getting personal secret first (if exists)
secret = await Secret.findOne({
@@ -600,7 +582,7 @@ export const getSecretHelper = async ({
environment,
folder: folderId,
type: type ?? SECRET_PERSONAL,
...(type === SECRET_PERSONAL ? getAuthDataPayloadUserObj(authData) : {}),
...(type === SECRET_PERSONAL ? getAuthDataPayloadUserObj(authData) : {})
}).lean();
if (!secret) {
@@ -611,7 +593,7 @@ export const getSecretHelper = async ({
workspace: new Types.ObjectId(workspaceId),
environment,
folder: folderId,
type: SECRET_SHARED,
type: SECRET_SHARED
}).lean();
}
@@ -622,7 +604,7 @@ export const getSecretHelper = async ({
name: ACTION_READ_SECRETS,
...getAuthDataPayloadIdObj(authData),
workspaceId,
secretIds: [secret._id],
secretIds: [secret._id]
});
action &&
@@ -631,7 +613,7 @@ export const getSecretHelper = async ({
workspaceId,
actions: [action],
channel: authData.authChannel,
ipAddress: authData.authIP,
ipAddress: authData.authIP
}));
const postHogClient = await TelemetryService.getPostHogClient();
@@ -640,7 +622,7 @@ export const getSecretHelper = async ({
postHogClient.capture({
event: "secrets pull",
distinctId: await TelemetryService.getDistinctId({
authData,
authData
}),
properties: {
numberOfSecrets: 1,
@@ -648,8 +630,8 @@ export const getSecretHelper = async ({
workspaceId,
folderId,
channel: authData.authChannel,
userAgent: authData.authUserAgent,
},
userAgent: authData.authUserAgent
}
});
}
@@ -679,26 +661,21 @@ export const updateSecretHelper = async ({
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretPath,
secretPath
}: UpdateSecretParams) => {
const secretBlindIndex = await generateSecretBlindIndexHelper({
secretName,
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
let secret: ISecret | null = null;
// if using service token filter towards the folderId by secretpath
if (authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = authData.authPayload;
if (secretPath !== serviceTkScopedSecretPath) {
if (!isValidScope(authData.authPayload, environment, secretPath)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
const folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
);
const folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
if (type === SECRET_SHARED) {
// case: update shared secret
@@ -708,16 +685,16 @@ export const updateSecretHelper = async ({
workspace: new Types.ObjectId(workspaceId),
environment,
folder: folderId,
type,
type
},
{
secretValueCiphertext,
secretValueIV,
secretValueTag,
$inc: { version: 1 },
$inc: { version: 1 }
},
{
new: true,
new: true
}
);
} else {
@@ -730,16 +707,16 @@ export const updateSecretHelper = async ({
environment,
type,
folder: folderId,
...getAuthDataPayloadUserObj(authData),
...getAuthDataPayloadUserObj(authData)
},
{
secretValueCiphertext,
secretValueIV,
secretValueTag,
$inc: { version: 1 },
$inc: { version: 1 }
},
{
new: true,
new: true
}
);
}
@@ -763,12 +740,12 @@ export const updateSecretHelper = async ({
secretValueIV,
secretValueTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
keyEncoding: ENCODING_SCHEME_UTF8
});
// (EE) add version for new secret
await EESecretService.addSecretVersions({
secretVersions: [secretVersion],
secretVersions: [secretVersion]
});
// (EE) create (audit) log
@@ -776,7 +753,7 @@ export const updateSecretHelper = async ({
name: ACTION_UPDATE_SECRETS,
...getAuthDataPayloadIdObj(authData),
workspaceId,
secretIds: [secret._id],
secretIds: [secret._id]
});
action &&
@@ -785,14 +762,14 @@ export const updateSecretHelper = async ({
workspaceId,
actions: [action],
channel: authData.authChannel,
ipAddress: authData.authIP,
ipAddress: authData.authIP
}));
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId,
environment,
folderId: secret?.folder,
folderId: secret?.folder
});
const postHogClient = await TelemetryService.getPostHogClient();
@@ -801,7 +778,7 @@ export const updateSecretHelper = async ({
postHogClient.capture({
event: "secrets modified",
distinctId: await TelemetryService.getDistinctId({
authData,
authData
}),
properties: {
numberOfSecrets: 1,
@@ -809,8 +786,8 @@ export const updateSecretHelper = async ({
workspaceId,
folderId,
channel: authData.authChannel,
userAgent: authData.authUserAgent,
},
userAgent: authData.authUserAgent
}
});
}
@@ -833,26 +810,20 @@ export const deleteSecretHelper = async ({
environment,
type,
authData,
secretPath = "/",
secretPath = "/"
}: DeleteSecretParams) => {
const secretBlindIndex = await generateSecretBlindIndexHelper({
secretName,
workspaceId: new Types.ObjectId(workspaceId),
workspaceId: new Types.ObjectId(workspaceId)
});
// if using service token filter towards the folderId by secretpath
if (authData.authPayload instanceof ServiceTokenData) {
const { secretPath: serviceTkScopedSecretPath } = authData.authPayload;
if (secretPath !== serviceTkScopedSecretPath) {
if (!isValidScope(authData.authPayload, environment, secretPath)) {
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
const folderId = await getFolderIdFromServiceToken(
workspaceId,
environment,
secretPath
);
const folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
let secrets: ISecret[] = [];
let secret: ISecret | null = null;
@@ -862,7 +833,7 @@ export const deleteSecretHelper = async ({
secretBlindIndex,
workspaceId: new Types.ObjectId(workspaceId),
environment,
folder: folderId,
folder: folderId
}).lean();
secret = await Secret.findOneAndDelete({
@@ -870,14 +841,14 @@ export const deleteSecretHelper = async ({
workspaceId: new Types.ObjectId(workspaceId),
environment,
type,
folder: folderId,
folder: folderId
}).lean();
await Secret.deleteMany({
secretBlindIndex,
workspaceId: new Types.ObjectId(workspaceId),
environment,
folder: folderId,
folder: folderId
});
} else {
secret = await Secret.findOneAndDelete({
@@ -886,7 +857,7 @@ export const deleteSecretHelper = async ({
workspaceId: new Types.ObjectId(workspaceId),
environment,
type,
...getAuthDataPayloadUserObj(authData),
...getAuthDataPayloadUserObj(authData)
}).lean();
if (secret) {
@@ -897,7 +868,7 @@ export const deleteSecretHelper = async ({
if (!secret) throw SecretNotFoundError();
await EESecretService.markDeletedSecretVersions({
secretIds: secrets.map((secret) => secret._id),
secretIds: secrets.map((secret) => secret._id)
});
// (EE) create (audit) log
@@ -905,22 +876,23 @@ export const deleteSecretHelper = async ({
name: ACTION_DELETE_SECRETS,
...getAuthDataPayloadIdObj(authData),
workspaceId,
secretIds: secrets.map((secret) => secret._id),
secretIds: secrets.map((secret) => secret._id)
});
action && (await EELogService.createLog({
...getAuthDataPayloadIdObj(authData),
workspaceId,
actions: [action],
channel: authData.authChannel,
ipAddress: authData.authIP,
}));
action &&
(await EELogService.createLog({
...getAuthDataPayloadIdObj(authData),
workspaceId,
actions: [action],
channel: authData.authChannel,
ipAddress: authData.authIP
}));
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId,
environment,
folderId: secret?.folder,
folderId: secret?.folder
});
const postHogClient = await TelemetryService.getPostHogClient();
@@ -929,7 +901,7 @@ export const deleteSecretHelper = async ({
postHogClient.capture({
event: "secrets deleted",
distinctId: await TelemetryService.getDistinctId({
authData,
authData
}),
properties: {
numberOfSecrets: secrets.length,
@@ -937,13 +909,13 @@ export const deleteSecretHelper = async ({
workspaceId,
folderId,
channel: authData.authChannel,
userAgent: authData.authUserAgent,
},
userAgent: authData.authUserAgent
}
});
}
return ({
return {
secrets,
secret,
});
secret
};
};
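The helpers in the hunks above repeatedly replace the old exact-path comparison with a call to `isValidScope(authData.authPayload, environment, secretPath)`, whose body is not shown in this diff. A minimal sketch of what such a check could look like, assuming the service token now carries a `scopes` array of `{ environment, secretPath }` pairs (the shape introduced later in this changeset); the names and exact-match semantics here are assumptions, not the project's actual implementation:

```typescript
// Assumed shape of one scope entry on a service token.
interface TokenScope {
  environment: string;
  secretPath: string;
}

// Sketch: a token may access a folder only if some scope entry matches
// both the requested environment and the secret path exactly.
function isValidScope(
  authPayload: { scopes: TokenScope[] },
  environment: string,
  secretPath: string
): boolean {
  return authPayload.scopes.some(
    (scope) => scope.environment === environment && scope.secretPath === secretPath
  );
}
```

Under this reading, a multi-scope token replaces the single `environment`/`secretPath` pair, which is why every helper funnels through one predicate instead of comparing `secretPath` directly.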

View File

@@ -5,11 +5,12 @@ import express from "express";
require("express-async-errors");
import helmet from "helmet";
import cors from "cors";
import { DatabaseService } from "./services";
import { DatabaseService, GithubSecretScanningService } from "./services";
import { EELicenseService } from "./ee/services";
import { setUpHealthEndpoint } from "./services/health";
import cookieParser from "cookie-parser";
import swaggerUi = require("swagger-ui-express");
import { Probot, createNodeMiddleware } from "probot";
// eslint-disable-next-line @typescript-eslint/no-var-requires
const swaggerFile = require("../spec.json");
// eslint-disable-next-line @typescript-eslint/no-var-requires
@@ -20,7 +21,7 @@ import {
organizations as eeOrganizationsRouter,
secret as eeSecretRouter,
secretSnapshot as eeSecretSnapshotRouter,
workspace as eeWorkspaceRouter,
workspace as eeWorkspaceRouter
} from "./ee/routes/v1";
import {
auth as v1AuthRouter,
@@ -34,41 +35,44 @@ import {
organization as v1OrganizationRouter,
password as v1PasswordRouter,
secret as v1SecretRouter,
secretScanning as v1SecretScanningRouter,
secretsFolder as v1SecretsFolder,
serviceToken as v1ServiceTokenRouter,
signup as v1SignupRouter,
userAction as v1UserActionRouter,
user as v1UserRouter,
workspace as v1WorkspaceRouter,
webhooks as v1WebhooksRouter,
workspace as v1WorkspaceRouter
} from "./routes/v1";
import {
signup as v2SignupRouter,
auth as v2AuthRouter,
users as v2UsersRouter,
organizations as v2OrganizationsRouter,
signup as v2SignupRouter,
users as v2UsersRouter,
workspace as v2WorkspaceRouter,
secret as v2SecretRouter, // begin to phase out
secrets as v2SecretsRouter,
serviceTokenData as v2ServiceTokenDataRouter,
serviceAccounts as v2ServiceAccountsRouter,
apiKeyData as v2APIKeyDataRouter,
environment as v2EnvironmentRouter,
tags as v2TagsRouter,
tags as v2TagsRouter
} from "./routes/v2";
import {
auth as v3AuthRouter,
secrets as v3SecretsRouter,
signup as v3SignupRouter,
workspaces as v3WorkspacesRouter,
workspaces as v3WorkspacesRouter
} from "./routes/v3";
import { healthCheck } from "./routes/status";
import { getLogger } from "./utils/logger";
import { RouteNotFoundError } from "./utils/errors";
import { requestErrorHandler } from "./middleware/requestErrorHandler";
import { getNodeEnv, getPort, getSiteURL } from "./config";
import { getNodeEnv, getPort, getSecretScanningGitAppId, getSecretScanningPrivateKey, getSecretScanningWebhookProxy, getSecretScanningWebhookSecret, getSiteURL } from "./config";
import { setup } from "./utils/setup";
const SmeeClient = require('smee-client') // eslint-disable-line
const main = async () => {
await setup();
await EELicenseService.initGlobalFeatureSet();
@@ -80,10 +84,30 @@ const main = async () => {
app.use(
cors({
credentials: true,
origin: await getSiteURL(),
origin: await getSiteURL()
})
);
if (await getSecretScanningGitAppId() && await getSecretScanningWebhookSecret() && await getSecretScanningPrivateKey()) {
const probot = new Probot({
appId: await getSecretScanningGitAppId(),
privateKey: await getSecretScanningPrivateKey(),
secret: await getSecretScanningWebhookSecret(),
});
if ((await getNodeEnv()) != "production") {
const smee = new SmeeClient({
source: await getSecretScanningWebhookProxy(),
target: "http://backend:4000/ss-webhook",
logger: console
})
smee.start()
}
app.use(createNodeMiddleware(GithubSecretScanningService, { probot, webhooksPath: "/ss-webhook" })); // secret scanning webhook
}
if ((await getNodeEnv()) === "production") {
// enable app-wide rate-limiting + helmet security
// in production
@@ -125,6 +149,8 @@ const main = async () => {
app.use("/api/v1/integration", v1IntegrationRouter);
app.use("/api/v1/integration-auth", v1IntegrationAuthRouter);
app.use("/api/v1/folders", v1SecretsFolder);
app.use("/api/v1/secret-scanning", v1SecretScanningRouter);
app.use("/api/v1/webhooks", v1WebhooksRouter);
// v2 routes (improvements)
app.use("/api/v2/signup", v2SignupRouter);
@@ -138,7 +164,6 @@ const main = async () => {
app.use("/api/v2/secrets", v2SecretsRouter); // note: in the process of moving to v3/secrets
app.use("/api/v2/service-token", v2ServiceTokenDataRouter);
app.use("/api/v2/service-accounts", v2ServiceAccountsRouter); // new
app.use("/api/v2/api-key", v2APIKeyDataRouter);
// v3 routes (experimental)
app.use("/api/v3/auth", v3AuthRouter);
@@ -157,7 +182,7 @@ const main = async () => {
if (res.headersSent) return next();
next(
RouteNotFoundError({
message: `The requested source '(${req.method})${req.url}' was not found`,
message: `The requested source '(${req.method})${req.url}' was not found`
})
);
});
@@ -165,9 +190,7 @@ const main = async () => {
app.use(requestErrorHandler);
const server = app.listen(await getPort(), async () => {
(await getLogger("backend-main")).info(
`Server started listening at port ${await getPort()}`
);
(await getLogger("backend-main")).info(`Server started listening at port ${await getPort()}`);
});
// await createTestUserForDevelopment();

View File

@@ -9,15 +9,17 @@ import {
INTEGRATION_CHECKLY_API_URL,
INTEGRATION_CIRCLECI,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CLOUDFLARE_PAGES_API_URL,
INTEGRATION_FLYIO,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CLOUDFLARE_PAGES_API_URL,
INTEGRATION_GITLAB_API_URL,
INTEGRATION_HEROKU,
INTEGRATION_HEROKU_API_URL,
INTEGRATION_LARAVELFORGE,
INTEGRATION_LARAVELFORGE_API_URL,
INTEGRATION_NETLIFY,
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_RAILWAY,
@@ -116,6 +118,12 @@ const getApps = async ({
accessToken,
});
break;
case INTEGRATION_LARAVELFORGE:
apps = await getAppsLaravelForge({
accessToken,
serverId: accessId
});
break;
case INTEGRATION_TRAVISCI:
apps = await getAppsTravisCI({
accessToken,
@@ -398,6 +406,40 @@ const getAppsRailway = async ({ accessToken }: { accessToken: string }) => {
return apps;
};
/**
* Return list of sites for Laravel Forge integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Laravel Forge API
* @param {String} obj.serverId - server id of Laravel Forge
* @returns {Object[]} apps - names and ids of Laravel Forge sites
* @returns {String} apps.name - name of Laravel Forge sites
* @returns {String} apps.appId - id of Laravel Forge sites
*/
const getAppsLaravelForge = async ({
accessToken,
serverId
}: {
accessToken: string;
serverId?: string;
}) => {
const res = (
await standardRequest.get(`${INTEGRATION_LARAVELFORGE_API_URL}/api/v1/servers/${serverId}/sites`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"Content-Type": "application/json",
},
})
).data.sites;
const apps = res.map((a: any) => ({
name: a.name,
appId: a.id,
}));
return apps;
};
/**
* Return list of apps for Fly.io integration
* @param {Object} obj

View File

@@ -18,6 +18,8 @@ import {
INTEGRATION_CHECKLY_API_URL,
INTEGRATION_CIRCLECI,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CLOUDFLARE_PAGES_API_URL,
INTEGRATION_FLYIO,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_GITHUB,
@@ -26,6 +28,8 @@ import {
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_HEROKU_API_URL,
INTEGRATION_LARAVELFORGE,
INTEGRATION_LARAVELFORGE_API_URL,
INTEGRATION_NETLIFY,
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_RAILWAY,
@@ -34,8 +38,6 @@ import {
INTEGRATION_RENDER_API_URL,
INTEGRATION_SUPABASE,
INTEGRATION_SUPABASE_API_URL,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CLOUDFLARE_PAGES_API_URL,
INTEGRATION_TRAVISCI,
INTEGRATION_TRAVISCI_API_URL,
INTEGRATION_VERCEL,
@@ -154,6 +156,14 @@ const syncSecrets = async ({
accessToken,
});
break;
case INTEGRATION_LARAVELFORGE:
await syncSecretsLaravelForge({
integration,
secrets,
accessId,
accessToken,
});
break;
case INTEGRATION_TRAVISCI:
await syncSecretsTravisCI({
integration,
@@ -168,58 +178,30 @@ const syncSecrets = async ({
accessToken,
});
break;
case INTEGRATION_FLYIO:
await syncSecretsFlyio({
case INTEGRATION_CHECKLY:
await syncSecretsCheckly({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_HASHICORP_VAULT:
await syncSecretsHashiCorpVault({
integration,
integrationAuth,
secrets,
accessId,
accessToken,
});
break;
case INTEGRATION_CLOUDFLARE_PAGES:
await syncSecretsCloudflarePages({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_CIRCLECI:
await syncSecretsCircleCI({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_TRAVISCI:
await syncSecretsTravisCI({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_SUPABASE:
await syncSecretsSupabase({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_CHECKLY:
await syncSecretsCheckly({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_HASHICORP_VAULT:
await syncSecretsHashiCorpVault({
integration,
integrationAuth,
secrets,
accessId,
accessToken,
});
break;
case INTEGRATION_CLOUDFLARE_PAGES:
await syncSecretsCloudflarePages({
integration,
secrets,
accessId,
accessToken
});
break;
accessToken
});
break;
}
};
@@ -1167,6 +1149,48 @@ const syncSecretsRender = async ({
);
};
/**
* Sync/push [secrets] to Laravel Forge sites with id [integration.appId]
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for Laravel Forge integration
*/
const syncSecretsLaravelForge = async ({
integration,
secrets,
accessId,
accessToken,
}: {
integration: IIntegration;
secrets: any;
accessId: string | null;
accessToken: string;
}) => {
function transformObjectToString(obj: any) {
let result = "";
for (const key in obj) {
result += `${key}=${obj[key]}\n`;
}
return result;
}
await standardRequest.put(
`${INTEGRATION_LARAVELFORGE_API_URL}/api/v1/servers/${accessId}/sites/${integration.appId}/env`,
{
content: transformObjectToString(secrets),
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"Content-Type": "application/json",
},
}
);
};
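The `transformObjectToString` helper in the hunk above serializes the secrets map into dotenv-style lines before the PUT to Laravel Forge's `/env` endpoint. A standalone copy of the same serialization, for illustration:

```typescript
// Turn a key/value map into newline-separated KEY=VALUE lines
// (dotenv-style), since the Forge env endpoint takes one string payload.
function transformObjectToString(obj: Record<string, string>): string {
  let result = "";
  for (const key in obj) {
    result += `${key}=${obj[key]}\n`;
  }
  return result;
}
```

For example, `{ DB_HOST: "localhost", DB_PORT: "5432" }` serializes to `"DB_HOST=localhost\nDB_PORT=5432\n"`. Note this simple form does not quote or escape values containing newlines or `=` characters.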
/**
* Sync/push [secrets] to Railway project with id [integration.appId]
* @param {Object} obj
@@ -1874,7 +1898,7 @@ const syncSecretsCloudflarePages = async ({
}
)
)
.data.result['deployment_configs'][integration.targetEnvironment]['env_vars'];
.data.result["deployment_configs"][integration.targetEnvironment]["env_vars"];
// copy the secrets object, so we can set deleted keys to null
const secretsObj: any = {...secrets};

View File

@@ -1,6 +1,7 @@
import * as Sentry from "@sentry/node";
import { ErrorRequestHandler } from "express";
import { InternalServerError } from "../utils/errors";
import { TokenExpiredError } from "jsonwebtoken";
import { InternalServerError, UnauthorizedRequestError } from "../utils/errors";
import { getLogger } from "../utils/logger";
import RequestError, { LogLevel } from "../utils/requestError";
import { getNodeEnv } from "../config";
@@ -19,7 +20,9 @@ export const requestErrorHandler: ErrorRequestHandler = async (
}
//TODO: Find better way to type check for error. In current setting you need to cast type to get the functions and variables from RequestError
if (!(error instanceof RequestError)) {
if (error instanceof TokenExpiredError) {
error = UnauthorizedRequestError({ stack: error.stack, message: "Token expired" });
} else if (!(error instanceof RequestError)) {
error = InternalServerError({
context: { exception: error.message },
stack: error.stack,

View File

@@ -0,0 +1,34 @@
import { Schema, Types, model } from "mongoose";
type GitAppInstallationSession = {
id: string;
sessionId: string;
organization: Types.ObjectId;
user: Types.ObjectId;
}
const gitAppInstallationSession = new Schema<GitAppInstallationSession>({
id: {
required: true,
type: String,
},
sessionId: {
type: String,
required: true,
unique: true
},
organization: {
type: Schema.Types.ObjectId,
required: true,
unique: true
},
user: {
type: Schema.Types.ObjectId,
ref: "User"
}
});
const GitAppInstallationSession = model<GitAppInstallationSession>("git_app_installation_session", gitAppInstallationSession);
export default GitAppInstallationSession;

View File

@@ -0,0 +1,31 @@
import { Schema, model } from "mongoose";
type Installation = {
installationId: string
organizationId: string
user: Schema.Types.ObjectId
};
const gitAppOrganizationInstallation = new Schema<Installation>({
installationId: {
type: String,
required: true,
unique: true
},
organizationId: {
type: String,
required: true,
unique: true
},
user: {
type: Schema.Types.ObjectId,
ref: "User",
required: true,
}
});
const GitAppOrganizationInstallation = model<Installation>("git_app_organization_installation", gitAppOrganizationInstallation);
export default GitAppOrganizationInstallation;

View File

@@ -0,0 +1,152 @@
import { Schema, model } from "mongoose";
export const STATUS_RESOLVED_FALSE_POSITIVE = "RESOLVED_FALSE_POSITIVE";
export const STATUS_RESOLVED_REVOKED = "RESOLVED_REVOKED";
export const STATUS_RESOLVED_NOT_REVOKED = "RESOLVED_NOT_REVOKED";
export const STATUS_UNRESOLVED = "UNRESOLVED";
export type GitRisks = {
id: string;
description: string;
startLine: string;
endLine: string;
startColumn: string;
endColumn: string;
match: string;
secret: string;
file: string;
symlinkFile: string;
commit: string;
entropy: string;
author: string;
email: string;
date: string;
message: string;
tags: string[];
ruleID: string;
fingerprint: string;
fingerPrintWithoutCommitId: string
isFalsePositive: boolean; // New field for marking risks as false positives
isResolved: boolean; // New field for marking risks as resolved
riskOwner: string | null; // New field for setting a risk owner (nullable string)
installationId: string,
repositoryId: string,
repositoryLink: string
repositoryFullName: string
status: string
pusher: {
name: string,
email: string
},
organization: Schema.Types.ObjectId,
}
const gitRisks = new Schema<GitRisks>({
id: {
type: String,
},
description: {
type: String,
},
startLine: {
type: String,
},
endLine: {
type: String,
},
startColumn: {
type: String,
},
endColumn: {
type: String,
},
file: {
type: String,
},
symlinkFile: {
type: String,
},
commit: {
type: String,
},
entropy: {
type: String,
},
author: {
type: String,
},
email: {
type: String,
},
date: {
type: String,
},
message: {
type: String,
},
tags: {
type: [String],
},
ruleID: {
type: String,
},
fingerprint: {
type: String,
unique: true
},
fingerPrintWithoutCommitId: {
type: String,
},
isFalsePositive: {
type: Boolean,
default: false
},
isResolved: {
type: Boolean,
default: false
},
riskOwner: {
type: String,
default: null
},
installationId: {
type: String,
required: true
},
repositoryId: {
type: String
},
repositoryLink: {
type: String
},
repositoryFullName: {
type: String
},
pusher: {
name: {
type: String
},
email: {
type: String
},
},
organization: {
type: Schema.Types.ObjectId,
ref: "Organization",
},
status: {
type: String,
enum: [
STATUS_RESOLVED_FALSE_POSITIVE,
STATUS_RESOLVED_REVOKED,
STATUS_RESOLVED_NOT_REVOKED,
STATUS_UNRESOLVED
],
default: STATUS_UNRESOLVED
}
}, { timestamps: true });
const GitRisks = model<GitRisks>("GitRisks", gitRisks);
export default GitRisks;
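The `GitRisks` model above tracks resolution both through the `status` enum and through the `isResolved`/`isFalsePositive` booleans, which must be kept consistent. A hypothetical helper (illustrative only, not part of this changeset) showing one way to derive the flags from a chosen status:

```typescript
// Status constants mirrored from the gitRisks model above.
const STATUS_RESOLVED_FALSE_POSITIVE = "RESOLVED_FALSE_POSITIVE";
const STATUS_UNRESOLVED = "UNRESOLVED";

// Hypothetical helper: derive the boolean resolution flags from a status,
// so a single update path keeps the enum and the flags in sync.
function deriveRiskFlags(status: string): { isResolved: boolean; isFalsePositive: boolean } {
  return {
    isResolved: status !== STATUS_UNRESOLVED,
    isFalsePositive: status === STATUS_RESOLVED_FALSE_POSITIVE
  };
}
```

Any of the three `RESOLVED_*` statuses would mark the risk resolved, while only `RESOLVED_FALSE_POSITIVE` sets the false-positive flag.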

View File

@@ -16,13 +16,14 @@ import ServiceAccountKey, { IServiceAccountKey } from "./serviceAccountKey"; //
import ServiceAccountOrganizationPermission, { IServiceAccountOrganizationPermission } from "./serviceAccountOrganizationPermission"; // new
import ServiceAccountWorkspacePermission, { IServiceAccountWorkspacePermission } from "./serviceAccountWorkspacePermission"; // new
import TokenData, { ITokenData } from "./tokenData";
import User,{ AuthProvider, IUser } from "./user";
import User, { AuthProvider, IUser } from "./user";
import UserAction, { IUserAction } from "./userAction";
import Workspace, { IWorkspace } from "./workspace";
import ServiceTokenData, { IServiceTokenData } from "./serviceTokenData";
import APIKeyData, { IAPIKeyData } from "./apiKeyData";
import LoginSRPDetail, { ILoginSRPDetail } from "./loginSRPDetail";
import TokenVersion, { ITokenVersion } from "./tokenVersion";
import GitRisks, { STATUS_RESOLVED_FALSE_POSITIVE } from "./gitRisks";
export {
AuthProvider,
@@ -76,4 +77,6 @@ export {
ILoginSRPDetail,
TokenVersion,
ITokenVersion,
GitRisks,
STATUS_RESOLVED_FALSE_POSITIVE
};

View File

@@ -5,16 +5,17 @@ import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_CHECKLY,
INTEGRATION_CIRCLECI,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_FLYIO,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
INTEGRATION_RAILWAY,
INTEGRATION_RENDER,
INTEGRATION_SUPABASE,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL,
} from "../variables";
@@ -48,6 +49,7 @@ export interface IIntegration {
| "railway"
| "flyio"
| "circleci"
| "laravel-forge"
| "travisci"
| "supabase"
| "checkly"
@@ -136,6 +138,7 @@ const integrationSchema = new Schema<IIntegration>(
INTEGRATION_RAILWAY,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,

View File

@@ -7,24 +7,25 @@ import {
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_CIRCLECI,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_FLYIO,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
INTEGRATION_RAILWAY,
INTEGRATION_RENDER,
INTEGRATION_SUPABASE,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL,
INTEGRATION_VERCEL
} from "../variables";
export interface IIntegrationAuth extends Document {
_id: Types.ObjectId;
workspace: Types.ObjectId;
integration: 'heroku' | 'vercel' | 'netlify' | 'github' | 'gitlab' | 'render' | 'railway' | 'flyio' | 'azure-key-vault' | 'circleci' | 'travisci' | 'supabase' | 'aws-parameter-store' | 'aws-secret-manager' | 'checkly' | 'cloudflare-pages';
integration: "heroku" | "vercel" | "netlify" | "github" | "gitlab" | "render" | "railway" | "flyio" | "azure-key-vault" | "laravel-forge" | "circleci" | "travisci" | "supabase" | "aws-parameter-store" | "aws-secret-manager" | "checkly" | "cloudflare-pages";
teamId: string;
accountId: string;
url: string;
@@ -65,6 +66,7 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
INTEGRATION_RAILWAY,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_SUPABASE,
INTEGRATION_HASHICORP_VAULT,

View File

@@ -4,7 +4,10 @@ export interface IServiceTokenData extends Document {
_id: Types.ObjectId;
name: string;
workspace: Types.ObjectId;
environment: string;
scopes: Array<{
environment: string;
secretPath: string;
}>;
user: Types.ObjectId;
serviceAccount: Types.ObjectId;
lastUsed: Date;
@@ -13,7 +16,6 @@ export interface IServiceTokenData extends Document {
encryptedKey: string;
iv: string;
tag: string;
secretPath: string;
permissions: string[];
}
@@ -21,68 +23,72 @@ const serviceTokenDataSchema = new Schema<IServiceTokenData>(
{
name: {
type: String,
required: true,
required: true
},
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace",
required: true,
required: true
},
environment: {
type: String,
required: true,
scopes: {
type: [
{
environment: {
type: String,
required: true
},
secretPath: {
type: String,
default: "/",
required: true
}
}
],
required: true
},
user: {
type: Schema.Types.ObjectId,
ref: "User",
required: true,
required: true
},
serviceAccount: {
type: Schema.Types.ObjectId,
ref: "ServiceAccount",
ref: "ServiceAccount"
},
lastUsed: {
type: Date,
type: Date
},
expiresAt: {
type: Date,
type: Date
},
secretHash: {
type: String,
required: true,
select: false,
select: false
},
encryptedKey: {
type: String,
select: false,
select: false
},
iv: {
type: String,
select: false,
select: false
},
tag: {
type: String,
select: false,
select: false
},
permissions: {
type: [String],
enum: ["read", "write"],
default: ["read"],
},
secretPath: {
type: String,
default: "/",
required: true,
},
default: ["read"]
}
},
{
timestamps: true,
timestamps: true
}
);
const ServiceTokenData = model<IServiceTokenData>(
"ServiceTokenData",
serviceTokenDataSchema
);
const ServiceTokenData = model<IServiceTokenData>("ServiceTokenData", serviceTokenDataSchema);
export default ServiceTokenData;

View File

@@ -0,0 +1,85 @@
import { Document, Schema, Types, model } from "mongoose";
import { ALGORITHM_AES_256_GCM, ENCODING_SCHEME_BASE64, ENCODING_SCHEME_UTF8 } from "../variables";
export interface IWebhook extends Document {
_id: Types.ObjectId;
workspace: Types.ObjectId;
environment: string;
secretPath: string;
url: string;
lastStatus: "success" | "failed";
lastRunErrorMessage?: string;
isDisabled: boolean;
encryptedSecretKey: string;
iv: string;
tag: string;
algorithm: "aes-256-gcm";
keyEncoding: "base64" | "utf8";
}
const WebhookSchema = new Schema<IWebhook>(
{
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace",
required: true
},
environment: {
type: String,
required: true
},
secretPath: {
type: String,
required: true,
default: "/"
},
url: {
type: String,
required: true
},
lastStatus: {
type: String,
enum: ["success", "failed"]
},
lastRunErrorMessage: {
type: String
},
isDisabled: {
type: Boolean,
default: false
},
// used for webhook signature
encryptedSecretKey: {
type: String,
select: false
},
iv: {
type: String,
select: false
},
tag: {
type: String,
select: false
},
algorithm: {
// the encryption algorithm used
type: String,
enum: [ALGORITHM_AES_256_GCM],
required: true,
select: false
},
keyEncoding: {
type: String,
enum: [ENCODING_SCHEME_UTF8, ENCODING_SCHEME_BASE64],
required: true,
select: false
}
},
{
timestamps: true
}
);
const Webhook = model<IWebhook>("Webhook", WebhookSchema);
export default Webhook;


@ -1,5 +1,5 @@
import express, { Request, Response } from "express";
import { getSecretScanningGitAppId, getSecretScanningPrivateKey, getSecretScanningWebhookSecret, getSmtpConfigured } from "../../config";
const router = express.Router();
@ -10,6 +10,7 @@ router.get(
date: new Date(),
message: "Ok",
emailConfigured: await getSmtpConfigured(),
secretScanningConfigured: await getSecretScanningGitAppId() && await getSecretScanningWebhookSecret() && await getSecretScanningPrivateKey(),
})
}
);


@ -15,23 +15,27 @@ import password from "./password";
import integration from "./integration";
import integrationAuth from "./integrationAuth";
import secretsFolder from "./secretsFolder";
import secretScanning from "./secretScanning";
import webhooks from "./webhook";
export {
signup,
auth,
bot,
user,
userAction,
organization,
workspace,
membershipOrg,
membership,
key,
inviteOrg,
secret,
serviceToken,
password,
integration,
integrationAuth,
secretsFolder,
secretScanning,
webhooks
};


@ -0,0 +1,81 @@
import express from "express";
const router = express.Router();
import {
requireAuth,
requireOrganizationAuth,
validateRequest,
} from "../../middleware";
import { body, param } from "express-validator";
import { createInstallationSession, getCurrentOrganizationInstallationStatus, getRisksForOrganization, linkInstallationToOrganization, updateRisksStatus } from "../../controllers/v1/secretScanningController";
import { ACCEPTED, ADMIN, MEMBER, OWNER } from "../../variables";
router.post(
"/create-installation-session/organization/:organizationId",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
param("organizationId").exists().trim(),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
validateRequest,
createInstallationSession
);
router.post(
"/link-installation",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
body("installationId").exists().trim(),
body("sessionId").exists().trim(),
validateRequest,
linkInstallationToOrganization
);
router.get(
"/installation-status/organization/:organizationId",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
param("organizationId").exists().trim(),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
validateRequest,
getCurrentOrganizationInstallationStatus
);
router.get(
"/organization/:organizationId/risks",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
param("organizationId").exists().trim(),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
validateRequest,
getRisksForOrganization
);
router.post(
"/organization/:organizationId/risks/:riskId/status",
requireAuth({
acceptedAuthModes: ["jwt"],
}),
param("organizationId").exists().trim(),
param("riskId").exists().trim(),
body("status").exists(),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED],
}),
validateRequest,
updateRisksStatus
);
export default router;


@ -0,0 +1,75 @@
import express from "express";
const router = express.Router();
import { requireAuth, requireWorkspaceAuth, validateRequest } from "../../middleware";
import { body, param, query } from "express-validator";
import { ADMIN, AUTH_MODE_JWT, AUTH_MODE_SERVICE_ACCOUNT, MEMBER } from "../../variables";
import { webhookController } from "../../controllers/v1";
router.post(
"/",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "body",
locationEnvironment: "body"
}),
body("workspaceId").exists().isString().trim(),
body("environment").exists().isString().trim(),
body("webhookUrl").exists().isString().isURL().trim(),
body("webhookSecretKey").isString().trim(),
body("secretPath").default("/").isString().trim(),
validateRequest,
webhookController.createWebhook
);
router.patch(
"/:webhookId",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT]
}),
param("webhookId").exists().isString().trim(),
body("isDisabled").default(false).isBoolean(),
validateRequest,
webhookController.updateWebhook
);
router.post(
"/:webhookId/test",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT]
}),
param("webhookId").exists().isString().trim(),
validateRequest,
webhookController.testWebhook
);
router.delete(
"/:webhookId",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT]
}),
param("webhookId").exists().isString().trim(),
validateRequest,
webhookController.deleteWebhook
);
router.get(
"/",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "query",
locationEnvironment: "query"
}),
query("workspaceId").exists().isString().trim(),
query("environment").optional().isString().trim(),
query("secretPath").optional().isString().trim(),
validateRequest,
webhookController.listWebhooks
);
export default router;


@ -1,42 +0,0 @@
import express from "express";
const router = express.Router();
import { body, param } from "express-validator";
import {
requireAuth,
validateRequest,
} from "../../middleware";
import { apiKeyDataController } from "../../controllers/v2";
import {
AUTH_MODE_JWT,
} from "../../variables";
router.get(
"/",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
apiKeyDataController.getAPIKeyData
);
router.post(
"/",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
body("name").exists().trim(),
body("expiresIn"), // measured in ms
validateRequest,
apiKeyDataController.createAPIKeyData
);
router.delete(
"/:apiKeyDataId",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
param("apiKeyDataId").exists().trim(),
validateRequest,
apiKeyDataController.deleteAPIKeyData
);
export default router;


@ -7,7 +7,6 @@ import secret from "./secret"; // deprecated
import secrets from "./secrets";
import serviceTokenData from "./serviceTokenData";
import serviceAccounts from "./serviceAccounts";
import apiKeyData from "./apiKeyData";
import environment from "./environment"
import tags from "./tags"
@ -21,7 +20,6 @@ export {
secrets,
serviceTokenData,
serviceAccounts,
apiKeyData,
environment,
tags,
}


@ -5,7 +5,7 @@ import {
requireAuth,
requireSecretsAuth,
requireWorkspaceAuth,
validateRequest
} from "../../middleware";
import { validateClientForSecrets } from "../../validation";
import { body, query } from "express-validator";
@ -20,22 +20,18 @@ import {
PERMISSION_READ_SECRETS,
PERMISSION_WRITE_SECRETS,
SECRET_PERSONAL,
SECRET_SHARED
} from "../../variables";
import { BatchSecretRequest } from "../../types/secret";
router.post(
"/batch",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY, AUTH_MODE_SERVICE_TOKEN]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "body"
}),
body("workspaceId").exists().isString().trim(),
body("folderId").default("root").isString().trim(),
@ -52,10 +48,8 @@ router.post(
if (secretIds.length > 0) {
req.secrets = await validateClientForSecrets({
authData: req.authData,
secretIds: secretIds.map((secretId: string) => new Types.ObjectId(secretId)),
requiredPermissions: []
});
}
}
@ -76,14 +70,11 @@ router.post(
.custom((value) => {
if (Array.isArray(value)) {
// case: create multiple secrets
if (value.length === 0) throw new Error("secrets cannot be an empty array");
for (const secret of value) {
if (
!secret.type ||
!(secret.type === SECRET_PERSONAL || secret.type === SECRET_SHARED) ||
!secret.secretKeyCiphertext ||
!secret.secretKeyIV ||
!secret.secretKeyTag ||
@ -108,9 +99,7 @@ router.post(
!value.secretValueIV ||
!value.secretValueTag
) {
throw new Error("secrets object is missing required secret properties");
}
} else {
throw new Error("secrets must be an object or an array of objects");
@ -120,17 +109,13 @@ router.post(
}),
validateRequest,
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY, AUTH_MODE_SERVICE_TOKEN]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "body",
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS]
}),
secretsController.createSecrets
);
@ -148,14 +133,14 @@ router.get(
AUTH_MODE_JWT,
AUTH_MODE_API_KEY,
AUTH_MODE_SERVICE_TOKEN,
AUTH_MODE_SERVICE_ACCOUNT
]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "query",
locationEnvironment: "query",
requiredPermissions: [PERMISSION_READ_SECRETS]
}),
secretsController.getSecrets
);
@ -167,8 +152,7 @@ router.patch(
.custom((value) => {
if (Array.isArray(value)) {
// case: update multiple secrets
if (value.length === 0) throw new Error("secrets cannot be an empty array");
for (const secret of value) {
if (!secret.id) {
throw new Error("Each secret must contain an ID property");
@ -187,15 +171,11 @@ router.patch(
}),
validateRequest,
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY, AUTH_MODE_SERVICE_TOKEN]
}),
requireSecretsAuth({
acceptedRoles: [ADMIN, MEMBER],
requiredPermissions: [PERMISSION_WRITE_SECRETS]
}),
secretsController.updateSecrets
);
@ -210,8 +190,7 @@ router.delete(
if (Array.isArray(value)) {
// case: delete multiple secrets
if (value.length === 0) throw new Error("secrets cannot be an empty array");
return value.every((id: string) => typeof id === "string");
}
@ -221,15 +200,11 @@ router.delete(
.isEmpty(),
validateRequest,
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY, AUTH_MODE_SERVICE_TOKEN]
}),
requireSecretsAuth({
acceptedRoles: [ADMIN, MEMBER],
requiredPermissions: [PERMISSION_WRITE_SECRETS]
}),
secretsController.deleteSecrets
);


@ -4,7 +4,7 @@ import {
requireAuth,
requireServiceTokenDataAuth,
requireWorkspaceAuth,
validateRequest
} from "../../middleware";
import { body, param } from "express-validator";
import {
@ -13,14 +13,14 @@ import {
AUTH_MODE_SERVICE_ACCOUNT,
AUTH_MODE_SERVICE_TOKEN,
MEMBER,
PERMISSION_WRITE_SECRETS
} from "../../variables";
import { serviceTokenDataController } from "../../controllers/v2";
router.get(
"/",
requireAuth({
acceptedAuthModes: [AUTH_MODE_SERVICE_TOKEN]
}),
serviceTokenDataController.getServiceTokenData
);
@ -28,33 +28,30 @@ router.get(
router.post(
"/",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_SERVICE_ACCOUNT]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "body",
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS]
}),
body("name").exists().isString().trim(),
body("workspaceId").exists().isString().trim(),
body("environment").exists().isString().trim(),
body("scopes").exists().isArray(),
body("scopes.*.environment").exists().isString().trim(),
body("scopes.*.secretPath").exists().isString().trim(),
body("encryptedKey").exists().isString().trim(),
body("iv").exists().isString().trim(),
body("tag").exists().isString().trim(),
body("expiresIn").exists().isNumeric(), // measured in ms
body("permissions")
.isArray({ min: 1 })
.custom((value: string[]) => {
const allowedPermissions = ["read", "write"];
const invalidValues = value.filter((v) => !allowedPermissions.includes(v));
if (invalidValues.length > 0) {
throw new Error(`permissions contains invalid values: ${invalidValues.join(", ")}`);
}
return true;
@ -66,10 +63,10 @@ router.post(
router.delete(
"/:serviceTokenDataId",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT]
}),
requireServiceTokenDataAuth({
acceptedRoles: [ADMIN, MEMBER]
}),
param("serviceTokenDataId").exists().trim(),
validateRequest,


@ -4,7 +4,7 @@ import {
requireAuth,
validateRequest,
} from "../../middleware";
import { body, param } from "express-validator";
import { usersController } from "../../controllers/v2";
import {
AUTH_MODE_API_KEY,
@ -37,4 +37,49 @@ router.get(
usersController.getMyOrganizations
);
router.get(
"/me/api-keys",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
usersController.getMyAPIKeys
);
router.post(
"/me/api-keys",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
body("name").exists().isString().trim(),
body("expiresIn").isNumeric(),
validateRequest,
usersController.createAPIKey
);
router.delete(
"/me/api-keys/:apiKeyDataId",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
param("apiKeyDataId").exists().trim(),
validateRequest,
usersController.deleteAPIKey
);
router.get( // new
"/me/sessions",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
usersController.getMySessions
);
router.delete( // new
"/me/sessions",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
usersController.deleteMySessions
);
export default router;


@ -0,0 +1,248 @@
import { Probot } from "probot";
import { exec } from "child_process";
import { mkdir, readFile, rm, writeFile } from "fs";
import { tmpdir } from "os";
import { join } from "path"
import GitRisks from "../models/gitRisks";
import GitAppOrganizationInstallation from "../models/gitAppOrganizationInstallation";
import MembershipOrg from "../models/membershipOrg";
import { ADMIN, OWNER } from "../variables";
import User from "../models/user";
import { sendMail } from "../helpers";
import TelemetryService from "./TelemetryService";
type SecretMatch = {
Description: string;
StartLine: number;
EndLine: number;
StartColumn: number;
EndColumn: number;
Match: string;
Secret: string;
File: string;
SymlinkFile: string;
Commit: string;
Entropy: number;
Author: string;
Email: string;
Date: string;
Message: string;
Tags: string[];
RuleID: string;
Fingerprint: string;
FingerPrintWithoutCommitId: string
};
export default async (app: Probot) => {
app.on("installation.deleted", async (context) => {
const { payload } = context;
const { installation, repositories } = payload;
if (installation.repository_selection == "all") {
await GitRisks.deleteMany({ installationId: installation.id })
await GitAppOrganizationInstallation.deleteOne({ installationId: installation.id })
} else {
if (repositories) {
for (const repository of repositories) {
await GitRisks.deleteMany({ repositoryId: repository.id })
}
}
}
})
app.on("push", async (context) => {
const { payload } = context;
const { commits, repository, installation, pusher } = payload;
const [owner, repo] = repository.full_name.split("/");
if (!commits || !repository || !installation || !pusher) {
return
}
const installationLinkToOrgExists = await GitAppOrganizationInstallation.findOne({ installationId: installation?.id }).lean()
if (!installationLinkToOrgExists) {
return
}
const allFindingsByFingerprint: { [key: string]: SecretMatch; } = {}
for (const commit of commits) {
for (const filepath of [...commit.added, ...commit.modified]) {
try {
const fileContentsResponse = await context.octokit.repos.getContent({
owner,
repo,
path: filepath,
});
const data: any = fileContentsResponse.data;
const fileContent = Buffer.from(data.content, "base64").toString();
const findings = await scanContentAndGetFindings(`\n${fileContent}`) // extra line to count lines correctly
for (const finding of findings) {
const fingerPrintWithCommitId = `${commit.id}:${filepath}:${finding.RuleID}:${finding.StartLine}`
const fingerPrintWithoutCommitId = `${filepath}:${finding.RuleID}:${finding.StartLine}`
finding.Fingerprint = fingerPrintWithCommitId
finding.FingerPrintWithoutCommitId = fingerPrintWithoutCommitId
finding.Commit = commit.id
finding.File = filepath
finding.Author = commit.author.name
finding.Email = commit?.author?.email ? commit?.author?.email : ""
allFindingsByFingerprint[fingerPrintWithCommitId] = finding
}
} catch (error) {
console.error(`Error fetching content for ${filepath}`, error); // eslint-disable-line
}
}
}
// change to update
for (const key in allFindingsByFingerprint) {
const risk = await GitRisks.findOneAndUpdate({ fingerprint: allFindingsByFingerprint[key].Fingerprint },
{
...convertKeysToLowercase(allFindingsByFingerprint[key]),
installationId: installation.id,
organization: installationLinkToOrgExists.organizationId,
repositoryFullName: repository.full_name,
repositoryId: repository.id
}, {
upsert: true
}).lean()
}
// get emails of admins
const adminsOfWork = await MembershipOrg.find({
organization: installationLinkToOrgExists.organizationId,
$or: [
{ role: OWNER },
{ role: ADMIN }
]
}).lean()
const userEmails = await User.find({
_id: {
$in: adminsOfWork.map(orgMembership => orgMembership.user)
}
}).select("email").lean()
const adminOrOwnerEmails = userEmails.map(userObject => userObject.email)
const usersToNotify = pusher?.email ? [pusher.email, ...adminOrOwnerEmails] : [...adminOrOwnerEmails]
await sendMail({
template: "secretLeakIncident.handlebars",
subjectLine: `Incident alert: leaked secrets found in Github repository ${repository.full_name}`,
recipients: usersToNotify,
substitutions: {
numberOfSecrets: Object.keys(allFindingsByFingerprint).length,
pusher_email: pusher.email,
pusher_name: pusher.name
}
});
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient) {
postHogClient.capture({
event: "cloud secret scan",
distinctId: pusher.email,
properties: {
numberOfCommitsScanned: commits.length,
numberOfRisksFound: Object.keys(allFindingsByFingerprint).length,
}
});
}
});
};
async function scanContentAndGetFindings(textContent: string): Promise<SecretMatch[]> {
const tempFolder = await createTempFolder();
const filePath = join(tempFolder, "content.txt");
const findingsPath = join(tempFolder, "findings.json");
try {
await writeTextToFile(filePath, textContent);
await runInfisicalScan(filePath, findingsPath);
const findingsData = await readFindingsFile(findingsPath);
return JSON.parse(findingsData);
} finally {
await deleteTempFolder(tempFolder);
}
}
function createTempFolder(): Promise<string> {
return new Promise((resolve, reject) => {
const tempDir = tmpdir()
const tempFolderName = Math.random().toString(36).substring(2);
const tempFolderPath = join(tempDir, tempFolderName);
mkdir(tempFolderPath, (err: any) => {
if (err) {
reject(err);
} else {
resolve(tempFolderPath);
}
});
});
}
function writeTextToFile(filePath: string, content: string): Promise<void> {
return new Promise((resolve, reject) => {
writeFile(filePath, content, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
function runInfisicalScan(inputPath: string, outputPath: string): Promise<void> {
return new Promise((resolve, reject) => {
const command = `cat "${inputPath}" | infisical scan --exit-code=77 --pipe -r "${outputPath}"`;
exec(command, (error) => {
if (error && error.code != 77) {
reject(error);
} else {
resolve();
}
});
});
}
function readFindingsFile(filePath: string): Promise<string> {
return new Promise((resolve, reject) => {
readFile(filePath, "utf8", (err, data) => {
if (err) {
reject(err);
} else {
resolve(data);
}
});
});
}
function deleteTempFolder(folderPath: string): Promise<void> {
return new Promise((resolve, reject) => {
rm(folderPath, { recursive: true }, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
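As an aside, the three callback-based helpers above can also be expressed with Node's built-in `fs/promises` API. This is a hypothetical alternative, not the code the diff adds: `withTempFile` is a made-up name, and `mkdtemp()` additionally avoids the collision risk of the `Math.random()` folder name used by `createTempFolder()`.

```typescript
import { mkdtemp, readFile, rm, writeFile } from "fs/promises";
import { tmpdir } from "os";
import { join } from "path";

// Sketch: create a uniquely named temp folder, write the content to a file,
// run the callback against that file path, then always clean up.
async function withTempFile<T>(
  content: string,
  fn: (filePath: string) => Promise<T>
): Promise<T> {
  const dir = await mkdtemp(join(tmpdir(), "scan-")); // collision-resistant name
  const filePath = join(dir, "content.txt");
  try {
    await writeFile(filePath, content);
    return await fn(filePath);
  } finally {
    // remove the temp folder even if fn throws
    await rm(dir, { recursive: true, force: true });
  }
}
```

A caller could then wrap the scan step inside `withTempFile(textContent, ...)` instead of managing the folder lifecycle by hand.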
function convertKeysToLowercase<T>(obj: T): T {
const convertedObj = {} as T;
for (const key in obj) {
if (Object.prototype.hasOwnProperty.call(obj, key)) {
const lowercaseKey = key.charAt(0).toLowerCase() + key.slice(1);
convertedObj[lowercaseKey as keyof T] = obj[key];
}
}
return convertedObj;
}
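Note that `convertKeysToLowercase` only lowercases the first character of each key, which is what maps the scanner's PascalCase fields (`RuleID`, `StartLine`, ...) onto camelCase document fields. A standalone sketch (with a slightly loosened signature so it compiles in isolation):

```typescript
// Illustrative copy of the helper above, typed over plain records.
function convertKeysToLowercase(obj: Record<string, unknown>): Record<string, unknown> {
  const convertedObj: Record<string, unknown> = {};
  for (const key in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, key)) {
      // only the first character changes: "RuleID" -> "ruleID", "StartLine" -> "startLine"
      const lowercaseKey = key.charAt(0).toLowerCase() + key.slice(1);
      convertedObj[lowercaseKey] = obj[key];
    }
  }
  return convertedObj;
}

const finding = { RuleID: "generic-api-key", StartLine: 3, File: "config.ts" };
const doc = convertKeysToLowercase(finding);
// doc now has keys ruleID, startLine, file
```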


@ -0,0 +1,93 @@
import axios from "axios";
import crypto from "crypto";
import { Types } from "mongoose";
import picomatch from "picomatch";
import { client, getRootEncryptionKey } from "../config";
import Webhook, { IWebhook } from "../models/webhooks";
export const triggerWebhookRequest = async (
{ url, encryptedSecretKey, iv, tag }: IWebhook,
payload: Record<string, unknown>
) => {
const headers: Record<string, string> = {};
payload["timestamp"] = Date.now();
if (encryptedSecretKey) {
const rootEncryptionKey = await getRootEncryptionKey();
const secretKey = client.decryptSymmetric(encryptedSecretKey, rootEncryptionKey, iv, tag);
const webhookSign = crypto
.createHmac("sha256", secretKey)
.update(JSON.stringify(payload))
.digest("hex");
headers["x-infisical-signature"] = `t=${payload["timestamp"]};${webhookSign}`;
}
const req = await axios.post(url, payload, { headers });
return req;
};
export const getWebhookPayload = (
eventName: string,
workspaceId: string,
environment: string,
secretPath?: string
) => ({
event: eventName,
project: {
workspaceId,
environment,
secretPath
}
});
export const triggerWebhook = async (
workspaceId: string,
environment: string,
secretPath: string
) => {
const webhooks = await Webhook.find({ workspace: workspaceId, environment, isDisabled: false });
// TODO(akhilmhdh): implement retry policy later, for that a cron job based approach is needed
// for exponential backoff
const toBeTriggeredHooks = webhooks.filter(({ secretPath: hookSecretPath }) =>
picomatch.isMatch(secretPath, hookSecretPath, { strictSlashes: false })
);
const webhooksTriggered = await Promise.allSettled(
toBeTriggeredHooks.map((hook) =>
triggerWebhookRequest(
hook,
getWebhookPayload("secrets.modified", workspaceId, environment, secretPath)
)
)
);
const successWebhooks: Types.ObjectId[] = [];
const failedWebhooks: Array<{ id: Types.ObjectId; error: string }> = [];
webhooksTriggered.forEach((data, index) => {
if (data.status === "rejected") {
failedWebhooks.push({ id: toBeTriggeredHooks[index]._id, error: data.reason.message });
return;
}
successWebhooks.push(toBeTriggeredHooks[index]._id);
});
// dont remove the workspaceid and environment filter. its used to reduce the dataset before $in check
await Webhook.bulkWrite([
{
updateMany: {
filter: { workspace: workspaceId, environment, _id: { $in: successWebhooks } },
update: { lastStatus: "success", lastRunErrorMessage: null }
}
},
...failedWebhooks.map(({ id, error }) => ({
updateOne: {
filter: {
workspace: workspaceId,
environment,
_id: id
},
update: {
lastStatus: "failed",
lastRunErrorMessage: error
}
}
}))
]);
};
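On the receiving side, the `x-infisical-signature` header built in `triggerWebhookRequest` has the shape `t=<timestamp>;<hex HMAC-SHA256 of the JSON body>`. The sketch below shows how a consumer could verify it; only the header format comes from the code above, and `verifyWebhookSignature` is a hypothetical helper name.

```typescript
import * as crypto from "crypto";

// Verify an `x-infisical-signature` header of the form `t=<ts>;<hex hmac>`.
// `rawBody` must be the exact JSON string that was signed; since the payload
// already contains the same timestamp field, replaying with an altered body fails.
function verifyWebhookSignature(header: string, rawBody: string, secretKey: string): boolean {
  const [timestampPart, signature] = header.split(";");
  if (!timestampPart || !timestampPart.startsWith("t=") || !signature) return false;
  const expected = crypto.createHmac("sha256", secretKey).update(rawBody).digest("hex");
  const given = Buffer.from(signature, "hex");
  const want = Buffer.from(expected, "hex");
  // constant-time comparison to avoid leaking the expected signature via timing
  return given.length === want.length && crypto.timingSafeEqual(given, want);
}
```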


@ -6,6 +6,7 @@ import EventService from "./EventService";
import IntegrationService from "./IntegrationService";
import TokenService from "./TokenService";
import SecretService from "./SecretService";
import GithubSecretScanningService from "./GithubSecretScanningService"
export {
TelemetryService,
@ -15,4 +16,5 @@ export {
IntegrationService,
TokenService,
SecretService,
GithubSecretScanningService
}


@ -2,9 +2,10 @@ import nodemailer from "nodemailer";
import {
SMTP_HOST_GMAIL,
SMTP_HOST_MAILGUN,
SMTP_HOST_OFFICE365,
SMTP_HOST_SENDGRID,
SMTP_HOST_SOCKETLABS,
SMTP_HOST_ZOHOMAIL
} from "../variables";
import SMTPConnection from "nodemailer/lib/smtp-connection";
import * as Sentry from "@sentry/node";
@ -15,6 +16,7 @@ import {
getSmtpSecure,
getSmtpUsername,
} from "../config";
import { getLogger } from "../utils/logger";
export const initSmtp = async () => {
const mailOpts: SMTPConnection.Options = {
@ -58,6 +60,12 @@ export const initSmtp = async () => {
ciphers: "TLSv1.2",
}
break;
case SMTP_HOST_OFFICE365:
mailOpts.requireTLS = true;
mailOpts.tls = {
ciphers: "TLSv1.2"
}
break;
default:
if ((await getSmtpHost()).includes("amazonaws.com")) {
mailOpts.tls = {
@ -73,10 +81,12 @@ export const initSmtp = async () => {
const transporter = nodemailer.createTransport(mailOpts);
transporter
.verify()
.then(async () => {
Sentry.setUser(null);
Sentry.captureMessage("SMTP - Successfully connected");
(await getLogger("backend-main")).info("SMTP - Successfully connected");
})
.catch(async (err) => {
Sentry.setUser(null);


@ -0,0 +1,25 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="x-ua-compatible" content="ie=edge">
<title>Incident alert: secret leaked</title>
</head>
<body>
<h3>Infisical has uncovered {{numberOfSecrets}} secret(s) from your recent push</h3>
<p><a href="https://app.infisical.com/secret-scanning"><strong>View leaked secrets</strong></a></p>
<p>You are receiving this notification because one or more secret leaks have been detected in a recent commit pushed
by {{pusher_name}} ({{pusher_email}}). If
these are test secrets, please add `infisical-scan:ignore` as a comment at the end of the line containing the secret,
using the comment syntax of the given programming language. This will prevent future notifications from being sent out for those secret(s).</p>
<p>If these are production secrets, please rotate them immediately.</p>
<p>Once you have taken action, be sure to update the status of the risk in your <a
href="https://app.infisical.com/">Infisical
dashboard</a>.</p>
</body>
</html>


@ -27,7 +27,7 @@ export const UnauthorizedRequestError = (error?: Partial<RequestErrorContext>) =
context: error?.context,
stack: error?.stack,
});
export const ForbiddenRequestError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
statusCode: error?.statusCode ?? 403,
@ -46,6 +46,15 @@ export const BadRequestError = (error?: Partial<RequestErrorContext>) => new Req
stack: error?.stack,
});
export const ResourceNotFound = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
statusCode: error?.statusCode ?? 404,
type: error?.type ?? "resource_not_found",
message: error?.message ?? "The requested resource was not found",
context: error?.context,
stack: error?.stack,
});
export const InternalServerError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.ERROR,
statusCode: error?.statusCode ?? 500,
@ -229,6 +238,6 @@ export const BotNotFoundError = (error?: Partial<RequestErrorContext>) => new Re
message: error?.message ?? "The requested bot was not found",
context: error?.context,
stack: error?.stack,
})
//* ----->[MISC ERRORS]<-----


@ -13,14 +13,14 @@ import {
Secret,
SecretBlindIndexData,
ServiceTokenData,
Workspace
} from "../../models";
import { generateKeyPair } from "../../utils/crypto";
import { client, getEncryptionKey, getRootEncryptionKey } from "../../config";
import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8
} from "../../variables";
import { InternalServerError } from "../errors";
@ -29,10 +29,7 @@ import { InternalServerError } from "../errors";
* corresponding secret versions
*/
export const backfillSecretVersions = async () => {
await Secret.updateMany({ version: { $exists: false } }, { $set: { version: 1 } });
const unversionedSecrets: ISecret[] = await Secret.aggregate([
{
@ -40,14 +37,14 @@ export const backfillSecretVersions = async () => {
from: "secretversions",
localField: "_id",
foreignField: "secret",
as: "versions"
}
},
{
$match: {
versions: { $size: 0 }
}
}
]);
if (unversionedSecrets.length > 0) {
@ -62,9 +59,9 @@ export const backfillSecretVersions = async () => {
workspace: s.workspace,
environment: s.environment,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8
})
)
});
}
console.log("Migration: Secret version migration v1 complete");
@ -80,8 +77,8 @@ export const backfillBots = async () => {
const workspaceIdsWithBot = await Bot.distinct("workspace");
const workspaceIdsToAddBot = await Workspace.distinct("_id", {
_id: {
$nin: workspaceIdsWithBot
}
});
if (workspaceIdsToAddBot.length === 0) return;
@ -94,7 +91,7 @@ export const backfillBots = async () => {
const {
ciphertext: encryptedPrivateKey,
iv,
tag
} = client.encryptSymmetric(privateKey, rootEncryptionKey);
return new Bot({
@ -106,16 +103,16 @@ export const backfillBots = async () => {
iv,
tag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_BASE64
});
} else if (encryptionKey) {
const {
ciphertext: encryptedPrivateKey,
iv,
tag
} = encryptSymmetric128BitHexKeyUTF8({
plaintext: privateKey,
key: encryptionKey
});
return new Bot({
@ -127,13 +124,12 @@ export const backfillBots = async () => {
iv,
tag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8
});
}
throw InternalServerError({
message: "Failed to backfill workspace bots due to missing encryption key"
});
})
);
@ -149,13 +145,11 @@ export const backfillSecretBlindIndexData = async () => {
const encryptionKey = await getEncryptionKey();
const rootEncryptionKey = await getRootEncryptionKey();
const workspaceIdsBlindIndexed = await SecretBlindIndexData.distinct("workspace");
const workspaceIdsToBlindIndex = await Workspace.distinct("_id", {
_id: {
$nin: workspaceIdsBlindIndexed
}
});
if (workspaceIdsToBlindIndex.length === 0) return;
@ -168,7 +162,7 @@ export const backfillSecretBlindIndexData = async () => {
const {
ciphertext: encryptedSaltCiphertext,
iv: saltIV,
tag: saltTag
} = client.encryptSymmetric(salt, rootEncryptionKey);
return new SecretBlindIndexData({
@ -177,16 +171,16 @@ export const backfillSecretBlindIndexData = async () => {
saltIV,
saltTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_BASE64
});
} else if (encryptionKey) {
const {
ciphertext: encryptedSaltCiphertext,
iv: saltIV,
tag: saltTag
} = encryptSymmetric128BitHexKeyUTF8({
plaintext: salt,
key: encryptionKey
});
return new SecretBlindIndexData({
@ -195,13 +189,12 @@ export const backfillSecretBlindIndexData = async () => {
saltIV,
saltTag,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8
});
}
throw InternalServerError({
message: "Failed to backfill secret blind index data due to missing encryption key"
});
})
);
@ -219,17 +212,17 @@ export const backfillEncryptionMetadata = async () => {
await Secret.updateMany(
{
algorithm: {
$exists: false,
$exists: false
},
keyEncoding: {
$exists: false,
},
$exists: false
}
},
{
$set: {
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
},
keyEncoding: ENCODING_SCHEME_UTF8
}
}
);
@ -237,17 +230,17 @@ export const backfillEncryptionMetadata = async () => {
await SecretVersion.updateMany(
{
algorithm: {
$exists: false,
$exists: false
},
keyEncoding: {
$exists: false,
},
$exists: false
}
},
{
$set: {
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
},
keyEncoding: ENCODING_SCHEME_UTF8
}
}
);
@ -255,17 +248,17 @@ export const backfillEncryptionMetadata = async () => {
await SecretBlindIndexData.updateMany(
{
algorithm: {
$exists: false,
$exists: false
},
keyEncoding: {
$exists: false,
},
$exists: false
}
},
{
$set: {
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
},
keyEncoding: ENCODING_SCHEME_UTF8
}
}
);
@ -273,17 +266,17 @@ export const backfillEncryptionMetadata = async () => {
await Bot.updateMany(
{
algorithm: {
$exists: false,
$exists: false
},
keyEncoding: {
$exists: false,
},
$exists: false
}
},
{
$set: {
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
},
keyEncoding: ENCODING_SCHEME_UTF8
}
}
);
@ -291,17 +284,17 @@ export const backfillEncryptionMetadata = async () => {
await BackupPrivateKey.updateMany(
{
algorithm: {
$exists: false,
$exists: false
},
keyEncoding: {
$exists: false,
},
$exists: false
}
},
{
$set: {
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
},
keyEncoding: ENCODING_SCHEME_UTF8
}
}
);
@ -309,17 +302,17 @@ export const backfillEncryptionMetadata = async () => {
await IntegrationAuth.updateMany(
{
algorithm: {
$exists: false,
$exists: false
},
keyEncoding: {
$exists: false,
},
$exists: false
}
},
{
$set: {
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8,
},
keyEncoding: ENCODING_SCHEME_UTF8
}
}
);
};
@ -328,26 +321,26 @@ export const backfillSecretFolders = async () => {
await Secret.updateMany(
{
folder: {
$exists: false,
},
$exists: false
}
},
{
$set: {
folder: "root",
},
folder: "root"
}
}
);
await SecretVersion.updateMany(
{
folder: {
$exists: false,
},
$exists: false
}
},
{
$set: {
folder: "root",
},
folder: "root"
}
}
);
@ -355,20 +348,20 @@ export const backfillSecretFolders = async () => {
await SecretVersion.updateMany(
{
tags: {
$exists: false,
},
$exists: false
}
},
{
$set: {
tags: [],
},
tags: []
}
}
);
let secretSnapshots = await SecretSnapshot.find({
environment: {
$exists: false,
},
$exists: false
}
})
.populate<{ secretVersions: ISecretVersion[] }>("secretVersions")
.limit(50);
@ -377,8 +370,7 @@ export const backfillSecretFolders = async () => {
for (const secSnapshot of secretSnapshots) {
const groupSnapByEnv: Record<string, Array<ISecretVersion>> = {};
secSnapshot.secretVersions.forEach((secVer) => {
if (!groupSnapByEnv?.[secVer.environment])
groupSnapByEnv[secVer.environment] = [];
if (!groupSnapByEnv?.[secVer.environment]) groupSnapByEnv[secVer.environment] = [];
groupSnapByEnv[secVer.environment].push(secVer);
});
@ -390,7 +382,7 @@ export const backfillSecretFolders = async () => {
...secSnapshot.toObject({ virtuals: false }),
_id: new Types.ObjectId(),
environment: snapEnv,
secretVersions: secretIdsOfEnvGroup,
secretVersions: secretIdsOfEnvGroup
};
});
@ -400,8 +392,8 @@ export const backfillSecretFolders = async () => {
secretSnapshots = await SecretSnapshot.find({
environment: {
$exists: false,
},
$exists: false
}
})
.populate<{ secretVersions: ISecretVersion[] }>("secretVersions")
.limit(50);
@ -414,13 +406,13 @@ export const backfillServiceToken = async () => {
await ServiceTokenData.updateMany(
{
secretPath: {
$exists: false,
},
$exists: false
}
},
{
$set: {
secretPath: "/",
},
secretPath: "/"
}
}
);
console.log("Migration: Service token migration v1 complete");
@ -430,14 +422,33 @@ export const backfillIntegration = async () => {
await Integration.updateMany(
{
secretPath: {
$exists: false,
},
$exists: false
}
},
{
$set: {
secretPath: "/",
},
secretPath: "/"
}
}
);
console.log("Migration: Integration migration v1 complete");
};
export const backfillServiceTokenMultiScope = async () => {
await ServiceTokenData.updateMany(
{
scopes: {
$exists: false
}
},
[
{
$set: {
scopes: [{ environment: "$environment", secretPath: "$secretPath" }]
}
}
]
);
console.log("Migration: Service token migration v2 complete");
};

@ -14,17 +14,15 @@ import {
backfillSecretFolders,
backfillSecretVersions,
backfillServiceToken,
backfillServiceTokenMultiScope
} from "./backfillData";
import {
reencryptBotPrivateKeys,
reencryptSecretBlindIndexDataSalts,
} from "./reencryptData";
import { reencryptBotPrivateKeys, reencryptSecretBlindIndexDataSalts } from "./reencryptData";
import {
getClientIdGoogle,
getClientSecretGoogle,
getMongoURL,
getNodeEnv,
getSentryDSN,
getSentryDSN
} from "../../config";
import { initializePassport } from "../auth";
@ -79,6 +77,7 @@ export const setup = async () => {
await backfillSecretFolders();
await backfillServiceToken();
await backfillIntegration();
await backfillServiceTokenMultiScope();
// re-encrypt any data previously encrypted under server hex 128-bit ENCRYPTION_KEY
// to base64 256-bit ROOT_ENCRYPTION_KEY
@ -90,7 +89,7 @@ export const setup = async () => {
dsn: await getSentryDSN(),
tracesSampleRate: 1.0,
debug: (await getNodeEnv()) === "production" ? false : true,
environment: await getNodeEnv(),
environment: await getNodeEnv()
});
await createTestUserForDevelopment();

@ -1,22 +1,19 @@
import { Types } from "mongoose";
import {
ISecret,
IServiceAccount,
IServiceTokenData,
IUser,
ServiceAccount,
ServiceTokenData,
User,
ISecret,
IServiceAccount,
IServiceTokenData,
IUser,
ServiceAccount,
ServiceTokenData,
User
} from "../models";
import {
ServiceTokenDataNotFoundError,
UnauthorizedRequestError,
} from "../utils/errors";
import { ServiceTokenDataNotFoundError, UnauthorizedRequestError } from "../utils/errors";
import {
AUTH_MODE_API_KEY,
AUTH_MODE_JWT,
AUTH_MODE_SERVICE_ACCOUNT,
AUTH_MODE_SERVICE_TOKEN,
AUTH_MODE_API_KEY,
AUTH_MODE_JWT,
AUTH_MODE_SERVICE_ACCOUNT,
AUTH_MODE_SERVICE_TOKEN
} from "../variables";
import { validateUserClientForWorkspace } from "./user";
import { validateServiceAccountClientForWorkspace } from "./serviceAccount";
@ -30,65 +27,71 @@ import { validateServiceAccountClientForWorkspace } from "./serviceAccount";
* @param {Array<'admin' | 'member'>} obj.acceptedRoles - accepted workspace roles
*/
export const validateClientForServiceTokenData = async ({
authData,
serviceTokenDataId,
acceptedRoles,
authData,
serviceTokenDataId,
acceptedRoles
}: {
authData: {
authMode: string;
authPayload: IUser | IServiceAccount | IServiceTokenData;
};
serviceTokenDataId: Types.ObjectId;
acceptedRoles: Array<"admin" | "member">;
authData: {
authMode: string;
authPayload: IUser | IServiceAccount | IServiceTokenData;
};
serviceTokenDataId: Types.ObjectId;
acceptedRoles: Array<"admin" | "member">;
}) => {
const serviceTokenData = await ServiceTokenData
.findById(serviceTokenDataId)
.select("+encryptedKey +iv +tag")
.populate<{ user: IUser }>("user");
const serviceTokenData = await ServiceTokenData.findById(serviceTokenDataId)
.select("+encryptedKey +iv +tag")
.populate<{ user: IUser }>("user");
if (!serviceTokenData) throw ServiceTokenDataNotFoundError({
message: "Failed to find service token data",
if (!serviceTokenData)
throw ServiceTokenDataNotFoundError({
message: "Failed to find service token data"
});
if (authData.authMode === AUTH_MODE_JWT && authData.authPayload instanceof User) {
await validateUserClientForWorkspace({
user: authData.authPayload,
workspaceId: serviceTokenData.workspace,
acceptedRoles,
});
return serviceTokenData;
}
if (authData.authMode === AUTH_MODE_JWT && authData.authPayload instanceof User) {
await validateUserClientForWorkspace({
user: authData.authPayload,
workspaceId: serviceTokenData.workspace,
acceptedRoles
});
if (authData.authMode === AUTH_MODE_SERVICE_ACCOUNT && authData.authPayload instanceof ServiceAccount) {
await validateServiceAccountClientForWorkspace({
serviceAccount: authData.authPayload,
workspaceId: serviceTokenData.workspace,
});
return serviceTokenData;
}
return serviceTokenData;
}
if (authData.authMode === AUTH_MODE_SERVICE_TOKEN && authData.authPayload instanceof ServiceTokenData) {
throw UnauthorizedRequestError({
message: "Failed service token authorization for service token data",
});
}
if (
authData.authMode === AUTH_MODE_SERVICE_ACCOUNT &&
authData.authPayload instanceof ServiceAccount
) {
await validateServiceAccountClientForWorkspace({
serviceAccount: authData.authPayload,
workspaceId: serviceTokenData.workspace
});
if (authData.authMode === AUTH_MODE_API_KEY && authData.authPayload instanceof User) {
await validateUserClientForWorkspace({
user: authData.authPayload,
workspaceId: serviceTokenData.workspace,
acceptedRoles,
});
return serviceTokenData;
}
return serviceTokenData;
}
if (
authData.authMode === AUTH_MODE_SERVICE_TOKEN &&
authData.authPayload instanceof ServiceTokenData
) {
throw UnauthorizedRequestError({
message: "Failed client authorization for service token data",
message: "Failed service token authorization for service token data"
});
}
}
if (authData.authMode === AUTH_MODE_API_KEY && authData.authPayload instanceof User) {
await validateUserClientForWorkspace({
user: authData.authPayload,
workspaceId: serviceTokenData.workspace,
acceptedRoles
});
return serviceTokenData;
}
throw UnauthorizedRequestError({
message: "Failed client authorization for service token data"
});
};
/**
* Validate that service token (client) can access workspace
@ -101,42 +104,42 @@ export const validateClientForServiceTokenData = async ({
* @param {String[]} requiredPermissions - required permissions as part of the endpoint
*/
export const validateServiceTokenDataClientForWorkspace = async ({
serviceTokenData,
workspaceId,
environment,
requiredPermissions,
serviceTokenData,
workspaceId,
environment,
requiredPermissions
}: {
serviceTokenData: IServiceTokenData;
workspaceId: Types.ObjectId;
environment?: string;
requiredPermissions?: string[];
serviceTokenData: IServiceTokenData;
workspaceId: Types.ObjectId;
environment?: string;
requiredPermissions?: string[];
}) => {
if (!serviceTokenData.workspace.equals(workspaceId)) {
// case: invalid workspaceId passed
if (!serviceTokenData.workspace.equals(workspaceId)) {
// case: invalid workspaceId passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace"
});
}
if (environment) {
// case: environment is specified
if (!serviceTokenData.scopes.find(({ environment: tkEnv }) => tkEnv === environment)) {
// case: invalid environment passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace environment"
});
}
requiredPermissions?.forEach((permission) => {
if (!serviceTokenData.permissions.includes(permission)) {
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace",
message: `Failed service token authorization for the given workspace environment action: ${permission}`
});
}
if (environment) {
// case: environment is specified
if (serviceTokenData.environment !== environment) {
// case: invalid environment passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace environment",
});
}
requiredPermissions?.forEach((permission) => {
if (!serviceTokenData.permissions.includes(permission)) {
throw UnauthorizedRequestError({
message: `Failed service token authorization for the given workspace environment action: ${permission}`,
});
}
});
}
}
}
});
}
};
/**
* Validate that service token (client) can access secrets
@ -147,36 +150,35 @@ export const validateServiceTokenDataClientForWorkspace = async ({
* @param {string[]} requiredPermissions - required permissions as part of the endpoint
*/
export const validateServiceTokenDataClientForSecrets = async ({
serviceTokenData,
secrets,
requiredPermissions,
serviceTokenData,
secrets,
requiredPermissions
}: {
serviceTokenData: IServiceTokenData;
secrets: ISecret[];
requiredPermissions?: string[];
serviceTokenData: IServiceTokenData;
secrets: ISecret[];
requiredPermissions?: string[];
}) => {
secrets.forEach((secret: ISecret) => {
if (!serviceTokenData.workspace.equals(secret.workspace)) {
// case: invalid workspaceId passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace"
});
}
secrets.forEach((secret: ISecret) => {
if (!serviceTokenData.workspace.equals(secret.workspace)) {
// case: invalid workspaceId passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace",
});
}
if (serviceTokenData.environment !== secret.environment) {
// case: invalid environment passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace environment",
});
}
requiredPermissions?.forEach((permission) => {
if (!serviceTokenData.permissions.includes(permission)) {
throw UnauthorizedRequestError({
message: `Failed service token authorization for the given workspace environment action: ${permission}`,
});
}
if (!serviceTokenData.scopes.find(({ environment: tkEnv }) => tkEnv === secret.environment)) {
// case: invalid environment passed
throw UnauthorizedRequestError({
message: "Failed service token authorization for the given workspace environment"
});
}
requiredPermissions?.forEach((permission) => {
if (!serviceTokenData.permissions.includes(permission)) {
throw UnauthorizedRequestError({
message: `Failed service token authorization for the given workspace environment action: ${permission}`
});
}
});
}
});
};

@ -1,2 +1,3 @@
export const EVENT_PUSH_SECRETS = "pushSecrets";
export const EVENT_PULL_SECRETS = "pullSecrets";
export const EVENT_PULL_SECRETS = "pullSecrets";
export const EVENT_START_INTEGRATION = "startIntegration";

@ -19,12 +19,13 @@ export const INTEGRATION_GITLAB = "gitlab";
export const INTEGRATION_RENDER = "render";
export const INTEGRATION_RAILWAY = "railway";
export const INTEGRATION_FLYIO = "flyio";
export const INTEGRATION_LARAVELFORGE = "laravel-forge";
export const INTEGRATION_CIRCLECI = "circleci";
export const INTEGRATION_TRAVISCI = "travisci";
export const INTEGRATION_SUPABASE = 'supabase';
export const INTEGRATION_CHECKLY = 'checkly';
export const INTEGRATION_HASHICORP_VAULT = 'hashicorp-vault';
export const INTEGRATION_CLOUDFLARE_PAGES = 'cloudflare-pages';
export const INTEGRATION_SUPABASE = "supabase";
export const INTEGRATION_CHECKLY = "checkly";
export const INTEGRATION_HASHICORP_VAULT = "hashicorp-vault";
export const INTEGRATION_CLOUDFLARE_PAGES = "cloudflare-pages";
export const INTEGRATION_SET = new Set([
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
@ -35,6 +36,7 @@ export const INTEGRATION_SET = new Set([
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
@ -65,9 +67,10 @@ export const INTEGRATION_RAILWAY_API_URL = "https://backboard.railway.app/graphq
export const INTEGRATION_FLYIO_API_URL = "https://api.fly.io/graphql";
export const INTEGRATION_CIRCLECI_API_URL = "https://circleci.com/api";
export const INTEGRATION_TRAVISCI_API_URL = "https://api.travis-ci.com";
export const INTEGRATION_SUPABASE_API_URL = 'https://api.supabase.com';
export const INTEGRATION_CHECKLY_API_URL = 'https://api.checklyhq.com';
export const INTEGRATION_CLOUDFLARE_PAGES_API_URL = 'https://api.cloudflare.com';
export const INTEGRATION_SUPABASE_API_URL = "https://api.supabase.com";
export const INTEGRATION_LARAVELFORGE_API_URL = "https://forge.laravel.com";
export const INTEGRATION_CHECKLY_API_URL = "https://api.checklyhq.com";
export const INTEGRATION_CLOUDFLARE_PAGES_API_URL = "https://api.cloudflare.com";
export const getIntegrationOptions = async () => {
const INTEGRATION_OPTIONS = [
@ -144,6 +147,15 @@ export const getIntegrationOptions = async () => {
clientId: "",
docsLink: "",
},
{
name: "Laravel Forge",
slug: "laravel-forge",
image: "Laravel Forge.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "AWS Secret Manager",
slug: "aws-secret-manager",
@ -221,20 +233,20 @@ export const getIntegrationOptions = async () => {
slug: "gcp",
image: "Google Cloud Platform.png",
isAvailable: false,
type: '',
clientId: '',
docsLink: ''
type: "",
clientId: "",
docsLink: ""
},
{
name: 'Cloudflare Pages',
slug: 'cloudflare-pages',
image: 'Cloudflare.png',
name: "Cloudflare Pages",
slug: "cloudflare-pages",
image: "Cloudflare.png",
isAvailable: true,
type: 'pat',
clientId: '',
docsLink: ''
type: "pat",
clientId: "",
docsLink: ""
}
]
return INTEGRATION_OPTIONS;
}
}

@ -3,3 +3,4 @@ export const SMTP_HOST_MAILGUN = "smtp.mailgun.org";
export const SMTP_HOST_SOCKETLABS = "smtp.socketlabs.com";
export const SMTP_HOST_ZOHOMAIL = "smtp.zoho.com";
export const SMTP_HOST_GMAIL = "smtp.gmail.com";
export const SMTP_HOST_OFFICE365 = "smtp.office365.com";

@ -246,7 +246,11 @@ func CallGetSecretsV3(httpClient *resty.Client, request GetEncryptedSecretsV3Req
}
if response.IsError() {
return GetEncryptedSecretsV3Response{}, fmt.Errorf("CallGetSecretsV3: Unsuccessful response. Please make sure your secret path, workspace and environment name are all correct [response=%s]", response)
if response.StatusCode() == 401 {
return GetEncryptedSecretsV3Response{}, fmt.Errorf("CallGetSecretsV3: Request to access secrets with [environment=%v] [path=%v] [workspaceId=%v] is denied. Please check if your authentication method has access to the requested scope", request.Environment, request.SecretPath, request.WorkspaceId)
} else {
return GetEncryptedSecretsV3Response{}, fmt.Errorf("CallGetSecretsV3: Unsuccessful response. Please make sure your secret path, workspace and environment name are all correct [response=%v]", response.RawResponse)
}
}
return secretsResponse, nil

@ -181,14 +181,16 @@ type GetServiceTokenDetailsResponse struct {
ID string `json:"_id"`
Name string `json:"name"`
Workspace string `json:"workspace"`
Environment string `json:"environment"`
ExpiresAt time.Time `json:"expiresAt"`
EncryptedKey string `json:"encryptedKey"`
Iv string `json:"iv"`
Tag string `json:"tag"`
CreatedAt time.Time `json:"createdAt"`
UpdatedAt time.Time `json:"updatedAt"`
V int `json:"__v"`
Scopes []struct {
Environment string `json:"environment"`
SecretPath string `json:"secretPath"`
} `json:"scopes"`
}
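With the multi-scope change above, the service token details response replaces the single `Environment` field with a `scopes` array; a hypothetical payload shape (all values illustrative, not from the source):

```json
{
  "_id": "64b8...",
  "name": "ci-token",
  "workspace": "64b9...",
  "scopes": [
    { "environment": "dev", "secretPath": "/" },
    { "environment": "prod", "secretPath": "/backend" }
  ]
}
```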
type GetAccessibleEnvironmentsRequest struct {

@ -70,7 +70,12 @@ var exportCmd = &cobra.Command{
util.HandleError(err, "Unable to parse flag")
}
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs, WorkspaceId: projectId})
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs, WorkspaceId: projectId, SecretsPath: secretsPath})
if err != nil {
util.HandleError(err, "Unable to fetch secrets")
}
@ -83,7 +88,7 @@ var exportCmd = &cobra.Command{
var output string
if shouldExpandSecrets {
substitutions := util.SubstituteSecrets(secrets)
substitutions := util.ExpandSecrets(secrets, infisicalToken)
output, err = formatEnvs(substitutions, format)
if err != nil {
util.HandleError(err)
@ -110,6 +115,7 @@ func init() {
exportCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")
exportCmd.Flags().StringP("tags", "t", "", "filter secrets by tag slugs")
exportCmd.Flags().String("projectId", "", "manually set the projectId to fetch secrets from")
exportCmd.Flags().String("path", "/", "get secrets within a folder path")
}
// Format according to the format flag

@ -73,7 +73,6 @@ var loginCmd = &cobra.Command{
return
}
}
//override domain
domainQuery := true
if config.INFISICAL_URL_MANUAL_OVERRIDE != "" && config.INFISICAL_URL_MANUAL_OVERRIDE != util.INFISICAL_DEFAULT_API_URL {
@ -322,6 +321,8 @@ func DomainOverridePrompt() (bool, error) {
)
options := []string{PRESET, OVERRIDE}
//trim the '/' from the end of the domain url
config.INFISICAL_URL_MANUAL_OVERRIDE = strings.TrimRight(config.INFISICAL_URL_MANUAL_OVERRIDE, "/")
optionsPrompt := promptui.Select{
Label: fmt.Sprintf("Current INFISICAL_API_URL Domain Override: %s", config.INFISICAL_URL_MANUAL_OVERRIDE),
Items: options,
@ -380,7 +381,8 @@ func askForDomain() error {
if err != nil {
return err
}
// trim the '/' from the end of the self-hosting url
domain = strings.TrimRight(domain, "/")
//set api and login url
config.INFISICAL_URL = fmt.Sprintf("%s/api", domain)
config.INFISICAL_LOGIN_URL = fmt.Sprintf("%s/login", domain)

@ -100,7 +100,7 @@ var runCmd = &cobra.Command{
}
if shouldExpandSecrets {
secrets = util.SubstituteSecrets(secrets)
secrets = util.ExpandSecrets(secrets, infisicalToken)
}
secretsByKey := getSecretsByKeys(secrets)

@ -65,7 +65,7 @@ var secretsCmd = &cobra.Command{
}
if shouldExpandSecrets {
secrets = util.SubstituteSecrets(secrets)
secrets = util.ExpandSecrets(secrets, infisicalToken)
}
visualize.PrintAllSecretDetails(secrets)

@ -48,7 +48,7 @@ var vaultSetCmd = &cobra.Command{
return
}
fmt.Printf("\nSuccessfully, switched vault backend from [%s] to [%s]. Please login in again to store your login details in the new vault with [infisical login]", currentVaultBackend, wantedVaultTypeName)
fmt.Printf("\nSuccessfully switched vault backend from [%s] to [%s]. Please log in again to store your login details in the new vault with [infisical login]\n", currentVaultBackend, wantedVaultTypeName)
Telemetry.CaptureEvent("cli-command:vault set", posthog.NewProperties().Set("currentVault", currentVaultBackend).Set("wantedVault", wantedVaultTypeName).Set("version", util.CLI_VERSION))
} else {
@ -81,7 +81,7 @@ func printAvailableVaultBackends() {
Telemetry.CaptureEvent("cli-command:vault", posthog.NewProperties().Set("currentVault", currentVaultBackend).Set("version", util.CLI_VERSION))
fmt.Printf("\n\nYou are currently using [%s] vault to store your login credentials", string(currentVaultBackend))
fmt.Printf("\n\nYou are currently using [%s] vault to store your login credentials\n", string(currentVaultBackend))
}
// Checks if the vault that the user wants to switch to is a valid available vault

@ -45,5 +45,5 @@ func PrintErrorMessageAndExit(messages ...string) {
}
func printError(e error) {
color.New(color.FgRed).Fprintf(os.Stderr, "Hmm, we ran into an error: %v", e)
color.New(color.FgRed).Fprintf(os.Stderr, "Hmm, we ran into an error: %v\n", e)
}

@ -6,6 +6,7 @@ import (
"errors"
"fmt"
"os"
"path"
"regexp"
"strings"
@ -16,7 +17,7 @@ import (
"github.com/rs/zerolog/log"
)
func GetPlainTextSecretsViaServiceToken(fullServiceToken string) ([]models.SingleEnvironmentVariable, api.GetServiceTokenDetailsResponse, error) {
func GetPlainTextSecretsViaServiceToken(fullServiceToken string, environment string, secretPath string) ([]models.SingleEnvironmentVariable, api.GetServiceTokenDetailsResponse, error) {
serviceTokenParts := strings.SplitN(fullServiceToken, ".", 4)
if len(serviceTokenParts) < 4 {
return nil, api.GetServiceTokenDetailsResponse{}, fmt.Errorf("invalid service token entered. Please double check your service token and try again")
@ -34,9 +35,19 @@ func GetPlainTextSecretsViaServiceToken(fullServiceToken string) ([]models.Singl
return nil, api.GetServiceTokenDetailsResponse{}, fmt.Errorf("unable to get service token details. [err=%v]", err)
}
// if the token has multiple scopes, the user must specify which environment and secret path to use
if environment == "" {
if len(serviceTokenDetails.Scopes) != 1 {
return nil, api.GetServiceTokenDetailsResponse{}, fmt.Errorf("you must provide --env because this service token is scoped to multiple environments")
} else {
environment = serviceTokenDetails.Scopes[0].Environment
}
}
encryptedSecrets, err := api.CallGetSecretsV3(httpClient, api.GetEncryptedSecretsV3Request{
WorkspaceId: serviceTokenDetails.Workspace,
Environment: serviceTokenDetails.Environment,
Environment: environment,
SecretPath: secretPath,
})
if err != nil {
@ -188,11 +199,7 @@ func GetAllEnvironmentVariables(params models.GetAllSecretsParameters) ([]models
} else {
log.Debug().Msg("Trying to fetch secrets using service token")
secretsToReturn, _, errorToReturn = GetPlainTextSecretsViaServiceToken(infisicalToken)
// if serviceTokenDetails.Environment != params.Environment {
// PrintErrorMessageAndExit(fmt.Sprintf("Fetch secrets failed: token allows [%s] environment access, not [%s]. Service tokens are environment-specific; no need for --env flag.", params.Environment, serviceTokenDetails.Environment))
// }
secretsToReturn, _, errorToReturn = GetPlainTextSecretsViaServiceToken(infisicalToken, params.Environment, params.SecretsPath)
}
return secretsToReturn, errorToReturn
@ -278,22 +285,103 @@ func getExpandedEnvVariable(secrets []models.SingleEnvironmentVariable, variable
return "${" + variableWeAreLookingFor + "}"
}
func SubstituteSecrets(secrets []models.SingleEnvironmentVariable) []models.SingleEnvironmentVariable {
hashMapOfCompleteVariables := make(map[string]string)
hashMapOfSelfRefs := make(map[string]string)
expandedSecrets := []models.SingleEnvironmentVariable{}
for _, secret := range secrets {
expandedVariable := getExpandedEnvVariable(secrets, secret.Key, hashMapOfCompleteVariables, hashMapOfSelfRefs)
expandedSecrets = append(expandedSecrets, models.SingleEnvironmentVariable{
Key: secret.Key,
Value: expandedVariable,
Type: secret.Type,
})
var secRefRegex = regexp.MustCompile(`\${([^\}]*)}`)
func recursivelyExpandSecret(expandedSecs map[string]string, interpolatedSecs map[string]string, crossSecRefFetch func(env string, path []string, key string) string, key string) string {
if v, ok := expandedSecs[key]; ok {
return v
}
return expandedSecrets
interpolatedVal, ok := interpolatedSecs[key]
if !ok {
HandleError(fmt.Errorf("Could not find referred secret - %s", key), "Kindly check whether it is provided")
}
refs := secRefRegex.FindAllStringSubmatch(interpolatedVal, -1)
for _, val := range refs {
// key: "${something}" val: [${something},something]
interpolatedExp, interpolationKey := val[0], val[1]
ref := strings.Split(interpolationKey, ".")
// ${KEY1} => [key1]
if len(ref) == 1 {
val := recursivelyExpandSecret(expandedSecs, interpolatedSecs, crossSecRefFetch, interpolationKey)
interpolatedVal = strings.ReplaceAll(interpolatedVal, interpolatedExp, val)
continue
}
// cross board reference ${env.folder.key1} => [env folder key1]
if len(ref) > 1 {
secEnv, tmpSecPath, secKey := ref[0], ref[1:len(ref)-1], ref[len(ref)-1]
interpolatedSecs[interpolationKey] = crossSecRefFetch(secEnv, tmpSecPath, secKey) // get the reference value
val := recursivelyExpandSecret(expandedSecs, interpolatedSecs, crossSecRefFetch, interpolationKey)
interpolatedVal = strings.ReplaceAll(interpolatedVal, interpolatedExp, val)
}
}
expandedSecs[key] = interpolatedVal
return interpolatedVal
}
func getSecretsByKeys(secrets []models.SingleEnvironmentVariable) map[string]models.SingleEnvironmentVariable {
secretMapByName := make(map[string]models.SingleEnvironmentVariable, len(secrets))
for _, secret := range secrets {
secretMapByName[secret.Key] = secret
}
return secretMapByName
}
func ExpandSecrets(secrets []models.SingleEnvironmentVariable, infisicalToken string) []models.SingleEnvironmentVariable {
expandedSecs := make(map[string]string)
interpolatedSecs := make(map[string]string)
// map[env.secret-path][keyname]Secret
crossEnvRefSecs := make(map[string]map[string]models.SingleEnvironmentVariable) // a cache to hold all cross board reference secrets
for _, sec := range secrets {
// get all references in a secret
refs := secRefRegex.FindAllStringSubmatch(sec.Value, -1)
// nil means it's a secret without a reference
if refs == nil {
expandedSecs[sec.Key] = sec.Value // atomic secrets without any interpolation
} else {
interpolatedSecs[sec.Key] = sec.Value
}
}
for i, sec := range secrets {
// already present pick that up
if expandedVal, ok := expandedSecs[sec.Key]; ok {
secrets[i].Value = expandedVal
continue
}
expandedVal := recursivelyExpandSecret(expandedSecs, interpolatedSecs, func(env string, secPaths []string, secKey string) string {
secPaths = append([]string{"/"}, secPaths...)
secPath := path.Join(secPaths...)
secPathDot := strings.Join(secPaths, ".")
uniqKey := fmt.Sprintf("%s.%s", env, secPathDot)
if crossRefSec, ok := crossEnvRefSecs[uniqKey]; !ok {
// if not in cross reference cache, fetch it from server
refSecs, err := GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: env, InfisicalToken: infisicalToken, SecretsPath: secPath})
if err != nil {
HandleError(err, fmt.Sprintf("Could not fetch secrets in environment: %s secret-path: %s", env, secPath), "If you are using a service token to fetch secrets, please ensure it is valid")
}
refSecsByKey := getSecretsByKeys(refSecs)
// save it to avoid calling api again for same environment and folder path
crossEnvRefSecs[uniqKey] = refSecsByKey
return refSecsByKey[secKey].Value
} else {
return crossRefSec[secKey].Value
}
}, sec.Key)
secrets[i].Value = expandedVal
}
return secrets
}
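As a sanity check on the interpolation logic above, the same-environment branch can be sketched as a tiny standalone program. This is a simplified sketch, not the CLI's implementation: cross-environment references (`${env.folder.key}`), the API-backed cache, and cycle handling are all omitted.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Matches ${...} references, capturing the inner key.
var refRegex = regexp.MustCompile(`\$\{([^}]*)\}`)

// expand recursively resolves ${KEY} references in raw[key],
// memoizing results in seen. Missing keys resolve to "".
func expand(raw map[string]string, key string, seen map[string]string) string {
	if v, ok := seen[key]; ok {
		return v
	}
	val := raw[key]
	for _, m := range refRegex.FindAllStringSubmatch(val, -1) {
		// m[0] is "${KEY}", m[1] is "KEY"
		val = strings.ReplaceAll(val, m[0], expand(raw, m[1], seen))
	}
	seen[key] = val
	return val
}

func main() {
	raw := map[string]string{
		"DOMAIN":  "example.com",
		"API_URL": "https://${DOMAIN}/api",
	}
	fmt.Println(expand(raw, "API_URL", map[string]string{}))
}
```

Running this prints `https://example.com/api`.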
func OverrideSecrets(secrets []models.SingleEnvironmentVariable, secretType string) []models.SingleEnvironmentVariable {

@ -41,6 +41,20 @@ services:
networks:
- infisical
# secret-scanning-git-app:
# container_name: infisical-secret-scanning-git-app
# restart: unless-stopped
# depends_on:
# - backend
# - frontend
# - mongo
# ports:
# - "3000:3001"
# image: infisical/staging_deployment_secret-scanning-git-app
# env_file: .env
# networks:
# - infisical
mongo:
container_name: infisical-mongo
image: mongo

@ -4,6 +4,21 @@ title: "Changelog"
The changelog below reflects new product developments and updates on a monthly basis; it will be updated later this quarter to include issues addressed on a weekly basis.
## July 2023
- Released [secret referencing](https://infisical.com/docs/documentation/platform/secret-reference) across folders and environments.
- Added the [integration with Laravel Forge](https://infisical.com/docs/integrations/cloud/laravel-forge).
- Redesigned the project/organization experience.
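Secret referencing uses `${...}` interpolation inside secret values; a hypothetical sketch (key names are illustrative, not from the release notes):

```bash
# same environment and folder:
BASE_URL="https://${DOMAIN}/api"
# across environments and folders the reference takes the form ${env.folder.key}:
DB_URL="${prod.backend.DATABASE_URL}"
```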
## June 2023
- Released the [Terraform Provider](https://infisical.com/docs/integrations/frameworks/terraform#5-run-terraform).
- Updated the usage and billing page. Added the free trial for the professional tier.
- Added integrations with [Checkly](https://infisical.com/docs/integrations/cloud/checkly), [Hashicorp Vault](https://infisical.com/docs/integrations/cloud/hashicorp-vault), and [Cloudflare Pages](https://infisical.com/docs/integrations/cloud/cloudflare-pages).
- Completed a penetration test with a `very good` result.
- Added support for multi-line secrets.
## May 2023
- Released secret scanning capability for the CLI.
@ -11,8 +26,7 @@ The changelog below reflects new product developments and updates on a monthly b
- Completed penetration test.
- Released new landing page.
- Started SOC 2 (Type II) compliance certification preparation.
More coming soon.
- Released new deployment options for Fly.io, Digital Ocean and Render.
## April 2023
@ -107,4 +121,4 @@ More coming soon.
- Added search bar to dashboard to query for keys on client-side.
- Added capability to rename a project.
- Added user roles for projects.
- Added incident contacts.
- Added incident contacts.

@ -99,6 +99,15 @@ Export environment variables from the platform into a file format.
Default value: `true`
</Accordion>
<Accordion title="--path">
The `--path` flag indicates which project folder secrets will be injected from.
```bash
# Example
infisical export --path="/path/to/folder" --env=dev
```
</Accordion>
<Accordion title="--tags">
When working with tags, you can use this flag to filter and retrieve only secrets that are associated with a specific tag(s).

View File

@ -113,4 +113,14 @@ Inject secrets from Infisical into your application process.
By default, all secrets are fetched
</Accordion>
<Accordion title="--path">
The `--path` flag indicates which project folder secrets will be injected from.
```bash
# Example
infisical run --path="/nextjs" -- npm run dev
```
</Accordion>
</Accordion>

View File

@ -51,6 +51,14 @@ This command enables you to perform CRUD (create, read, update, delete) operatio
Default value: `dev`
</Accordion>
<Accordion title="--path">
The `--path` flag indicates which project folder secrets will be injected from.
```bash
# Example
infisical secrets --path="/" --env=dev
```
</Accordion>
</Accordion>
@ -90,6 +98,14 @@ $ infisical secrets set STRIPE_API_KEY=sjdgwkeudyjwe DOMAIN=example.com HASH=jeb
Default value: `dev`
</Accordion>
<Accordion title="--path">
Used to select the project folder in which the secrets will be set. This is useful when creating new secrets under a particular path.
```bash
# Example
infisical secrets set DOMAIN=example.com --path="common/backend"
```
</Accordion>
</Accordion>
<Accordion title="infisical secrets delete">
@ -108,6 +124,14 @@ $ infisical secrets set STRIPE_API_KEY=sjdgwkeudyjwe DOMAIN=example.com HASH=jeb
Default value: `dev`
</Accordion>
<Accordion title="--path">
The `--path` flag indicates which project folder secrets will be injected from.
```bash
# Example
infisical secrets delete <keyName1> <keyName2>... --path="/"
```
</Accordion>
</Accordion>
<Accordion title="infisical secrets generate-example-env">

View File

@ -3,9 +3,7 @@ title: 'Install'
description: "Infisical's CLI is one of the best ways to manage environments and secrets. Install it here"
---
Prerequisite: Set up an account with [Infisical Cloud](https://app.infisical.com) or via a [self-hosted installation](/self-hosting/overview).
The Infisical CLI provides a way to inject environment variables from the platform into your apps and infrastructure.
The Infisical CLI can be used to access secrets across various environments, whether it's local development, CI/CD, staging, or production.
## Installation
@ -100,49 +98,3 @@ The Infisical CLI provides a way to inject environment variables from the platfo
</Tab>
</Tabs>
### Log in to the Infisical CLI
```bash
infisical login
```
<Accordion title="Optional: point CLI to self-hosted">
The CLI is set to connect to Infisical Cloud by default, but if you're running your own instance of Infisical, you can direct the CLI to it using one of the methods provided below.
#### Method 1: Use the updated CLI
Beginning with CLI version V0.4.0, it is now possible to choose between logging in through the Infisical cloud or your own self-hosted instance. Simply execute the `infisical login` command and follow the on-screen instructions.
#### Method 2: Export environment variable
You can point the CLI to the self-hosted Infisical instance by exporting the environment variable `INFISICAL_API_URL` in your terminal.
<Tabs>
<Tab title="Linux/macOS">
```bash
# Set backend host
export INFISICAL_API_URL="https://your-self-hosted-infisical.com/api"
# Remove backend host
unset INFISICAL_API_URL
```
</Tab>
<Tab title="Windows PowerShell">
```bash
# Set backend host
setx INFISICAL_API_URL "https://your-self-hosted-infisical.com/api"
# Remove backend host
setx INFISICAL_API_URL ""
# NOTE: Once set or removed, please restart powershell for the change to take effect
```
</Tab>
</Tabs>
#### Method 3: Set manually on every command
Another option to point the CLI to your self-hosted Infisical instance is to set it via a flag on every command you run.
```bash
# Example
infisical <any-command> --domain="https://your-self-hosted-infisical.com/api"
```
</Accordion>

View File

@ -1,77 +1,137 @@
---
title: "Usage"
description: "How to manage your secrets with Infisical's CLI?"
title: "Quick usage"
description: "Manage secrets with Infisical CLI"
---
Prerequisite: [Install the CLI](/cli/overview)
The CLI is designed for a variety of applications, ranging from local secret management to CI/CD and production scenarios.
The distinguishing factor, however, is the authentication method used.
## Authenticate
<Tabs>
<Tab title="Local development">
To use the Infisical CLI in your development environment, you can run the command below.
This will allow you to access the features and functionality provided by the CLI.
To use the Infisical CLI in your development environment, simply run the following command and follow the interactive guide.
```bash
infisical login
```
<Note>
If you are in a containerized environment such as WSL 2 or Codespaces, run `infisical login -i` to avoid browser-based login
</Note>
## Initialize Infisical for your project
```bash
# navigate to your project
cd /path/to/project
# initialize infisical
infisical init
```
This will create `.infisical.json` file at the location the command was executed. This file contains your [local project settings](./project-config). It does not contain any sensitive data.
</Tab>
<Tab title="Infisical Token">
To use the Infisical CLI in environments where you cannot run the `infisical login` command, you can authenticate via an
Infisical Token instead. Learn more about [Infisical Token](/documentation/platform/token).
<Tab title="CI/CD, Production usage, etc">
To use Infisical for non-local development scenarios, please create a [service token](../documentation/platform/token). The service token will allow you to authenticate and interact with Infisical.
Once you have created a service token with the required permissions, you'll need to feed the token to the CLI.
#### Pass as flag
You may use the `--token` flag to set the token
```
infisical export --token=<>
infisical secrets --token=<>
infisical run --token=<> -- npm run dev
```
#### Pass via shell environment variable
The CLI is configured to look for an environment variable named `INFISICAL_TOKEN`. If set, it'll attempt to use it for authentication.
```
export INFISICAL_TOKEN=<>
```
</Tab>
</Tabs>
## Initialize Infisical for your project
```bash
# navigate to your project
cd /path/to/project
# initialize infisical
infisical init
```
## Inject environment variables
<Tabs>
<Tab title="Feed secrets to your application">
```bash
infisical run -- [your application start command]
<Accordion title="Injecting environment variables directly" defaultOpen="true">
```bash
# inject environment variables into app
infisical run -- [your application start command]
```
</Accordion>
# example with node (nodemon)
infisical run --env=dev --path=/apps/firefly -- nodemon index.js
<Accordion title="Injecting environment variables in custom aliases">
Custom aliases can utilize secrets from Infisical. Suppose there is a custom alias `yd` in `custom.sh` that runs `yarn dev` and needs the secrets provided by Infisical.
```bash
#!/bin/sh
# example with flask
infisical run -- flask run
yd() {
yarn dev
}
```
# example with spring boot - maven
infisical run -- ./mvnw spring-boot:run --quiet
```
</Tab>
<Tab title="Feed secrets via custom aliases (advanced)">
Custom aliases can utilize secrets from Infisical. Suppose there is a custom alias `yd` in `custom.sh` that runs `yarn dev` and needs the secrets provided by Infisical.
```bash
#!/bin/sh
To make the secrets available from Infisical to `yd`, you can run the following command:
yd() {
yarn dev
}
```
```bash
infisical run --command="source custom.sh && yd"
```
</Accordion>
To make the secrets available from Infisical to `yd`, you can run the following command:
```bash
infisical run --command="source custom.sh && yd"
```
</Tab>
</Tabs>
View all available options for `run` command [here](./commands/run)
## Examples:
## Connect CLI to self-hosted Infisical
```bash
# example with node
infisical run -- node index.js
<Accordion title="Optional: point CLI to self-hosted">
The CLI is set to connect to Infisical Cloud by default, but if you're running your own instance of Infisical, you can direct the CLI to it using one of the methods provided below.
# example with node (nodemon)
infisical run -- nodemon index.js
#### Method 1: Use the updated CLI
Beginning with CLI version V0.4.0, it is now possible to choose between logging in through the Infisical cloud or your own self-hosted instance. Simply execute the `infisical login` command and follow the on-screen instructions.
# example with node (nodemon) pulling in secrets from test environment
infisical run --env=test -- nodemon index.js
#### Method 2: Export environment variable
You can point the CLI to the self-hosted Infisical instance by exporting the environment variable `INFISICAL_API_URL` in your terminal.
# example with flask
infisical run -- flask run
<Tabs>
<Tab title="Linux/macOS">
```bash
# Set backend host
export INFISICAL_API_URL="https://your-self-hosted-infisical.com/api"
# Remove backend host
unset INFISICAL_API_URL
```
</Tab>
<Tab title="Windows PowerShell">
```bash
# Set backend host
setx INFISICAL_API_URL "https://your-self-hosted-infisical.com/api"
# Remove backend host
setx INFISICAL_API_URL ""
# NOTE: Once set or removed, please restart powershell for the change to take effect
```
</Tab>
</Tabs>
#### Method 3: Set manually on every command
Another option to point the CLI to your self-hosted Infisical instance is to set it via a flag on every command you run.
```bash
# Example
infisical <any-command> --domain="https://your-self-hosted-infisical.com/api"
```
</Accordion>

View File

@ -30,7 +30,7 @@ If you're ever in doubt about whether or not a proposed feature aligns with Infi
## Writing and submitting code
Anyone can contribute code to Infisical. To get started, check out the [local development guide](/contributing/developing), make your changes, and submit a pull request to the main repository
adhering to the [pull request guide](/).
adhering to the [pull request guide](/contributing/pull-requests).
## Licensing

View File

@ -20,7 +20,7 @@ Start syncing environment variables with [Infisical Cloud](https://app.infisical
## Integrate with Infisical
<CardGroup cols={2}>
<Card href="/documentation/getting-started/cli" title="Command Line Interface (CLI)" icon="square-terminal" color="#3775a9">
<Card href="../../cli/overview" title="Command Line Interface (CLI)" icon="square-terminal" color="#3775a9">
Inject secrets into any application process/environment
</Card>
<Card

View File

@ -61,14 +61,18 @@ metadata:
spec:
# The host that should be used to pull secrets from. If left empty, the value specified in Global configuration will be used
hostAPI: https://app.infisical.com/api
resyncInterval:
authentication:
serviceToken:
serviceTokenSecretReference: # <-- The secret's namespaced name that holds the project token for authentication in step 1
serviceToken:
serviceTokenSecretReference:
secretName: service-token
secretNamespace: option
managedSecretReference:
secretsScope:
envSlug: dev
secretsPath: "/"
managedSecretReference:
secretName: managed-secret # <-- the name of kubernetes secret that will be created
secretNamespace: default # <-- in what namespace it will be created in
secretNamespace: default # <-- where the kubernetes secret should be created
```
```

View File

@ -1,46 +1,30 @@
---
title: "Folder"
description: "How Infisical structures secrets into folders"
title: "Folders"
description: "Organize your secrets with folders"
---
Folders can be used to group secrets into multiple levels, which can help organize secrets in monorepos or microservice-based architectures. For example, you could create a folder for each environment, such as production, staging, and development.
Folders provide a powerful and intuitive way to structure your secrets.
They offer a system to keep your secrets organized and easily accessible, which becomes increasingly important as your collection of secrets grows.
Within each environment folder, you could create subfolders for different types of secrets, such as database credentials, API keys, and SSH keys. This can help to keep your secrets organized and easy to find.
With folders in Infisical, you can now create a hierarchy of folders to organize your secrets, mirroring your application's architecture or any logical grouping that suits your needs.
Whether you follow a microservices architecture or work with monorepos, folders make it simpler to locate, manage and collaborate between teams.
## Dashboard
![dashboard with folders](../../images/dashboard-folders.png)
## Creating a folder
Only letters, numbers, and dashes are allowed in folder names. You can create a folder for each environment from the dashboard.
To create a folder, head over to the environment where you'd like to create the folder. Once there, click the `Add folder` button as shown below.
If you wish to create nested folders, simply click into the folder of choice and click `Add folder` button again.
![dashboard add folders](../../images/dashboard-add-folder.png)
To create a nested folder or access the secrets of a folder, click on an existing folder to open it. You will then be able to modify the secrets of that folder and create new folders inside it.
<Info>
Folder names can only contain letters, numbers, and dashes
</Info>
## Dashboard Secret Overview
## Compare folders across environments
The overview screen provides a comprehensive view of all your secrets and folders, organized by environment.
![dashboard secret overview with folders](../../images/dashboard-folder-overview.png)
When you click on a folder, the overview will be updated to show only the secrets and folders in that folder. This makes it easy to find the information you need, no matter how deeply nested it is.
## Integrations
You can easily scope injected secrets to a folder during integrations by providing the secret path option.
![integrations scoped with folders](../../images/integration-folders.png)
For more information on integrations, [refer infisical integration](/integrations/overview)
## Service Tokens
You can scope the secrets that can be read and written using an Infisical token by providing the secret path option when creating the token.
![folder scoped service token](../../images/project-folder-token.png)
For more information, [refer infisical token section.](./token)
## Point-In-Time Recovery
For more information on how PIT recovery works on folders, [please refer to this section.](./pit-recovery)
When you click on a folder, the overview will be updated to show only the secrets and folders in that folder. This allows you to compare secrets across environments regardless of how deeply nested your folders are.

View File

@ -3,28 +3,26 @@ title: "Point-in-Time Recovery"
description: "How to rollback secrets and configs to any commit with Infisical."
---
Point-in-time recovery allows environment variables to be rolled back to any point in time. It's powered by snapshots that get captured after mutations to environment variables.
Point-in-time recovery allows secrets to be rolled back to any point in time.
It's powered by snapshots that get created after every mutation to a secret within a given [folder](./folder) and environment.
## Commits
Similar to Git, a commit in Infisical is a snapshot of your project's secrets at a specific point in time. You can browse and view your project's snapshots via the "Point-in-Time Recovery" sidebar.
Similar to Git, a commit in Infisical is a snapshot of your project's secrets at a specific point in time scoped to the environment and [folder](./folder) it is in. You can browse and view your project's snapshots via the "Point-in-Time Recovery" sidebar.
![PIT commits](../../images/pit-commits.png)
![PIT snapshots](../../images/pit-snapshots.png)
## Rolling back
Environment variables can be rolled back to any point in time via the "Rollback to this snapshot" button.
Secrets can be rolled back to any point in time via the "Rollback to this snapshot" button. This will roll back the changes within the given [folder](./folder) and environment to the chosen time.
It's important to note that this rollback action is localized and does not affect other folders within the same environment. This means each [folder](./folder) maintains its own independent history of changes, offering precise and isolated control over rollback actions.
In essence, every [folder](./folder) possesses a distinct and separate timeline, providing granular control when managing your secrets.
![PIT snapshot](../../images/pit-snapshot.png)
<Note>
Rolling back environment variables to a past snapshot creates a new commit and
snapshot at the top of the stack and updates secret versions.
Rolling back secrets to a past snapshot creates a new commit and
snapshot at the top of the stack and updates secret versions.
</Note>
## Folders
Any folder operation, such as creating, updating, or deleting a folder, will create a new commit.
When you roll back the contents of a folder, the folder will be restored to its latest snapshot. The nested folders will also be restored to their respective latest versions.

View File

@ -0,0 +1,37 @@
---
title: "Reference Secrets"
description: "How to use reference secrets in Infisical"
---
Secret referencing is a powerful feature that allows you to create a secret whose value is linked to one or more other secrets.
This is useful when you need to use a single secret's value across multiple other secrets.
Consider a scenario where you have a database password. In order to utilize this password, you may need to incorporate it into a database connection string.
With secret referencing, you can easily construct these more intricate secrets by directly referencing the base secret.
This centralizes the management of your base secret, as any updates made to it will automatically propagate to all the secrets that depend on it.
## Referencing syntax
<img src="../../images/example-secret-referencing.png" />
Secret referencing relies on interpolation syntax. This syntax allows you to reference a secret in any environment or [folder](./folder).
To reference a secret named 'mysecret' in the same [folder](./folder) and environment, you'd use `${mysecret}`.
However, to reference the same secret at the root of a different environment, for instance the `dev` environment, you'd use `${dev.mysecret}`.
Here are a few more examples to help you understand how to reference secrets in different contexts:
| Reference syntax | Environment | Folder | Secret Key |
| --------------------- | ----------- | ------------ | ---------- |
| `${KEY1}` | same env | same folder | KEY1 |
| `${dev.KEY2}` | `dev` | `/` (root of dev environment) | KEY2 |
| `${prod.frontend.KEY2}` | `prod` | `/frontend` | KEY2 |
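As an illustration with hypothetical secret names and values (not taken from the source), a connection string secret can reference a password secret in the same folder:

```
# Secrets at the root of the prod environment (hypothetical values)
DB_PASSWORD = s3cr3t
DB_CONN     = postgres://user:${DB_PASSWORD}@db.example.com:5432/app

# DB_CONN as fetched by a client with read access to both secrets:
# postgres://user:s3cr3t@db.example.com:5432/app
```

Updating `DB_PASSWORD` would automatically change the fetched value of `DB_CONN`.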
## Fetching fully constructed values
Secret referencing combines multiple secrets into one unified value, reconstructed only on the client side. To retrieve this value, you need access to read the environment and [folder](./folder) from where the secrets originate.
For instance, to access a secret 'A' composed of secrets 'B' and 'C' from different environments, you must have read access to both.
When using [service tokens](./token) to fetch referenced secrets, ensure the service token has read access to all referenced environments and folders.
Without proper permissions, the final secret value may be incomplete.

View File

@ -1,21 +1,37 @@
---
title: "Infisical Token"
description: "Use the Infisical Token as one of the authentication methods."
title: "Service token"
description: "Infisical service tokens allows you to programmatically interact with Infisical"
---
An Infisical Token is useful for:
Service tokens play an integral role in allowing programmatic interactions with an Infisical project, functioning as digital tokens that open access to specific project resources such as secrets.
- Authenticating the [Infisical CLI](/cli/overview) when there isn't an easy way to input your login credentials.
- Granting the [Infisical SDKs](/sdks/overview) access to secrets scoped to a project and environment.
When you generate a service token, you can define its access level, not only by specifying the paths and environments it can interact with, but also by determining the level of mutation it can perform, such as read-only, write, or both.
It's also useful for CI/CD environments and integrations such as [Docker](/integrations/platforms/docker) and [Docker Compose](/integrations/platforms/docker-compose).
This level of control not only ensures maximum flexibility but also significantly enhances security, as it allows you to define fine-grained access to project resources.
To generate the token, head over to your project settings as shown below. On creating a service token you can scope it to a path to limit the access.
## Creating a service token
To generate the token, head over to your project settings as shown below. On creating a service token you can scope it to a path to limit the access.
![token add](../../images/project-token-add.png)
## Feeding Infisical Token to the CLI
### Service token permissions
![token add](../../images/service-token-permissions.png)
The Infisical CLI checks for the presence of an environment variable called `INFISICAL_TOKEN`.
If it detects this variable in the terminal where it is being run, it will use it to authenticate and retrieve the environment variables that the token is authorized to access.
This allows you to use the CLI in environments where you are unable to run the `infisical login` command.
Service tokens can be scoped to multiple environments and paths. To add a new permission, choose the environment you want to give access to and then choose the path you'd like to give access to within that environment.
Permissions for paths are powered by [glob patterns](https://www.malikbrowne.com/blog/a-beginners-guide-glob-patterns/). This means you can create advanced folder permissions with simple glob patterns.
**Examples of common Glob patterns**
<Accordion title="Examples of common Glob patterns">
1. `/**`: This pattern matches all folders at any depth in the directory structure. For example, it would match folders like `/folder1/`, `/folder1/subfolder/`, and so on.
2. `/*`: This pattern matches all immediate subfolders in the current directory. It does not match any folders at a deeper level. For example, it would match folders like `/folder1/`, `/folder2/`, but not `/folder1/subfolder/`.
3. `/*/*`: This pattern matches all subfolders at a depth of two levels in the current directory. It does not match any folders at a shallower or deeper level. For example, it would match folders like `/folder1/subfolder/`, `/folder2/subfolder/`, but not `/folder1/` or `/folder1/subfolder/subsubfolder/`.
4. `/folder1/*`: This pattern matches all immediate subfolders within the `/folder1/` directory. It does not match any folders outside of `/folder1/`, nor does it match any subfolders within those immediate subfolders. For example, it would match folders like `/folder1/subfolder1/`, `/folder1/subfolder2/`, but not `/folder2/subfolder/`.
</Accordion>
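The segment-wise matching rules above can be sketched with a small shell helper. This is an illustration of the semantics only, not Infisical's actual matcher: `*` matches a single path segment and `**` matches any depth.

```bash
# Translate a segment-wise glob into an extended regex and test a path
# against it. Illustrative only -- not Infisical's implementation.
glob_matches() {  # usage: glob_matches PATTERN PATH
  regex=$(printf '%s' "$1" | sed -e 's|\*\*|%%|g' -e 's|\*|[^/]*|g' -e 's|%%|.*|g')
  printf '%s' "$2" | grep -Eq "^${regex}/?$"
}

glob_matches "/*" "/folder1"              && echo "match"     # match
glob_matches "/*" "/folder1/subfolder"    || echo "no match"  # no match
glob_matches "/folder1/*" "/folder1/sub1" && echo "match"     # match
glob_matches "/**" "/a/b/c"               && echo "match"     # match
```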

View File

@ -0,0 +1,36 @@
---
title: "Webhooks"
description: "How Infisical webhooks work"
---
Webhooks can be used to trigger changes to your integrations when secrets are modified, providing smooth integration with other third-party applications.
![webhooks](../../images/webhooks.png)
To create a webhook for a particular project, go to `Project Settings > Webhooks`.
When creating a webhook, you can specify an environment and folder path (using glob patterns) to trigger only specific integrations.
## Secret Key Verification
A secret key is a way for users to verify that a webhook request was sent by Infisical and is intended for the correct integration.
When you provide a secret key, Infisical will sign the payload of the webhook request using the key and attach a header called `x-infisical-signature` to the request along with the payload.
The header will be in the format `t=<timestamp>;<signature>`. You can then generate the signature yourself by generating a SHA256 hash of the payload with the secret key that you know.
If the signature in the header matches the signature that you generated, then you can be sure that the request was sent by Infisical and is intended for your integration. The timestamp in the header ensures that the request is not replayed.
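Assuming the signed string is `<timestamp>.<payload>` and the signature is a hex-encoded HMAC-SHA256 (an assumption to confirm against your Infisical version), the check can be sketched with `openssl`:

```bash
# Sketch of webhook signature verification. The signed-string format
# ("<timestamp>.<payload>") is an assumption -- confirm it before
# relying on this in production.
SECRET_KEY="my-webhook-secret"   # the key you configured for the webhook
PAYLOAD='{"event":"secret.modified","timestamp":"1689600000"}'

# What the sender would place in the x-infisical-signature header
TS="1689600000"
SIG=$(printf '%s' "${TS}.${PAYLOAD}" \
  | openssl dgst -sha256 -hmac "$SECRET_KEY" -r | cut -d' ' -f1)
HEADER="t=${TS};${SIG}"

# Receiver side: parse the header, recompute the HMAC, and compare
RECV_TS="${HEADER%%;*}"; RECV_TS="${RECV_TS#t=}"
RECV_SIG="${HEADER#*;}"
EXPECTED=$(printf '%s' "${RECV_TS}.${PAYLOAD}" \
  | openssl dgst -sha256 -hmac "$SECRET_KEY" -r | cut -d' ' -f1)
[ "$RECV_SIG" = "$EXPECTED" ] && echo "verified"
```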
### Webhook Payload Format
```json
{
"event": "secret.modified",
"project": {
"workspaceId":"the workspace id",
"environment": "project environment",
"secretPath": "project folder path"
},
"timestamp": ""
}
```

Binary file not shown.

After

Width:  |  Height:  |  Size: 246 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 160 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 450 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 451 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 210 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 249 KiB

Some files were not shown because too many files have changed in this diff