Compare commits

..

665 Commits

Author SHA1 Message Date
625fa0725e add approvals-needed and sent api 2023-02-26 19:05:36 -05:00
e32cb8c24f add index for requestedChanges.approvers.userId 2023-02-26 19:04:44 -05:00
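
A sketch of how such an index is typically declared with Mongoose (the surrounding schema shape here is assumed, not taken from the repo):

```ts
import { Schema, model } from "mongoose";

// Assumed shape: an approval request embeds requested changes, each with approvers.
const approvalRequestSchema = new Schema({
  workspace: { type: Schema.Types.ObjectId, ref: "Workspace" },
  requestedChanges: [
    {
      secretKey: String,
      approvers: [{ userId: { type: Schema.Types.ObjectId, ref: "User" } }],
    },
  ],
});

// Index the nested array path so lookups by approver avoid a collection scan.
approvalRequestSchema.index({ "requestedChanges.approvers.userId": 1 });

export const ApprovalRequest = model("ApprovalRequest", approvalRequestSchema);
```
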
de724d2804 improve add/remove approvers into workspace 2023-02-26 18:29:35 -05:00
42d94521a4 extend Secret schema in requestedChangeSchema 2023-02-26 15:15:41 -05:00
203a603769 complete merge API and improve reject/approve api 2023-02-26 15:15:41 -05:00
e1a88b2d1a add method to check required secret fields 2023-02-26 15:15:41 -05:00
53237dd52c merge approval requests 2023-02-26 15:15:41 -05:00
6a906b17ad add approvals request reject, approve and merge api 2023-02-26 15:15:41 -05:00
f38a364d3b add approve request api 2023-02-26 15:15:41 -05:00
4749e243bb get approvals by requester & improve create request 2023-02-26 15:15:41 -05:00
eb055e8b16 create approval 2023-02-26 15:15:41 -05:00
0e17c9a6db Merge pull request #383 from 5h4k4r/use-multi-stage-build
Builds Docker image of backend component with multi-stage Dockerfile
2023-02-26 10:50:56 -05:00
1c4dd78dea use node alpine base image 2023-02-26 10:49:01 -05:00
23418b3a09 Builds Docker image of backend component with multi-stage Dockerfile
Updates package.json `npm start` command
Delete unnecessary `npm run start` commands

Signed-off-by: Shakar <5h4k4r.b4kr@gmail.com>
2023-02-26 14:50:49 +03:00
0f143adbde Merge pull request #377 from caioluis/main
feat(webui localization): add support for pt-PT
2023-02-25 12:01:51 -08:00
1f3f4b7900 Merge pull request #353 from Grraahaam/chore/helm-docs
chore(docs): helm charts + local cluster setup script
2023-02-25 13:25:47 -05:00
2c5f26380e Merge pull request #373 from jon4hz/default-env
fix: properly support default environment
2023-02-25 13:08:00 -05:00
8f974fb087 Add docs for default environment 2023-02-25 13:07:07 -05:00
a0722b4ca5 Merge pull request #372 from jon4hz/format-cfg
fix: pretty print workspace file
2023-02-25 12:42:46 -05:00
41e039578a Merge pull request #374 from jon4hz/shell
fix: always execute cmd in subshell
2023-02-25 12:39:08 -05:00
c89e8e8a96 Merge pull request #375 from jon4hz/workspace
feat: search for workspace config in parent dir
2023-02-25 11:27:34 -05:00
cac83ab927 add debug log to workspace path location 2023-02-25 11:25:49 -05:00
0f0b894363 Revert "feat(webui-localization): fix and update pt-br copies"
I was dizzy and forgot to make a branch

This reverts commit 43f9af1bc6d27b0e6fc477535f5a17d963df6b51.

Signed-off-by: Caio Gomes <ocaioluis@gmail.com>
2023-02-25 08:05:43 +00:00
43f9af1bc6 feat(webui-localization): fix and update pt-br copies 2023-02-25 08:04:02 +00:00
f5ed14c84c fix(localization): add pt-PT to the options and fix a copy 2023-02-25 07:22:34 +00:00
2dd57d7c73 feat(localization): add locales for pt-PT 2023-02-25 06:53:02 +00:00
0b1891b64a Merge pull request #371 from jon4hz/env-filter
Improve env filter
2023-02-24 23:12:20 -05:00
5614b0f58a nit: method name change 2023-02-24 22:21:20 -05:00
3bb178976d Remove auto delete index 2023-02-24 21:57:13 -05:00
1777f98aef feat: search for workspace config in parent dir 2023-02-25 01:25:39 +01:00
45e3706335 fix: always execute cmd in subshell 2023-02-25 00:19:58 +01:00
337ed1fc46 fix: properly support default environment 2023-02-24 23:20:30 +01:00
d1ea76e5a0 fix: pretty print workspace file 2023-02-24 22:28:28 +01:00
4a72d725b1 fix: use function to get secrets by key 2023-02-24 22:03:50 +01:00
1693db3199 fix: preallocate map size 2023-02-24 22:03:13 +01:00
1ff42991b3 fix: improve filtering of reserved env vars 2023-02-24 21:57:48 +01:00
978423ba5b Merge pull request #364 from alexdanilowicz/patch-3
chore(docs): Update README MIT License badge link
2023-02-24 11:41:28 -08:00
4d0dc0d7b7 Update README.md 2023-02-24 11:40:03 -08:00
3817e666a9 Merge pull request #365 from Infisical/travisci
Add Travis CI Docs
2023-02-24 16:44:15 +07:00
b61350f6a4 Add Travis CI docs, minor edits 2023-02-24 16:37:26 +07:00
0fb1a1dc6f Merge remote-tracking branch 'origin' into travisci 2023-02-24 16:08:53 +07:00
9eefc87b7a Merge pull request #350 from Aashish-Upadhyay-101/Aashish-Upadhyay-101/TravisCI-integration
Feat: travis ci integration
2023-02-24 16:08:19 +07:00
53d35757ee Make minor changes to TravisCI sync, faster, reliable 2023-02-24 15:59:12 +07:00
e80e8e00b1 Replace axios with request 2023-02-24 15:31:46 +07:00
0b08e574c7 Merge remote-tracking branch 'refs/remotes/origin/main' into Aashish-Upadhyay-101/TravisCI-integration 2023-02-24 13:36:35 +05:45
499323d0e3 Sync travis-ci 2023-02-24 13:36:17 +05:45
89ad2f163a chore(docs): README
Another small README typo that I noticed. Links to a different repo.
2023-02-23 23:27:28 -08:00
7f04617b7d update brew command to update cli 2023-02-23 19:37:19 -05:00
44904628bc Merge pull request #360 from bngmnn/main
add german as readme language
2023-02-23 15:10:44 -08:00
fafde7b1ad update other languages' readme files with 'de' flag 2023-02-23 23:56:48 +01:00
7e65314670 add german readme 2023-02-23 23:56:17 +01:00
df52c56e83 patch login bug in cli 2023-02-23 14:14:41 -05:00
4276fb54cc Add docs for git branch mapping 2023-02-23 13:07:18 -05:00
bb5a0db79c Merge pull request #359 from Infisical/axios-retry
Add axios-retry for Vercel integration for now
2023-02-24 00:59:29 +07:00
b906048ea1 Add axios-retry for Vercel integration for now 2023-02-24 00:52:43 +07:00
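
axios-retry wraps an axios instance so transient failures are retried automatically; a minimal sketch of that kind of setup (client name and endpoint are illustrative, not the repo's code):

```ts
import axios from "axios";
import axiosRetry from "axios-retry";

const vercelClient = axios.create({ baseURL: "https://api.vercel.com" });

// Retry failed idempotent requests a few times with exponential backoff.
axiosRetry(vercelClient, { retries: 3, retryDelay: axiosRetry.exponentialDelay });

export async function listVercelProjects(accessToken: string) {
  const res = await vercelClient.get("/v9/projects", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  return res.data;
}
```
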
7ce9c816c5 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-23 21:38:21 +07:00
3fef6e4849 Replace Promise.all with for-await when getting Vercel envars 2023-02-23 21:38:11 +07:00
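
The point of the swap is to stop firing every request at once; a sketch of the before/after pattern, using a hypothetical helper:

```ts
// Hypothetical helper: fetch env vars for a single Vercel project.
async function getProjectEnvVars(projectId: string): Promise<Record<string, string>> {
  // ...a real implementation would call the Vercel API here...
  return {};
}

export async function getAllEnvVars(projectIds: string[]) {
  // Before: every request in flight simultaneously.
  // const results = await Promise.all(projectIds.map(getProjectEnvVars));

  // After: one request at a time, which is gentler on the upstream API
  // and makes failures easier to attribute.
  const results: Record<string, string>[] = [];
  for (const projectId of projectIds) {
    results.push(await getProjectEnvVars(projectId));
  }
  return results;
}
```
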
e7ce1e36e7 Update README.md 2023-02-23 06:25:48 -08:00
734c915206 Merge pull request #358 from alisson-acioli/main
Readme translation to PT-BR.
2023-02-23 06:23:35 -08:00
783174adc6 Add for-await for better Vercel integration reliability 2023-02-23 20:21:17 +07:00
d769db7668 integrações world 2023-02-23 09:52:58 -03:00
00e532fce4 remove english 2023-02-23 09:52:08 -03:00
7cf8cba54b remove word 2023-02-23 09:51:30 -03:00
70b26811d9 Remove word API 2023-02-23 09:50:13 -03:00
e7aafecbc2 Update language readme options 2023-02-23 09:49:03 -03:00
949fb052cd Readme PT-BR 2023-02-23 09:45:58 -03:00
fcb1f5a51b Merge pull request #357 from Infisical/vercel-integration-patch
Patch Vercel case where secrets can be of type plain and sensitive
2023-02-23 16:53:44 +07:00
e24f70b891 Patch Vercel case where secrets can be of type plain and sensitive 2023-02-23 16:47:21 +07:00
bd233ebe9b allow git branch mapping to env 2023-02-23 00:00:56 -05:00
f92269f2ec Merge pull request #274 from Grraahaam/feat/pr-template
feat(docs): added a base pull request template
2023-02-23 11:16:19 +07:00
2143db5eb5 Update messaging 2023-02-22 17:21:44 -08:00
0c72f50b5e Updated readmes 2023-02-22 16:29:53 -08:00
3c4c616242 update 2fa prompt in cli 2023-02-22 18:52:09 -05:00
153baad49f Merge pull request #349 from ImBIOS/docs/translation-indonesia
translate: add Bahasa Indonesia for README
2023-02-22 15:22:17 -08:00
75a2ab636c Merge pull request #354 from akhilmhdh/feat/table-loader
Table loading state and empty states
2023-02-22 14:57:23 -08:00
05a77e612c Minor style updates 2023-02-22 14:54:59 -08:00
d02bc06dce Merge pull request #355 from alexdanilowicz/patch-2
docs: fix nit typos in README
2023-02-22 14:54:53 -05:00
e1f88f1a7b docs: fix nit typos in README
Fix 'development' typo in sub header of README and do not hyphenate use-cases.
2023-02-22 11:51:41 -08:00
86a2647134 feat(ui): added table skeleton and loading for settings page,
fix(ui): resolved missing loading state in add new member and whitespace in project settings page
2023-02-22 23:29:26 +05:30
621b640af4 feat(ui): added new components empty state and skeleton 2023-02-22 23:27:36 +05:30
40c80f417c chore(script): added warning 2023-02-22 10:30:33 +01:00
7bb2c1c278 fix(script): auto generate secrets at runtime 2023-02-22 10:25:08 +01:00
a5278affe6 chore(docs): improved charts related documentation 2023-02-22 10:18:22 +01:00
2f953192d6 feat(script): kind local development setup 2023-02-22 10:18:22 +01:00
af64582efd Merge pull request #352 from Infisical/socket-labs-smtp
Add support and docs for SocketLabs email SMTP
2023-02-22 15:47:53 +07:00
6ad70f24a2 Add support and docs for SocketLabs email SMTP 2023-02-22 15:44:47 +07:00
8bf8968588 Other frontend configurations for travis-ci 2023-02-22 13:36:22 +05:45
7e9ce0360a Create.tsx for travis-ci 2023-02-22 13:26:10 +05:45
1d35c41dcb Authorize.tsx for travis-ci 2023-02-22 13:20:34 +05:45
824315f773 model setup for travis-ci 2023-02-22 13:14:06 +05:45
8a74799d64 variable setup for travis-ci 2023-02-22 12:52:42 +05:45
f0f6e8a988 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-02-22 12:29:49 +05:45
89bc9a823c translate: add Bahasa Indonesia for README 2023-02-22 12:03:57 +07:00
40250b7ecf Merge pull request #344 from Infisical/ip-alerts
New device (IP and user agent) login detection and alert
2023-02-21 18:11:16 +07:00
2d6d32923d Finish alert for new device login detection 2023-02-21 18:01:26 +07:00
7cb6aee3f7 Add docs for MFA 2023-02-21 16:08:09 +07:00
469d042f4b Add CircleCI docs 2023-02-21 13:07:17 +07:00
c38ccdb915 Merge pull request #343 from Infisical/mfa
MFA
2023-02-21 12:47:56 +07:00
baaa92427f Remove dependency cycle 2023-02-21 12:43:17 +07:00
1ff2c61b3a Remove storage of protected key 2023-02-21 12:31:19 +07:00
0b356e0e83 Update README.md 2023-02-20 21:14:50 -08:00
eb55c053eb Merge pull request #341 from umrak11/feature/improve-backup-pdf-generation
Refactored and improved PDF backup generation
2023-02-20 20:56:15 -08:00
07b307e4b1 Merge pull request #333 from esau-morais/issue-318
Add missing `forgot-password` in pt-BR and save selected env on URL
2023-02-20 20:44:27 -08:00
5bee6a5e24 Merge branch 'issue-318' of https://github.com/esau-morais/infisical into issue-318 2023-02-20 20:41:00 -08:00
bdc99e34cc Fix TS issue 2023-02-20 20:40:32 -08:00
cee10fb507 Merge branch 'main' into issue-318 2023-02-20 20:36:44 -08:00
74e78bb967 Merge remote-tracking branch 'origin' into mfa 2023-02-21 11:31:17 +07:00
ea5811c24c Fixed the bug with the tab indicators 2023-02-20 20:30:03 -08:00
d31b7ae4af Merge pull request #335 from animeshdas2000/fix/undefined-url
Fix: After trying to delete the last remaining project, it keeps loading and returns undefined in the URL
2023-02-20 19:54:41 -08:00
75eac1b972 Add LoginSRPDetail to v2 auth route 2023-02-21 10:52:31 +07:00
c65ce14de3 Merge branch 'main' into fix/undefined-url 2023-02-20 19:49:04 -08:00
f8c4ccd64c Updated readme 2023-02-20 19:17:04 -08:00
43ce222725 Minor style updates 2023-02-20 18:12:29 -08:00
c7ebeecb6b Merge pull request #337 from akhilmhdh/feat/new-org-settings
Revamped the org settings page
2023-02-20 17:25:27 -08:00
243c6ca22e feat: changed delete org membership to v2 2023-02-20 23:22:34 +05:30
66f1c57a2a feat(ui): completed org settings page revamp 2023-02-20 23:22:34 +05:30
c0d1495761 feat(ui): api hooks for new org settings page 2023-02-20 23:22:34 +05:30
e5f6ed3dc7 feat(ui): components changes for new org settings page 2023-02-20 23:22:32 +05:30
ab62d91b09 Refactored and improved PDF backup generation 2023-02-20 10:12:38 +01:00
59beabb445 Fix change password private key removal 2023-02-20 13:27:16 +07:00
d5bc377e3d Added notifications to 2FA and fixed state 2023-02-19 20:11:24 -08:00
2bdb20f42f Merge pull request #339 from Infisical/mfa
MFA
2023-02-19 18:33:11 -05:00
0062df58a2 Fixed dashboard for the corner case of purely personal secrets 2023-02-19 12:02:30 -08:00
b6bbfc08ad increase helm chart 2023-02-19 12:52:28 -05:00
5baccc73c9 Add mongo root password 2023-02-19 12:52:28 -05:00
20e7eae4fe Add more Accept-Encoding to integrations syncs 2023-02-19 13:42:37 +07:00
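
Several fixes in this range pin the Accept-Encoding request header on integration sync calls; roughly, that is just an extra header on the outgoing axios request (URL, payload, and header value below are placeholders):

```ts
import axios from "axios";

// Some provider APIs respond with encodings the sync code does not expect;
// sending an explicit Accept-Encoding keeps the response readable.
export async function syncSecretsToProvider(
  url: string,
  accessToken: string,
  secrets: Record<string, string>
) {
  await axios.post(
    url,
    { secrets },
    {
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Accept-Encoding": "application/json", // placeholder value
      },
    }
  );
}
```
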
8432f71d58 add secrets auto reload 2023-02-19 00:44:38 -05:00
604c22d64d Merge remote-tracking branch 'origin' into mfa 2023-02-19 12:41:23 +07:00
c1deb08df8 always pull gamma image 2023-02-19 00:08:07 -05:00
66f201746f update gamma values 2023-02-19 00:06:38 -05:00
1c61ffbd36 patch upload script 2023-02-19 00:00:58 -05:00
e5ba8eb281 fix upload script 2023-02-18 23:50:35 -05:00
f542e07c33 download dependency for helm chart in upload step 2023-02-18 23:40:10 -05:00
1082d7f869 update helm docs for self install 2023-02-18 23:28:52 -05:00
4a3adaa347 Begin in-memory private key storage 2023-02-19 10:54:54 +07:00
1659dab87d Merge pull request #314 from Grraahaam/feat/helm-mongodb-persistence
feat(chart): mongodb persistence
2023-02-18 18:06:39 -05:00
d88599714f update helm chart to reflect dep update 2023-02-18 11:33:39 -05:00
71bf56a2b7 Update k8 operator dependencies 2023-02-18 11:32:46 -05:00
0fba78ad16 Update http net package 2023-02-18 10:16:27 -05:00
92560f5e1f Merge remote-tracking branch 'origin' into mfa 2023-02-18 16:49:51 +07:00
0d484b93eb Merge pull request #338 from Infisical/smoothen-integrations
Smoothen integrations
2023-02-18 16:30:56 +07:00
5f3b8c55b8 Merge branch 'main' into smoothen-integrations 2023-02-18 16:27:58 +07:00
553416689c Fix merge conflicts with main 2023-02-18 16:23:05 +07:00
b0744fd21d Wired frontend to use the batch structure 2023-02-18 00:02:52 -08:00
be38844a5b Add 2FA in CLI 2023-02-17 20:48:19 -05:00
54e2b661bc ran prettier to fix indentation 2023-02-17 14:20:58 +05:30
b81d8eba25 added notification to .env errors 2023-02-16 21:05:38 -08:00
dbcd2b0988 Patch undefined req.secrets 2023-02-17 11:56:06 +07:00
1d11f11eaf Refactor login cmd 2023-02-16 15:49:14 -08:00
f2d7401d1d Add support for version 2 auth 2023-02-16 15:47:59 -08:00
91cb9750b4 Merge branch 'main' into fix/undefined-url 2023-02-17 00:33:55 +05:30
3e0d4cb70a fixed infinite loading in dashboard 2023-02-17 00:31:00 +05:30
dab677b360 Merge branch 'smoothen-integrations' of https://github.com/Infisical/infisical into smoothen-integrations 2023-02-17 01:18:31 +07:00
625c0785b5 Add validation to batch secret endpoint 2023-02-17 01:12:13 +07:00
540a8b4201 Fixed the bug with updating tags 2023-02-16 09:53:33 -08:00
11f86da1f6 Fixed the bug with updating tags 2023-02-16 09:35:57 -08:00
ab5ffa9ee6 Updated billing in the settings tab 2023-02-16 08:13:49 -08:00
65bec23292 Begin rewiring frontend to use batch route for CRUD secret ops 2023-02-16 22:43:53 +07:00
635ae941d7 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-15 23:56:33 -08:00
a9753fb784 Update the text in start guide 2023-02-15 23:56:16 -08:00
b587d9b35a Update README.md 2023-02-15 23:03:51 -08:00
aa68bc05d9 Updated 'are you sure?' message 2023-02-15 13:46:13 -08:00
66566a401f Updated the infisical guide 2023-02-15 13:23:14 -08:00
5aa75ecd3f feat: save selected env on url 2023-02-15 12:46:54 -03:00
0a77f9a0c8 fix(i18n): add missing forgot-password in pt-BR 2023-02-15 11:06:48 -03:00
b5d4cfed03 chore(docs): chart documentation generation 2023-02-15 09:43:17 +01:00
c57394bdab Merge remote-tracking branch 'origin' into mfa 2023-02-15 11:44:42 +07:00
da857f321b allow to set default env in project file 2023-02-14 19:57:27 -08:00
754ea09400 Add simple tries left for MFA 2023-02-15 10:28:23 +07:00
f28a2ea151 Small nits 2023-02-14 18:40:02 -08:00
c7dd028771 Update README.md 2023-02-14 18:34:41 -08:00
3c94bacda9 Update README.md 2023-02-14 18:31:45 -08:00
8e85847de3 improve example .env file format 2023-02-14 17:33:24 -08:00
0c10bbb569 send error message to standard error out 2023-02-14 13:43:40 -08:00
b710944630 Add more edge-cases to MFA 2023-02-15 01:29:40 +07:00
280f482fc8 Fix merge conflicts 2023-02-14 17:40:42 +07:00
e1ad8fbee8 Refactoring functions into services, helper functions, hooks, patch bugs 2023-02-14 17:38:58 +07:00
56ca6039ba fix(chart): backend service typos 2023-02-14 10:50:34 +01:00
fba54ae0c6 Add tags query to secrets api 2023-02-13 22:28:59 -08:00
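
Filtering secrets by tag on the API could look roughly like this with Mongoose (route shape, query parameter, and model names are assumptions):

```ts
import { Request, Response } from "express";
// Hypothetical Mongoose models, for illustration only.
import { Secret, Tag } from "../models";

// e.g. GET /api/v2/secrets?workspaceId=...&environment=dev&tagSlugs=backend,prod
export async function getSecrets(req: Request, res: Response) {
  const { workspaceId, environment, tagSlugs } = req.query as Record<string, string>;

  const query: Record<string, unknown> = { workspace: workspaceId, environment };

  if (tagSlugs) {
    const tags = await Tag.find({
      workspace: workspaceId,
      slug: { $in: tagSlugs.split(",") },
    });
    query.tags = { $in: tags.map((tag) => tag._id) };
  }

  const secrets = await Secret.find(query);
  return res.status(200).json({ secrets });
}
```
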
e243c72ca6 add tags flag to secrets related command 2023-02-13 22:28:30 -08:00
23ea6fd4f9 filter secrets by tags 2023-02-13 20:51:43 -08:00
3f9f2ef238 Merge pull request #329 from fervillarrealm/feature/52-save-changes-user-leaving-dashboard
feat(ui): save changes when user leaving dashboard
2023-02-13 19:47:10 -08:00
77cb20f5c7 Fixed a TS error 2023-02-13 19:44:18 -08:00
ddf630c269 Fixed a TS error 2023-02-13 19:00:43 -08:00
39adb9a0c2 Merge pull request #328 from akhilmhdh/feat/ui-improvements
feat(ui): add new button style, improved select ui and linted app layout
2023-02-13 17:44:41 -08:00
97fde96b7b Merge branch 'main' into feat/ui-improvements 2023-02-13 17:33:23 -08:00
190391e493 Fixed bugs with organizations and sidebars 2023-02-13 17:27:21 -08:00
d3fcb69c50 fix(chart): backend service missing configuration 2023-02-14 01:18:41 +01:00
2db4a29ad7 chore(docs): chart setup + chart release notes.txt 2023-02-14 01:18:41 +01:00
4df82a6ff1 feat(chart): mailhog for local development 2023-02-14 01:18:41 +01:00
cdf73043e1 fix(chart): mongodb custom users + docs 2023-02-14 01:18:41 +01:00
ca07d1c50e fix(chart): helpers template auth.password typo 2023-02-14 01:18:41 +01:00
868011479b feat(chart): mongodb persistence 2023-02-14 01:18:41 +01:00
6f6df3e63a Update approverSchema 2023-02-13 11:27:02 -08:00
23c740d225 setHasUnsavedChanges to false when user selects another env and they agree to not save changes 2023-02-13 11:07:22 -06:00
702d4de3b5 feature/52-save-changes-user-leaving-dashboard 2023-02-13 10:18:52 -06:00
445fa35ab5 Add Aashish to README 2023-02-13 18:55:09 +07:00
9868476965 Merge pull request #298 from Aashish-Upadhyay-101/circleci-integration-branch
Circleci integration branch
2023-02-13 18:03:22 +07:00
bfa6b955ca handleAuthorizedIntegrationOptionPress case circleci 2023-02-13 16:29:06 +05:45
13b1805d04 Checkpoint 2 2023-02-13 17:11:31 +07:00
c233fd8ed1 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-02-13 15:39:38 +05:45
90f5934440 projects displaying issue fixed using circleci v1.1 2023-02-13 15:32:37 +05:45
30b2b85446 Fix merge conflicts 2023-02-13 15:24:19 +07:00
0adc3d2027 create secret approval data model 2023-02-12 23:20:27 -08:00
e53fd110f6 Checkpoint weaving frontend and backend MFA 2023-02-13 12:29:36 +07:00
edf0294d51 Remove docker fine cli 2023-02-12 19:59:40 -08:00
8850b44115 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-12 17:54:55 -08:00
17f9e53779 Updated the dashboard, members, and settings pages 2023-02-12 17:54:22 -08:00
a61233d2ba Release docker images for cli 2023-02-12 14:22:59 -08:00
2022988e77 Only allow sign up when invited 2023-02-12 10:34:32 -08:00
409de81bd2 Allow sign up disable 2023-02-12 09:34:52 -08:00
2b289ddf77 feat(ui): add new button style, improved select ui and linted app layout 2023-02-12 17:31:22 +05:30
b066a55ead Show only secret keys if write only access 2023-02-11 23:41:51 -08:00
8dfc0138f5 circleci project name issue fixed 2023-02-12 09:41:34 +05:45
517f508e44 circleci Current Integrations section error fixed 2023-02-12 08:32:04 +05:45
2f1a671121 add workspace-memberships api 2023-02-11 15:16:33 -08:00
2fb4b261a8 Turn off auto delete and manually check TTL for token 2023-02-11 11:08:46 -08:00
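
Dropping the TTL index means MongoDB no longer reaps expired tokens automatically; instead the token's age is compared to its TTL when it is used. A sketch with assumed field and model names:

```ts
import { Schema, model } from "mongoose";

const tokenDataSchema = new Schema({
  tokenHash: String,
  ttl: Number, // lifetime in seconds (field name assumed)
  createdAt: { type: Date, default: Date.now },
});

// The auto-delete version would have relied on a TTL index, e.g.:
// tokenDataSchema.index({ createdAt: 1 }, { expireAfterSeconds: 3600 });

export const TokenData = model("TokenData", tokenDataSchema);

// Manual check performed wherever the token is verified.
export function isTokenExpired(token: { createdAt: Date; ttl: number }): boolean {
  const ageInSeconds = (Date.now() - token.createdAt.getTime()) / 1000;
  return ageInSeconds > token.ttl;
}
```
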
9c3c745fdf small changes 2023-02-11 18:10:58 +05:45
6a75147719 circleci-done 2023-02-11 17:57:01 +05:45
295b363d8a Merge remote-tracking branch 'refs/remotes/origin/main' into circleci-integration-branch 2023-02-11 17:55:59 +05:45
d96b5943b9 circleci integration create.jsx and authorize.jsx created 2023-02-11 17:08:03 +05:45
17406e413d Merge remote-tracking branch 'origin/2fa' into mfa 2023-02-11 16:08:13 +07:00
9b219f67b0 Update package.json 2023-02-11 16:07:32 +07:00
8fd2578a6d fixed the bug with no projects 2023-02-10 23:16:37 -08:00
cc809a6bc0 Merge pull request #315 from akhilmhdh/feat/new-layout
Feat new layout synced with api changes
2023-02-10 22:20:11 -08:00
66659c8fc8 Bug/typo/style fixes and some minor improvements 2023-02-10 22:17:39 -08:00
31293bbe06 Remove tags from secrets when tag is deleted 2023-02-10 19:33:40 -08:00
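
Removing a deleted tag from every secret that references it is a single $pull update; a minimal sketch (model names assumed):

```ts
import { Types } from "mongoose";
// Hypothetical Mongoose models, for illustration only.
import { Secret, Tag } from "../models";

export async function deleteTag(tagId: string) {
  const id = new Types.ObjectId(tagId);

  await Tag.deleteOne({ _id: id });

  // Strip the deleted tag's id from the tags array of any secret that held it.
  await Secret.updateMany({ tags: id }, { $pull: { tags: id } });
}
```
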
1c3488f8db add reset infisical docs 2023-02-10 17:41:31 -08:00
20e536cec0 Remove printing pathToDir 2023-02-10 17:25:01 -08:00
e8b498ca6d Minor style tweaks 2023-02-10 16:45:31 -08:00
b82f8606a8 add ValidateEnvironmentName method 2023-02-10 15:08:12 -08:00
ab27fbccf7 add reset command 2023-02-10 14:19:04 -08:00
d50de9366b Add docs for generate-example-env command 2023-02-10 12:29:47 -08:00
4c56bca4e7 Remove newline after heading in .sample-env 2023-02-10 12:24:29 -08:00
a60774a3f4 Merge pull request #327 from Infisical/parameter-store
Add support and docs for AWS parameter store and secret manager
2023-02-11 01:52:29 +07:00
03426ee7f2 Fix lint errors 2023-02-11 01:49:53 +07:00
428022d1a2 Add support and docs for AWS parameter store and secret manager 2023-02-11 01:40:18 +07:00
b5bcd0a308 feat(ui): updated merge conflicts in layout with new design 2023-02-10 22:09:22 +05:30
03c72ea00f feat(ui): added back layout change made for integrations page 2023-02-10 22:06:02 +05:30
a486390015 feat(ui): added new layout 2023-02-10 22:05:59 +05:30
8dc47110a0 feat(ui): added org context and user context 2023-02-10 22:03:57 +05:30
52a6fe64a7 feat(ui): new layout added queries 2023-02-10 22:03:57 +05:30
081ef94399 hard code site url frontend 2023-02-09 22:49:58 -08:00
eebde3ad12 Updated env variables and emails 2023-02-09 22:27:30 -08:00
669861d7a8 General frontend structure for 2FA - done 2023-02-09 15:49:47 -08:00
6ab6147ac8 Fixed service token bug 2023-02-09 13:40:33 -08:00
dd7e8d254b Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-09 18:24:23 +07:00
2765f7e488 Fix Vercel get apps response encoding 2023-02-09 18:24:10 +07:00
2d3a276dc2 Merge pull request #309 from RashidUjang/fix/issue-308-sidebar-issue
fix: handle duplicate edge case for sidebar loading
2023-02-08 23:50:41 -08:00
55eddee6ce Returned back @RashidUjang's change with secretIds 2023-02-08 23:48:25 -08:00
ab751d0db3 Merge branch 'main' into fix/issue-308-sidebar-issue 2023-02-08 23:42:46 -08:00
b2bd0ba340 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-08 23:38:25 -08:00
224fa25fdf Minor style fixes 2023-02-08 23:38:00 -08:00
e6539a5566 Merge remote-tracking branch 'refs/remotes/origin/main' into circleci-integration-branch 2023-02-09 13:16:43 +05:45
6115a311ad Merge pull request #317 from Infisical/gen-example-env-command
generate example .env file command
2023-02-08 18:51:23 -08:00
a685ac3e73 update regex to capture comment 2023-02-08 18:48:45 -08:00
9a22975732 When comments are empty, return empty byte 2023-02-08 17:29:35 -08:00
cd0b2e3a26 Change default secret comments 2023-02-08 14:36:56 -08:00
80a3c196ae Fixed errors with undefined tags 2023-02-08 14:32:57 -08:00
b0c541f8dc generate example .env file command 2023-02-08 13:46:57 -08:00
6188b04544 Switch azure integration off 2023-02-08 13:53:12 +07:00
8ba4f964d4 Switch Azure KV integration on 2023-02-08 13:42:49 +07:00
0d2caddb12 Merge pull request #292 from HasanMansoor4/auto-capitalization-toggle
Auto capitalization toggle for secrets
2023-02-07 21:55:05 -08:00
4570c35658 Merge pull request #313 from Infisical/debug-new-integrations
Fix more encoding issues with integrations
2023-02-08 12:38:49 +07:00
72f7d81b80 Fix more encoding issues with integrations 2023-02-08 12:38:15 +07:00
231fa61805 Merge branch 'main' into auto-capitalization-toggle 2023-02-07 21:32:29 -08:00
9f74affd3a Merge pull request #300 from kanhaiya38/feat/merge-env
feat(ui): allow user to merge secrets while uploading file
2023-02-07 21:29:38 -08:00
f58e1e1d6c Minor style changes 2023-02-07 21:27:21 -08:00
074cf695b2 Merge branch 'main' into feat/merge-env 2023-02-07 19:57:50 -08:00
07c056523f circle-ci integration done 2023-02-08 09:24:48 +05:45
65eb037020 Merge branch 'main' into auto-capitalization-toggle 2023-02-08 05:23:41 +03:00
c84add0a2a Merge pull request #312 from Infisical/secret-tagging
Added tags to secrets in the dashboard
2023-02-07 16:57:01 -08:00
ace0e9c56f Fixed the bug of wrong data structure 2023-02-07 16:54:13 -08:00
498705f330 Fixed the login error with tags 2023-02-07 16:47:05 -08:00
7892624709 Added tags to secrets in the dashboard 2023-02-07 16:29:15 -08:00
d8889beaf7 mark gitlab as complete 2023-02-07 12:58:39 -08:00
6e67304e92 Update wording of k8 2023-02-07 12:54:09 -08:00
8b23e89a64 add k8 diagram 2023-02-07 12:38:58 -08:00
7611b999fe Merge pull request #311 from Infisical/debug-new-integrations
Patch encoding header issue for some integrations for getting their apps
2023-02-08 01:30:02 +07:00
aba8feb985 Patch encoding header issue for some integrations for getting their apps 2023-02-08 01:28:46 +07:00
747cc1134c Merge pull request #310 from Infisical/refactor-integration-pages
Refactor integration pages into separate steps for authorization and integration creation.
2023-02-07 23:29:42 +07:00
db05412865 Fix incorrect imports, build errors 2023-02-07 23:27:21 +07:00
679b1d9c23 Move existing integration authorization and creation into separate steps 2023-02-07 23:10:31 +07:00
a37cf91702 fix: handle duplicate edge case for sidebar loading
This changes the SideBar's data prop to be filtered by id instead of key.

fixes issue #308
2023-02-07 21:35:13 +08:00
80d219c3e0 circle-ci integration in progress 2023-02-07 13:20:39 +05:45
5ea5887146 Begin refactoring all integrations to separate integration pages by step 2023-02-07 11:48:17 +07:00
13838861fb Merge pull request #305 from Infisical/azure
Finish v1 Azure Key Vault integration
2023-02-06 18:15:57 +07:00
09c60322db Merge branch 'main' into azure 2023-02-06 18:15:44 +07:00
68bf0b9efe Finish v1 Azure Key Vault integration 2023-02-06 17:57:47 +07:00
3ec68daf2e Merge branch 'main' into auto-capitalization-toggle 2023-02-06 11:17:08 +03:00
9fafe02e16 Merge branch 'main' into feat/merge-env 2023-02-05 23:16:19 -08:00
56da34d343 Merge pull request #304 from Infisical/secret-tagging
Revamped the dashboard look
2023-02-05 20:36:49 -08:00
086dd621b5 Revamped the dashboard look 2023-02-05 20:29:27 -08:00
56a14925da Add gitlab to integ overview 2023-02-05 19:23:52 -08:00
c13cb23942 Add gitlab integ docs 2023-02-05 19:21:07 -08:00
31df4a26fa Update cli docs to be more clear and consistent 2023-02-05 16:05:34 -08:00
9f9273bb02 Add tags support for secrets 2023-02-05 12:54:42 -08:00
86fd876850 change api from post to patch, fix spelling mistakes 2023-02-05 20:51:53 +03:00
b56d9287e4 feat(ui): allow user to merge secrets while uploading file 2023-02-05 18:07:54 +05:30
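
Merging an uploaded .env file into the secrets already on screen, instead of replacing them outright, can be as small as a keyed object merge; a sketch (which side wins on conflict is an assumption):

```ts
type SecretMap = Record<string, string>;

// Merge uploaded key/value pairs into the existing set. With this spread order
// the existing values win on key conflicts; flip the order to let the upload win.
export function mergeUploadedSecrets(existing: SecretMap, uploaded: SecretMap): SecretMap {
  return { ...uploaded, ...existing };
}
```
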
a35e235744 remove console log 2023-02-05 06:25:40 +03:00
77a44b4490 Refactor into component and use React Query 2023-02-05 06:21:58 +03:00
594f846943 Merge remote-tracking branch 'origin/main' into auto-capitalization-toggle 2023-02-05 03:19:06 +03:00
8ae43cdcf6 Merge pull request #296 from akhilmhdh/fix/ws-redirect
feat(ui): removed workspace context redirect and added redirect when ws is deleted
2023-02-04 10:50:23 -08:00
1d72d310e5 Add offline support to faq 2023-02-04 08:48:01 -08:00
b0ffac2f00 fetch apps from circleci 2023-02-04 16:50:34 +05:45
5ba851adff circleci-integration-setup 2023-02-04 15:28:04 +05:45
e72e6cf2b7 feat(ui): removed workspace context redirect and added redirect when project is deleted 2023-02-04 14:24:10 +05:30
0ac40acc40 Merge pull request #295 from mocherfaoui/inf-compare-secrets
add new modal to compare secrets across environments
2023-02-03 23:55:17 -08:00
56710657bd Minor styling updates 2023-02-03 23:49:03 -08:00
92f4979715 Merge branch 'main' into inf-compare-secrets 2023-02-03 21:24:24 -08:00
1e9118df33 delete backup secrets when new user login 2023-02-03 21:14:56 -08:00
e16c0e53ff Add offline secrets fetch feature 2023-02-03 21:02:36 -08:00
0d57a26925 Add token flag to export command 2023-02-03 21:02:36 -08:00
1bd180596e Merge pull request #294 from akhilmhdh/feat/new-settings-page
New Project Settings Page
2023-02-03 20:11:30 -08:00
fca003dfd7 Minor typos fixed and style changes 2023-02-03 20:09:28 -08:00
f1ef23874c Add token flag to read secrets via service token 2023-02-03 16:55:40 -08:00
16883cf168 make some params optional 2023-02-03 22:34:18 +01:00
1781b71399 add new modal to compare secrets across environments 2023-02-03 22:33:39 +01:00
fb62fa4d32 feat(ui): updated select design due to rebase changes 2023-02-03 22:59:01 +05:30
ed148a542d feat(ui): implemented the new project settings page 2023-02-03 22:22:51 +05:30
a4f7843727 feat(ui): global workspace and subscription context 2023-02-03 22:21:14 +05:30
48cd84ce77 feat(ui): fine tuning components library with existing app design 2023-02-03 22:21:09 +05:30
3859a7e09b feat(ui): added new react-query hooks for settings page 2023-02-03 22:20:04 +05:30
76d0127029 Add docs for PM2 integration, update Docker/Docker-Compose integration docs 2023-02-03 15:50:26 +07:00
a94cd8c85c Merge pull request #293 from Infisical/ip-address
Ip address
2023-02-03 12:49:32 +07:00
ee555f3f15 Rename loginSRPDetail file 2023-02-03 12:46:14 +07:00
bd230a8b7d Remove comment from loginSRPDetail 2023-02-03 12:40:35 +07:00
a4926d8833 Add back requestIp middleware 2023-02-03 12:32:54 +07:00
7560d2f673 Merge remote-tracking branch 'origin' into ip-address 2023-02-03 11:02:07 +07:00
44b2bc1795 modify method to check for cli updates 2023-02-02 12:58:05 -08:00
3ccc6e5d5c Merge pull request #280 from Neeraj138/faster-redirect-from-login
login.tsx: Faster redirect from login to dashboard.
2023-02-02 08:46:04 -08:00
ccb579ecfd Merge pull request #123 from Infisical/snyk-upgrade-168622761b1452230387c1e39953ec92
[Snyk] Upgrade @sentry/node from 7.19.0 to 7.21.1
2023-02-02 08:34:01 -08:00
29f5e8aa78 Merge branch 'main' into snyk-upgrade-168622761b1452230387c1e39953ec92 2023-02-02 08:31:38 -08:00
d64357af61 Merge tag 'main' into snyk-upgrade-168622761b1452230387c1e39953ec92 2023-02-02 08:17:27 -08:00
37c91ae652 Merge pull request #122 from Infisical/snyk-upgrade-b8de592fd7591ed26eb63611e9e90c65
[Snyk] Upgrade @sentry/tracing from 7.19.0 to 7.21.1
2023-02-02 08:16:49 -08:00
3a4cfa0834 Merge branch 'main' into snyk-upgrade-b8de592fd7591ed26eb63611e9e90c65 2023-02-02 08:14:54 -08:00
cef45c2155 Merge tag 'main' into snyk-upgrade-b8de592fd7591ed26eb63611e9e90c65 2023-02-02 08:10:42 -08:00
5143fc6eee Merge pull request #126 from Infisical/snyk-upgrade-69b188452db2966945d5ae119d7209d2
[Snyk] Upgrade mongoose from 6.7.2 to 6.7.3
2023-02-02 08:04:40 -08:00
186382619c Merge branch 'main' into snyk-upgrade-69b188452db2966945d5ae119d7209d2 2023-02-02 08:02:22 -08:00
91e70c5476 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-02 07:59:11 -08:00
216ace9f61 Updated readme and its translations; added contributors 2023-02-02 07:59:02 -08:00
6b99582a66 Merge pull request #124 from Infisical/snyk-upgrade-356fdb4c3069d260010f638026680c3c
[Snyk] Upgrade axios from 1.1.3 to 1.2.0
2023-02-02 07:56:11 -08:00
ea0fe1b92e Merge branch 'main' into snyk-upgrade-356fdb4c3069d260010f638026680c3c 2023-02-02 07:53:26 -08:00
72810acf2e Merge pull request #284 from KunalSin9h/fix-pdf-login-url
fix site url on pdf to be .env/SITE_URL & typo in website titles & Wrong Copyright message
2023-02-02 07:46:56 -08:00
a013768313 fix copyright label in go source 2023-02-02 19:45:24 +05:30
a660261678 fix typo in Title -> 2023-02-02 19:25:36 +05:30
7d181f334c fix site url on pdf to be .env/SITE_URL 2023-02-02 18:49:58 +05:30
46ab27af1a Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-01 22:29:49 -08:00
25bb966a32 Added ability to change a role in an organization 2023-02-01 22:29:33 -08:00
c086579260 Merge pull request #281 from jon4hz/main
fix homebrew
2023-02-01 21:44:22 -08:00
3d14bc9a00 remove env name check 2023-02-01 20:31:25 -08:00
75cd7a0f15 integrate frontend with backend for auto capitalization setting 2023-02-02 05:30:22 +03:00
4722bb8fcd add auto capitalization api controllers and routes with mongo schema updated 2023-02-02 05:27:07 +03:00
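
A toggle like this usually spans a schema field, a controller, and a route; a compressed sketch with assumed names (the PATCH verb matches the "change api from post to patch" commit elsewhere in this range):

```ts
import express, { Request, Response } from "express";
import { Schema, model } from "mongoose";

// 1. Schema: a boolean flag on the workspace (field name assumed).
const workspaceSchema = new Schema({
  name: String,
  autoCapitalization: { type: Boolean, default: true },
});
const Workspace = model("Workspace", workspaceSchema);

// 2. Controller: persist the toggle.
async function toggleAutoCapitalization(req: Request, res: Response) {
  const { workspaceId } = req.params;
  const { autoCapitalization } = req.body;

  const workspace = await Workspace.findByIdAndUpdate(
    workspaceId,
    { autoCapitalization },
    { new: true }
  );
  return res.status(200).json({ workspace });
}

// 3. Route: exposed as a PATCH endpoint.
export const router = express.Router();
router.patch("/:workspaceId/auto-capitalization", toggleAutoCapitalization);
```
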
f2175b948c Merge pull request #282 from nirga/main
chore: fix typo in quick start guide
2023-02-01 13:17:54 -08:00
6f3d102ecb chore: fix typo in quick start guide 2023-02-01 23:15:39 +02:00
54fa39f347 Fixed issues with breadcrumbs and redirects of forgot password 2023-02-01 12:22:41 -08:00
52697dea97 login.tsx: Faster redirect from login to dashboard. 2023-02-01 19:02:37 +05:30
c99b207e9e ci: maybe fix brew 2023-02-01 14:28:12 +01:00
4886537a56 Revert "Revert "Merge pull request #279 from jon4hz/main""
This reverts commit 1878bed10a1e06a8340d8b02385a8d0081394d61.
2023-02-01 14:19:49 +01:00
ca688764a3 Add login/logout logs 2023-02-01 11:56:38 +07:00
71cf54c28b add auto cli version to all-other-builds 2023-01-31 20:19:42 -08:00
1878bed10a Revert "Merge pull request #279 from jon4hz/main"
This reverts commit 87fd5e33f11a354a622990fb58d185d8094f29c6, reversing
changes made to 2c4e066f6421c461e28129fedc14fb6fb6b2b1b9.
2023-01-31 20:17:15 -08:00
87fd5e33f1 Merge pull request #279 from jon4hz/main
CI Improvements
2023-01-31 19:39:46 -08:00
ffda30bd65 ci: mark goreleaser snapshots as such 2023-02-01 03:54:22 +01:00
716795532e ci: bump goreleaser action 2023-02-01 03:38:18 +01:00
f9ff99748b ci: remove obsolete var 2023-02-01 03:37:07 +01:00
723fa153be ci: completion and manpages for homebrew 2023-02-01 03:36:26 +01:00
1871d1a842 fix: improve goreleaser 2023-02-01 03:35:54 +01:00
2c4e066f64 bring back auto cli version in CI 2023-01-31 17:34:50 -08:00
b371dad506 Increase cli version 2023-01-31 17:22:44 -08:00
a6d4431940 Auto add cli version from tag 2023-01-31 17:03:19 -08:00
871d80aad5 when login expired, do not ask to override login 2023-01-31 16:37:56 -08:00
6711979445 Disallow service token creation based on permission 2023-01-31 09:24:55 -08:00
cb080b356c increase cli version 2023-01-30 22:17:02 -08:00
9950c5e02d empty commit 2023-01-30 22:15:44 -08:00
22a11be4e0 Update host rules for permissioning 2023-01-30 21:38:09 -08:00
6e01c80282 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-30 21:14:41 -08:00
4e14f84df9 Allow editing personal permissions 2023-01-30 21:14:22 -08:00
55522404b4 Merge pull request #275 from Infisical/dependabot/npm_and_yarn/backend/cookiejar-2.1.4
Bump cookiejar from 2.1.3 to 2.1.4 in /backend
2023-01-30 20:37:44 -08:00
4ef8c273f7 Wired access controls for environments to frontend 2023-01-30 20:36:04 -08:00
61c17ccc5e update getAllAccessibleEnvironmentsOfWorkspace controller 2023-01-30 19:39:45 -08:00
2832476c2b Add write permission status 2023-01-30 19:38:40 -08:00
c0fc74b62a Add write permission status 2023-01-30 19:22:52 -08:00
bb752863fa Fix merge conflicts 2023-01-30 19:48:12 +07:00
54caaffe3a Bump cookiejar from 2.1.3 to 2.1.4 in /backend
Bumps [cookiejar](https://github.com/bmeck/node-cookiejar) from 2.1.3 to 2.1.4.
- [Release notes](https://github.com/bmeck/node-cookiejar/releases)
- [Commits](https://github.com/bmeck/node-cookiejar/commits)

---
updated-dependencies:
- dependency-name: cookiejar
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-01-30 12:41:45 +00:00
cf5603c8e3 Finish preliminary backwards-compatible transition from user encryption scheme v1 to v2 with argon2 and protected key 2023-01-30 19:38:13 +07:00
77b1011207 feat(docs): added a pull request template 2023-01-30 11:54:52 +01:00
55f0a491cb Release fly.io integration 2023-01-29 22:38:20 -08:00
a940fa210a Add deny api/get envs api 2023-01-29 21:12:41 -08:00
5162ba9b91 add basic auth model for Organization 2023-01-29 21:12:41 -08:00
3b6022de64 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-29 15:55:22 -08:00
bf743f5f72 Make the loading animation smaller 2023-01-29 15:55:01 -08:00
3e177539d5 Remove state from password controllers 2023-01-29 15:48:42 -08:00
5743dd3a8c Merge pull request #272 from Neeraj138/subscription-check
Add check for subscriptions call before setting the current plan
2023-01-29 09:25:04 -08:00
9f8ad95a59 Revert "correct tags in docker image workflow"
This reverts commit 3ef2ac8a77b50c1fbac1fa2173acccbf1736a011.
2023-01-29 09:17:22 -08:00
3c05a4cebd Add check for subscriptions call before setting the current plan 2023-01-29 14:16:05 +05:30
bc955a9afd increase cli version 2023-01-28 22:32:23 -08:00
ec8d86e662 Merge pull request #256 from akhilmhdh/feat/react-query
feat(ui): added new auth guard with react-query and axios
2023-01-29 12:21:08 +07:00
bc70bedb78 Fixed the bug with empty variables 2023-01-28 20:41:54 -08:00
7a4b77ce59 Update README.md 2023-01-28 14:31:29 -08:00
8600cee54c Merge pull request #265 from sanyamjain04/tailwind-plugin
added prettier-plugin-tailwindcss
2023-01-28 14:14:51 -08:00
fe9573ea3c Merge pull request #264 from asheliahut/patch-1
Include Id on project
2023-01-28 14:11:32 -08:00
61db6c54c2 Merge pull request #269 from kimcore/main
Skip update check if github returns non-200
2023-01-28 14:08:35 -08:00
65093c73c5 Merge pull request #257 from mocherfaoui/inf-nsc-pt
New secrets are now added to the top in the dashboard UI
2023-01-28 13:54:24 -08:00
9986521e41 Merge pull request #270 from kimcore/readme-ko
Translate README.md to korean
2023-01-28 13:35:13 -08:00
655f015109 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-28 12:53:51 -08:00
3cea59ce5d Improved docs SEO 2023-01-28 12:53:44 -08:00
a184192452 Inform k8 self host about latest tags 2023-01-28 12:34:04 -08:00
2dbcab32d5 update gamma pull image policy 2023-01-28 12:03:53 -08:00
13aeeb4731 console.log in posthog 2023-01-28 11:22:16 -08:00
233a468127 Revert "add console.log for post"
This reverts commit dd960aa5f045f62a556e67f81bf172372401a465.
2023-01-28 11:22:16 -08:00
8a9e05b08f Revert "add test comment for docker build issue"
This reverts commit fdac590a023433113ae21295dbe1abf165fb5500.
2023-01-28 11:22:16 -08:00
3ef2ac8a77 correct tags in docker image workflow 2023-01-28 11:14:00 -08:00
fdac590a02 add test comment for docker build issue 2023-01-28 10:25:42 -08:00
dd960aa5f0 add console.log for post 2023-01-28 10:04:34 -08:00
0bd9a848c4 add back depot 2023-01-28 09:53:06 -08:00
1b86c58f91 remove depot from docker build 2023-01-28 09:24:58 -08:00
d5166d343d Remove depot docker 2023-01-28 09:17:54 -08:00
b315cf6022 Translate README.md to korean 2023-01-29 00:20:10 +09:00
37de32ec90 return proper error 2023-01-28 23:13:36 +09:00
6eb81802c3 Skip update check if github returns non-200 2023-01-28 23:06:37 +09:00
e6068a6f7f Merge pull request #247 from samsbg/main
Adding Spanish translation to the README 🌎ES
2023-01-27 18:39:35 -08:00
c059c088d1 update k8 selfhost docs values.yaml file 2023-01-27 12:42:47 -08:00
b530847edc increase chart version 2023-01-27 12:34:46 -08:00
c87c2dadd7 add readinessProbe check for pods 2023-01-27 12:31:46 -08:00
7b1ff04436 add deployment annotations 2023-01-27 10:45:42 -08:00
83aa440b62 Remove mongo url from envs 2023-01-27 10:43:23 -08:00
a555ef836b remove default sensitive keys 2023-01-27 09:33:49 -08:00
528601e442 Merge pull request #266 from Infisical/patch-empty-values
Allow empty values for secrets
2023-01-27 22:16:24 +07:00
13acb19e9f Allow empty values for secrets 2023-01-27 22:07:56 +07:00
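
Allowing empty values generally means the validator has to distinguish "empty string" from "missing"; a one-function sketch of that distinction:

```ts
// A secret's value may legitimately be "", so only reject payloads where the
// field is absent or not a string at all.
export function isValidSecretValue(value: unknown): value is string {
  return typeof value === "string";
}
```
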
079063157f added prettier-plugin-tailwindcss 2023-01-27 12:16:23 +05:30
e38933c0b3 Include Id on project
The project should have its id exposed.
2023-01-26 19:32:44 -08:00
d09b406c4e Merge pull request #262 from kmlgkcy/turkish-translation
translation: Turkish
2023-01-26 16:46:00 -08:00
a5eba8e722 Updated the billing engine for Cloud 2023-01-26 16:32:51 -08:00
7acb4cc22a fix helm deploymentAnnotations 2023-01-26 14:43:53 -08:00
b95ab6c6a1 added deploymentAnnotations to helm chart 2023-01-26 14:17:51 -08:00
038445e13e change from cal.com to calendly 2023-01-26 11:30:36 -08:00
07e9dd5a39 add managed secrets to deployment in gamma 2023-01-26 00:39:08 -08:00
6ec520d358 update helm values for k8 self host 2023-01-26 00:30:54 -08:00
06bfd2429b Update gamma helm chart with auto reload 2023-01-26 00:17:07 -08:00
099c4836e6 update helm charts to be more flexible 2023-01-26 00:14:07 -08:00
ddf8ceb45d translation: Turkish 2023-01-26 10:46:47 +03:00
8a49e0817a add error to failed org creation 2023-01-25 21:58:53 -08:00
88908297f5 add error object to log 2023-01-25 21:30:32 -08:00
cf0e111c09 increase replica count for gamma 2023-01-25 20:12:41 -08:00
ae0ee727fa Make backend login stateless 2023-01-25 20:09:57 -08:00
be2945c445 Merge pull request #259 from Infisical/stripe-adjustment
Update backend envars types and add STRIPE_PRODUCT_TEAM envar
2023-01-26 10:22:54 +07:00
237a10da1e Update backend envars types and add STRIPE_PRODUCT_TEAM envar 2023-01-26 10:20:42 +07:00
1baf14084d new secrets are added to the top 2023-01-25 19:55:48 +01:00
a6387e7552 feat(ui): added new auth guard with react-query and axios 2023-01-26 00:14:01 +05:30
a6f480d3f8 increase CLI 2023-01-24 19:59:45 -08:00
0413059fbe patch executeMultipleCommandWithEnvs when no /bin/zsh 2023-01-24 19:59:45 -08:00
65f049f6ac Merge pull request #254 from franky47/patch-1
docs: Fix typo in encryption overview
2023-01-24 19:51:10 -08:00
62f886a3b3 docs: Fix typo in encryption overview 2023-01-25 04:31:04 +01:00
271ca148e3 Make support link clickable 2023-01-24 11:01:49 -08:00
8aa294309f remove icon from support link 2023-01-24 10:53:46 -08:00
ca3233110b add support link for 1 on 1 in docs 2023-01-24 10:52:09 -08:00
1e4f6a4b9d increase CLI 2023-01-24 00:10:42 -08:00
a73fc6de19 add cli version check before every command 2023-01-24 00:08:42 -08:00
0bb750488b fix typo in docs 2023-01-23 23:00:24 -08:00
32f98f83c5 update nav name for development instructions 2023-01-23 22:56:41 -08:00
6943785ce5 remove stay alive 2023-01-23 22:55:14 -08:00
86558a8221 simplify contribution docs 2023-01-23 22:55:14 -08:00
f2c35a302d Merge pull request #239 from akhilmhdh/feat/component-update-1
Infisical component library foundations
2023-01-23 21:47:28 -08:00
0794b6132a auto create user for dev mode 2023-01-23 20:57:59 -08:00
062c287e75 feat(ui): changed to interfonts 2023-01-23 22:10:51 +05:30
e67d68a7b9 feat(ui): implemented basic table component 2023-01-23 22:10:51 +05:30
054acc689a feat(ui): implemented dropdown component 2023-01-23 22:10:51 +05:30
9b95d18b85 feat(ui): implemented switch component 2023-01-23 22:10:51 +05:30
7f9bc77253 feat(ui): added checkbox component 2023-01-23 22:10:51 +05:30
b92907aca6 feat(ui): added textarea component 2023-01-23 22:10:51 +05:30
c4ee03c73b feat(ui): added menu component 2023-01-23 22:10:51 +05:30
89ba80740b feat: added card and modal component 2023-01-23 22:10:50 +05:30
606a5e5317 feat(ui): added card component 2023-01-23 22:10:50 +05:30
f859bf528e feat(ui): added icon button component, updated secondary button style and added select to barrel export 2023-01-23 22:10:50 +05:30
ad504fa84e feat(ui): added select, spinner components 2023-01-23 22:10:50 +05:30
e7ac74c5a0 feat(ui): implemented form control components 2023-01-23 22:10:50 +05:30
b80504ae00 feat(ui): implemented new input component 2023-01-23 22:10:50 +05:30
68f1887d66 feat(ui): implemented button component 2023-01-23 22:10:50 +05:30
5cadb9e2f9 Finish MFA v1 and refactor all tokens into separate TokenService with modified collection 2023-01-23 22:10:15 +07:00
201c8352e3 fix typo in k8 self host 2023-01-22 16:50:45 -08:00
a0f0ffe566 improve i-dev command 2023-01-22 14:03:09 -08:00
4b4e8e2bfc Update docker integration 2023-01-22 12:50:06 -08:00
4db4c172c1 Fix the start guide redirect issue 2023-01-22 00:35:36 -08:00
08c54a910f Adding Spanish translation to the README 2023-01-22 01:42:25 -06:00
00fee63ff3 Merge pull request #246 from Infisical/more-integrations
Adjust integration bot authorization sequence
2023-01-22 11:17:15 +07:00
6b80cd6590 Modify case where integration bot was authorized but user didn't finish inputting their PAT -> should result in not sharing keys with bot 2023-01-22 11:12:47 +07:00
840efbdc2f Update API_URL to INFISICAL_API_URL 2023-01-21 13:14:39 -08:00
b91dc9e43e increase cli version 2023-01-21 13:04:14 -08:00
7470cd7af5 Merge pull request #242 from asheliahut/add-domain-env
Allow INFISICAL_URL to use domain for self-hosted
2023-01-21 12:56:24 -08:00
d3a6977938 update docs for change cli api 2023-01-21 12:53:18 -08:00
7cc341ea40 update INFISICAL_DEFAULT_API_URL constant name 2023-01-21 12:26:26 -08:00
5297133c07 Set INFISICAL_URL to nothing 2023-01-21 12:25:56 -08:00
7a6230f2f8 Change INFISICAL_URL to API_URL 2023-01-21 12:24:24 -08:00
ffe66a3b8e Merge pull request #244 from caioluis/main
docs(README): fix typo - Portueguese should be Portuguese
2023-01-21 12:10:23 -08:00
936cd51f29 Update README.md 2023-01-21 11:13:38 -08:00
0c24671d8b Merge pull request #245 from Infisical/more-integrations
Render & Fly.io integrations, reduce page reloads for integrations page.
2023-01-22 00:20:09 +07:00
6969593b38 Fix Mintlify mint.json issue and list Render, Fly.io integrations as available 2023-01-22 00:18:00 +07:00
0c351c0925 Add Render, Fly.io integrations and reduce integrations page reloads 2023-01-22 00:09:37 +07:00
656c408034 docs(README): fix typo - Portueguese should be Portuguese 2023-01-21 15:43:38 +00:00
74fb64bbb9 fix edge case of input same as default clobber env 2023-01-21 01:05:25 -08:00
3af85f9fba fix typo of company name 2023-01-21 00:26:52 -08:00
3c282460b2 Allow INFISICAL_URL to use domain for self-hosted 2023-01-21 00:18:43 -08:00
68b7e6e5ab Merge pull request #241 from alexdanilowicz/patch-1
docs(README): nit typo - Frence should say French
2023-01-20 18:10:42 -08:00
9594157f3e docs(README): nit typo - Frence to French
Update the translations section to say French instead of Frence.
2023-01-20 17:49:43 -08:00
b6ed6ad61e increase cli version 2023-01-19 17:10:03 -08:00
3fc68ffc50 patch secret override for run/export command 2023-01-19 17:05:18 -08:00
0613e1115d Update k8 self host docs 2023-01-19 15:38:06 -08:00
6567c3bddf Fixed the bug with redirect during signup invite 2023-01-18 13:21:30 -08:00
b7115d8862 Merge pull request #238 from akhilmhdh/feat/storybook
feat(frontend): added storybook with tailwind integration
2023-01-18 13:11:42 -08:00
83899bebc8 feat(frontend): added storybook with tailwind integration
chore(frontend): added some required radix components
2023-01-18 23:45:54 +05:30
06803519e6 Reduce page reloads for integrations page 2023-01-18 17:38:52 +07:00
3a6b2084bc Patch GitHub integration for organization repos by including correct owner 2023-01-18 16:33:24 +07:00
2235069e78 Merge pull request #183 from jon4hz/helm
Helm updates
2023-01-17 22:57:25 -08:00
15698c5036 Increase chart version 2023-01-17 22:44:34 -08:00
6ac8e057b0 set frontend env to empty {} 2023-01-17 22:38:56 -08:00
375412b45d Allow mongo connection string based on type 2023-01-17 22:34:19 -08:00
e47530dc71 Allowed upper case for environment names 2023-01-17 22:00:52 -08:00
93150199a4 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-18 10:54:07 +07:00
900f69f336 Uncomment GitHub/Netlify integrations 2023-01-18 10:53:57 +07:00
c556820646 Merge pull request #235 from akhilmhdh/chore/eslint-frontend
airbnb eslint and sorting
2023-01-17 19:05:38 -08:00
18fbe82535 Fixed minor bugs during code cleaning 2023-01-17 19:03:43 -08:00
7ae73d1b62 chore(frontend): added rule to separate out @app imports and linted 2023-01-17 21:06:14 +05:30
cf7834bfc3 chore(frontend): fixed all eslint errors 2023-01-17 21:06:14 +05:30
9f82e2d836 correct host api for k8 2023-01-16 18:01:20 -08:00
f20af1f5f8 Improve k8 docs and add docs for auto redeploy 2023-01-16 17:29:02 -08:00
8343f8ea0d Update k8 helm chart version 2023-01-16 16:55:24 -08:00
74c0dcd1f5 Remove the use of error channel from go routine 2023-01-16 16:52:59 -08:00
40696e4095 increase CLI version 2023-01-16 15:11:49 -08:00
614a2558f5 Patch IP recognition 2023-01-17 00:50:35 +07:00
56aec216c1 Adjust app-wide API limit 2023-01-17 00:10:11 +07:00
b359fb5f3b Adjust app-wide rate limiter 2023-01-16 23:48:03 +07:00
1fbbbab602 Allow new channel types 2023-01-16 08:45:21 -08:00
89697df85e Increase rate limits for API 2023-01-16 22:09:58 +07:00
37ee8148c6 revert default api domain 2023-01-15 23:57:13 -08:00
9e55102816 Switch to v2/secrets CRUD api for cli 2023-01-15 23:56:06 -08:00
b8fa5e8a89 add method to get channel from user agent 2023-01-15 23:55:13 -08:00
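
Deriving a channel from the User-Agent header is a plain string check; a sketch with assumed channel names and matching rules:

```ts
export type Channel = "web" | "cli" | "k8-operator" | "other";

export function getChannelFromUserAgent(userAgent?: string): Channel {
  if (!userAgent) return "other";
  if (userAgent === "cli") return "cli";
  if (userAgent === "k8-operator") return "k8-operator";
  if (userAgent.toLowerCase().includes("mozilla")) return "web";
  return "other";
}
```
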
3ba636f300 switch k8-operator to secrets v2api 2023-01-15 23:12:11 -08:00
da3742f600 set userid and email based on presence of service token/jwt 2023-01-15 22:03:14 -08:00
35f4d27ab0 Populate service token user 2023-01-15 21:40:19 -08:00
cf123d1887 service token create - change env to slug 2023-01-15 21:40:19 -08:00
b3816bd828 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-16 10:53:16 +07:00
7c7c9dea40 Add Infisical API to README 2023-01-16 10:53:07 +07:00
eabe406ab0 Merge pull request #229 from Infisical/expand-open-api
Fill in more example values for OpenAPI schema
2023-01-16 10:14:50 +07:00
2ae617cda6 Fill in more example values for OpenAPI schema 2023-01-16 10:13:56 +07:00
1b16066335 Merge pull request #228 from Infisical/expand-open-api
Add images to API reference authentication for getting API keys and n…
2023-01-16 10:02:08 +07:00
da251d3d2d Add images to API reference authentication for getting API keys and notes on crypto 2023-01-16 09:58:48 +07:00
818efe61f4 Merge pull request #227 from Infisical/k8-new-service-token-and-auto-redeploy
add auto redeploy, new secrets api, and new service token
2023-01-15 17:03:11 -08:00
9f08b04c92 update secrets-operator helm chart 2023-01-15 17:01:31 -08:00
41d17c930a update kubectl install configs 2023-01-15 17:00:06 -08:00
63f22c554a add auto redeploy, new secrets api, and new service token 2023-01-15 16:47:09 -08:00
cba57cf317 Updated readme.md 2023-01-15 16:44:44 -08:00
9a28e5b4bc Added auto-redirect to the no-projects page 2023-01-15 16:06:44 -08:00
a2689002d3 Merge pull request #226 from akhilmhdh/chore/move-to-src
chore(frontend): changed source code to src folder from root
2023-01-15 14:58:17 -08:00
e7a9b83877 Merge branch 'main' into chore/move-to-src 2023-01-15 14:37:04 -08:00
813db9dbbc Added volumes and deleted logs 2023-01-15 14:25:29 -08:00
72d52c9941 Fixing merge conflicts for the folder structure 2023-01-15 13:42:17 -08:00
4c2b9d4703 Solving merge conflicts 2023-01-15 13:40:03 -08:00
b1f7505f30 Fixed the redirect bug with deleting a certain workspace 2023-01-15 13:31:57 -08:00
63e9d83ba4 chore(frontend): changed source code to src folder from root 2023-01-16 00:34:22 +05:30
1534a47adc Fixed the redirect bug with adding a new workspace 2023-01-15 10:45:41 -08:00
c563548a1c Merge pull request #224 from akhilmhdh/feat/migration-ts-v2
feat(frontend): migrated to ts completed.
2023-01-15 09:57:15 -08:00
a633a3534d Merge pull request #225 from Infisical/expand-open-api
Add new organization endpoints to API reference
2023-01-16 00:04:08 +07:00
992357cbc4 Add new organization endpoints to API reference 2023-01-16 00:00:04 +07:00
ffc3562709 feat(frontend): migrated to ts completed. 2023-01-15 21:34:54 +05:30
f19db530b1 Merge pull request #223 from Infisical/api-keys
API Keys V1
2023-01-15 14:49:21 +07:00
061a9c8583 Fix build errors 2023-01-15 14:47:23 +07:00
b8fbc36b2d Fix faulty import 2023-01-15 14:40:45 +07:00
e364faaffd Complete v1 API Key 2023-01-15 14:34:16 +07:00
b3246778f2 Merge remote-tracking branch 'origin' into api-keys 2023-01-15 14:26:27 +07:00
74b76eda7e Complete v1 API Key 2023-01-15 14:25:23 +07:00
564367d5fd Merge pull request #222 from akhilmhdh/feat/migration-ts
migration(frontend): migrated frontend files to ts except dialog component
2023-01-14 16:00:53 -08:00
fd2966610c fix: typo 2023-01-14 19:58:08 +01:00
c23b291f25 fix: mongodb connection 2023-01-14 19:51:49 +01:00
67365e5480 migration(frontend): migrated frontend files except some components to ts 2023-01-15 00:13:25 +05:30
4df205dea6 Fix README 2023-01-14 21:43:36 +07:00
32928bf45c Fix merge conflicts 2023-01-14 21:27:55 +07:00
ea98f9be3c Merge pull request #221 from Infisical/api-docs
API Reference Docs v1
2023-01-14 21:09:00 +07:00
5085376f11 Check v2 workspace membership routes (currently not routed) 2023-01-14 21:07:17 +07:00
e2b4adb2e9 Complete v1 API reference docs, pre-launch 2023-01-14 19:06:43 +07:00
315810bd74 Complete v1 API reference docs, pre-launch 2023-01-14 19:02:12 +07:00
7e9ba3b6e2 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-13 22:59:54 -08:00
08dd5174b3 Making the dashboard less clunky 2023-01-13 22:59:43 -08:00
e552be0a81 add deployment error check for gamma 2023-01-13 20:42:23 -08:00
3cd9241aee increase cli version 2023-01-13 19:52:29 -08:00
9ca544f680 Merge pull request #219 from imakecodes/feature/adding-export-dotenv-export-format
feat(CLI): adding new export format (dotenv-export)
2023-01-13 19:50:10 -08:00
98d84b6717 add FormatDotEnvExport to list of available formats 2023-01-13 19:48:40 -08:00
b63360813a Continue API reference development 2023-01-14 09:48:13 +07:00
5d8c4ad03f Changed the formulation to secrets and configs 2023-01-13 17:32:36 -08:00
3e6206951e updated Readme with new contributors 2023-01-13 17:19:22 -08:00
3bc7f2aa7c Merge pull request #199 from Gabriellopes232/language-support-ptbr
Language Support pt-BR
2023-01-13 17:03:43 -08:00
72b8dbda15 Merge branch 'main' into language-support-ptbr 2023-01-13 16:50:19 -08:00
439e86d763 Merge pull request #186 from akhilmhdh/feat/#31
feat(#31): implemented api for environment crud operations
2023-01-13 16:45:35 -08:00
71fbf519ce Minor style changes - capitalization 2023-01-13 16:17:40 -08:00
d386f2702d Minor style changes to integrations 2023-01-13 16:05:50 -08:00
986434d66a Add infisical in Makefile for docker compose 2023-01-13 14:01:23 -08:00
30d84ede41 Merge pull request #220 from Infisical/gamma-auto-deploy
Gamma auto deploy
2023-01-13 13:42:34 -08:00
87a3f9a03c delete gamma deploy workflow file 2023-01-13 13:41:56 -08:00
64d1f252e2 Rename workflow file 2023-01-13 13:39:30 -08:00
092e4a55bd enable auto deploy to gamma 2023-01-13 13:13:15 -08:00
a00e6df59f add manual approval step 2023-01-13 13:09:57 -08:00
189d24589e correct needs fields in gha 2023-01-13 12:53:53 -08:00
17bae52830 gamma deployment after image build 2023-01-13 12:52:07 -08:00
323701d432 add gha upgrade helmchart 2023-01-13 12:38:56 -08:00
593765cb24 cat values file after downloading 2023-01-13 12:31:55 -08:00
fa60784a6b Add files via upload 2023-01-13 12:27:14 -08:00
eb9a8e0285 echo values file 2023-01-13 12:16:36 -08:00
d1f296b7e7 fix indent gamma deploy gha 2023-01-13 12:11:05 -08:00
dc6d036d86 write helm values to file from secret 2023-01-13 12:08:47 -08:00
58aee0239f docs: adding documentation about dotenv-export 2023-01-13 16:59:22 -03:00
799a839940 feat: adding new export format 2023-01-13 16:52:15 -03:00
0242707e33 gh action test kubectl 2023-01-13 11:49:38 -08:00
9974f889f3 Merge pull request #218 from Infisical/gamma-auto-deploy
Add github action for gamma deploy
2023-01-13 11:45:01 -08:00
a8f38a5367 Add github action for gamma deploy 2023-01-13 11:40:29 -08:00
61318f28f7 Remove netlify and gh 2023-01-13 09:18:41 -08:00
036d32aeba feat(#31): added multi env support in integration 2023-01-13 21:09:59 +05:30
d03eff4f46 Merge pull request #216 from Infisical/check-smtp-setup
Add SMTP support for AWS SES and docs for it
2023-01-13 20:26:12 +07:00
29592a1e9e Add SMTP support for AWS SES and docs for it 2023-01-13 20:25:13 +07:00
0f151fcd7a Merge pull request #215 from Infisical/patch-integrations
Patch integrations
2023-01-13 18:19:08 +07:00
cbd8302afe Add temp patch for CRUD ops race conditions 2023-01-13 18:15:17 +07:00
6992c51e17 Working on fixing integrations race condition 2023-01-13 16:36:34 +07:00
91f1090568 Merge branch 'main' of https://github.com/Infisical/infisical 2023-01-13 01:29:26 -08:00
6c61aef526 hotfix: fix the bug with pushing multiple envars in a sequence 2023-01-13 01:29:14 -08:00
b67abf94d4 Fixing minor bugs for custom environments 2023-01-13 01:02:43 -08:00
9d4ea2dcda Continue api-reference docs 2023-01-13 15:04:46 +07:00
f57f3e6475 enable integ 2023-01-12 23:27:10 -08:00
d958341154 Corrected the telemetry event name 2023-01-12 22:15:17 -08:00
61f767e895 Corrected the telemetry event name 2023-01-12 22:12:23 -08:00
95177074e3 Merge branch 'main' into feat/#31 2023-01-12 17:24:48 -08:00
efd5016977 Added frontend for api-keys 2023-01-12 17:08:40 -08:00
84700308f5 feat(#31): implemented ui for multi env and integrated api with backend
fix(#31): fixed all v2 release conflict
2023-01-11 23:12:05 +05:30
9116bf3344 feat(ui): implemented ui for env management table 2023-01-11 23:10:11 +05:30
3ad3e19bcf feat(#31): implemented api for environment crud operations 2023-01-11 23:10:11 +05:30
0f043605d9 Fix merge conflicts 2023-01-11 10:08:44 +07:00
9ff0b7bc18 Minor changes to README 2023-01-11 10:03:40 +07:00
0bf8661350 fix: folder pattern based on i18next locales 2023-01-08 16:09:57 -03:00
69b819e7c4 refactor: add pt-BR translation in signup.json file 2023-01-08 09:26:04 -03:00
d870ecc62a refactor: add pt-BR translation in setting-project.json file 2023-01-08 09:25:52 -03:00
c0a0252cf5 refactor: add pt-BR translation in setting-personal.json file 2023-01-08 09:25:38 -03:00
2f5186634c refactor: add pt-BR translation in setting-org.json file 2023-01-08 09:25:27 -03:00
36525325fd refactor: add pt-BR translation in setting-members.json file 2023-01-08 09:25:16 -03:00
a990a5ee7d refactor: add pt-BR translation in section-token.json file 2023-01-08 09:24:55 -03:00
f2372bb265 refactor: add pt-BR translation in section-password.json file 2023-01-08 09:24:44 -03:00
8c0046be87 refactor: add pt-BR translation in section-members.json file 2023-01-08 09:24:29 -03:00
556858d1a8 refactor: add pt-BR translation in section-incident.json file 2023-01-08 09:24:12 -03:00
2b147fce6e refactor: add pt-BR translation in nav.json file 2023-01-08 09:23:31 -03:00
553be71ddf refactor: pt-BR translation in login.json file 2023-01-08 09:23:17 -03:00
96b254d7c3 refactor: add pt-BR translation in login.json file 2023-01-07 13:51:16 -03:00
3f1eaa8d42 refactor: add pt-BR translation in integrations.json file 2023-01-07 13:51:04 -03:00
3e56fe95d2 refactor: add pt-BR translation in dashboard.json file 2023-01-07 13:50:51 -03:00
15553e972a refactor: add pt-BR translation in common.json file 2023-01-07 12:09:43 -03:00
4c32f3dfd0 refactor: add pt-BR translation in billing file 2023-01-06 12:08:20 -03:00
c0d7b4ea88 feat: add "boilerplate" json 2023-01-06 11:41:32 -03:00
e6c631586a refactor: add "pt-BR" option 2023-01-06 11:40:51 -03:00
3e102fee3d chore: add language locale 2023-01-06 11:40:21 -03:00
53502e22f4 fix: comments 2022-12-29 01:35:13 +01:00
d683e385ae fix: add support for custom annotations 2022-12-28 16:36:33 +01:00
4880cd84dc refactor: naming, labels and selectors 2022-12-28 15:42:47 +01:00
da5800c268 fix: allow setting of nodeport 2022-12-28 15:25:47 +01:00
21439761c3 fix: allow frontend service type overrides 2022-12-28 15:16:36 +01:00
bef857a7dc fix: allow image overrides 2022-12-28 15:02:52 +01:00
59ab4bf7f9 fix: upgrade mongoose from 6.7.2 to 6.7.3
Snyk has created this PR to upgrade mongoose from 6.7.2 to 6.7.3.

See this package in npm:
https://www.npmjs.com/package/mongoose

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2022-12-14 18:05:06 +00:00
d4bc92bd5b fix: upgrade axios from 1.1.3 to 1.2.0
Snyk has created this PR to upgrade axios from 1.1.3 to 1.2.0.

See this package in npm:
https://www.npmjs.com/package/axios

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2022-12-14 18:04:59 +00:00
7efdbeb787 fix: upgrade @sentry/node from 7.19.0 to 7.21.1
Snyk has created this PR to upgrade @sentry/node from 7.19.0 to 7.21.1.

See this package in npm:
https://www.npmjs.com/package/@sentry/node

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2022-12-14 18:04:55 +00:00
43daff29dc fix: upgrade @sentry/tracing from 7.19.0 to 7.21.1
Snyk has created this PR to upgrade @sentry/tracing from 7.19.0 to 7.21.1.

See this package in npm:
https://www.npmjs.com/package/@sentry/tracing

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2022-12-14 18:04:51 +00:00
903 changed files with 75390 additions and 15745 deletions

View File

@ -64,7 +64,7 @@ POSTHOG_PROJECT_API_KEY=
STRIPE_SECRET_KEY=
STRIPE_PUBLISHABLE_KEY=
STRIPE_WEBHOOK_SECRET=
STRIPE_PRODUCT_CARD_AUTH=
STRIPE_PRODUCT_PRO=
STRIPE_PRODUCT_STARTER=
STRIPE_PRODUCT_TEAM=
STRIPE_PRODUCT_PRO=
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=

View File

@ -1,3 +1,4 @@
node_modules
built
healthcheck.js
tailwind.config.js

22
.github/pull_request_template.md vendored Normal file
View File

@ -0,0 +1,22 @@
# Description 📣
*Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.*
## Type ✨
- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation
# Tests 🛠️
*Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration. You may want to add screenshots when relevant and possible*
```sh
# Here's some code block to paste some code snippets
```
---
- [ ] I have read the [contributing guide](https://infisical.com/docs/contributing/overview), agreed and acknowledged the [code of conduct](https://infisical.com/docs/contributing/code-of-conduct). 📝

71
.github/values.yaml vendored Normal file
View File

@ -0,0 +1,71 @@
frontend:
enabled: true
name: frontend
podAnnotations: {}
deploymentAnnotations:
secrets.infisical.com/auto-reload: "true"
replicaCount: 2
image:
repository: infisical/frontend
tag: "latest"
pullPolicy: Always
kubeSecretRef: managed-secret-frontend
service:
annotations: {}
type: ClusterIP
nodePort: ""
frontendEnvironmentVariables: null
backend:
enabled: true
name: backend
podAnnotations: {}
deploymentAnnotations:
secrets.infisical.com/auto-reload: "true"
replicaCount: 2
image:
repository: infisical/backend
tag: "latest"
pullPolicy: Always
kubeSecretRef: managed-backend-secret
service:
annotations: {}
type: ClusterIP
nodePort: ""
backendEnvironmentVariables: null
## Mongo DB persistence
mongodb:
enabled: true
persistence:
enabled: false
## By default the backend will be connected to a Mongo instance within the cluster
## However, it is recommended to add a managed document DB connection string for production-use (DBaaS)
## Learn about connection string type here https://www.mongodb.com/docs/manual/reference/connection-string/
## e.g. "mongodb://<user>:<pass>@<host>:<port>/<database-name>"
mongodbConnection:
externalMongoDBConnectionString: ""
ingress:
enabled: true
annotations:
kubernetes.io/ingress.class: "nginx"
# cert-manager.io/issuer: letsencrypt-nginx
hostName: gamma.infisical.com ## <- Replace with your own domain
frontend:
path: /
pathType: Prefix
backend:
path: /api
pathType: Prefix
tls:
[]
# - secretName: letsencrypt-nginx
# hosts:
# - infisical.local
mailhog:
enabled: false

View File

@ -1,5 +1,4 @@
name: Push frontend and backend to Dockerhub
name: Build, Publish and Deploy to Gamma
on: [workflow_dispatch]
jobs:
@ -99,4 +98,41 @@ jobs:
infisical/frontend:latest
platforms: linux/amd64,linux/arm64
build-args: |
POSTHOG_API_KEY=${{ secrets.PUBLIC_POSTHOG_API_KEY }}
POSTHOG_API_KEY=${{ secrets.PUBLIC_POSTHOG_API_KEY }}
gamma-deployment:
name: Deploy to gamma
runs-on: ubuntu-latest
needs: [frontend-image, backend-image]
steps:
- name: ☁️ Checkout source
uses: actions/checkout@v3
- name: Install Helm
uses: azure/setup-helm@v3
with:
version: v3.10.0
- name: Install infisical helm chart
run: |
helm repo add infisical-helm-charts 'https://dl.cloudsmith.io/public/infisical/helm-charts/helm/charts/'
helm repo update
- name: Install kubectl
uses: azure/setup-kubectl@v3
- name: Install doctl
uses: digitalocean/action-doctl@v2
with:
token: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}
- name: Save DigitalOcean kubeconfig with short-lived credentials
run: doctl kubernetes cluster kubeconfig save --expiry-seconds 600 k8s-1-25-4-do-0-nyc1-1670645170179
- name: switch to gamma namespace
run: kubectl config set-context --current --namespace=gamma
- name: test kubectl
run: kubectl get ingress
- name: Download helm values to file and upgrade gamma deploy
run: |
wget https://raw.githubusercontent.com/Infisical/infisical/main/.github/values.yaml
helm upgrade infisical infisical-helm-charts/infisical --values values.yaml --recreate-pods
if [[ $(helm status infisical) == *"FAILED"* ]]; then
echo "Helm upgrade failed"
exit 1
else
echo "Helm upgrade was successful"
fi

View File

@ -4,7 +4,7 @@ on:
push:
# run only against tags
tags:
- 'v*'
- "v*"
permissions:
contents: write
@ -18,10 +18,16 @@ jobs:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: 🐋 Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- run: git fetch --force --tags
- run: echo "Ref name ${{github.ref_name}}"
- uses: actions/setup-go@v3
with:
go-version: '>=1.19.3'
go-version: ">=1.19.3"
cache: true
cache-dependency-path: cli/go.sum
- name: libssl1.1 => libssl1.0-dev for OSXCross
@ -33,19 +39,18 @@ jobs:
run: |
mkdir ../../osxcross
git clone https://github.com/plentico/osxcross-target.git ../../osxcross/target
- uses: goreleaser/goreleaser-action@v2
- uses: goreleaser/goreleaser-action@v4
with:
distribution: goreleaser
version: latest
args: release --rm-dist
args: release --clean
env:
GITHUB_TOKEN: ${{ secrets.GO_RELEASER_GITHUB_TOKEN }}
FURY_TOKEN: ${{ secrets.FURYPUSHTOKEN }}
AUR_KEY: ${{ secrets.AUR_KEY }}
- uses: actions/setup-python@v4
- run: pip install --upgrade cloudsmith-cli
- name: Publish to CloudSmith
- name: Publish to CloudSmith
run: sh cli/upload_to_cloudsmith.sh
env:
CLOUDSMITH_API_KEY: ${{ secrets.CLOUDSMITH_API_KEY }}

View File

@ -14,6 +14,9 @@ before:
builds:
- id: darwin-build
binary: infisical
ldflags: -X github.com/Infisical/infisical-merge/packages/util.CLI_VERSION={{ .Version }}
flags:
- -trimpath
env:
- CGO_ENABLED=1
- CC=/home/runner/work/osxcross/target/bin/o64-clang
@ -24,10 +27,14 @@ builds:
- goos: darwin
goarch: "386"
dir: ./cli
- id: all-other-builds
env:
- CGO_ENABLED=0
binary: infisical
ldflags: -X github.com/Infisical/infisical-merge/packages/util.CLI_VERSION={{ .Version }}
flags:
- -trimpath
goos:
- freebsd
- linux
@ -61,18 +68,20 @@ archives:
release:
replace_existing_draft: true
mode: 'replace'
mode: "replace"
checksum:
name_template: 'checksums.txt'
name_template: "checksums.txt"
snapshot:
name_template: "{{ incpatch .Version }}"
name_template: "{{ incpatch .Version }}-devel"
changelog:
sort: asc
filters:
exclude:
- '^docs:'
- '^test:'
- "^docs:"
- "^test:"
# publishers:
# - name: fury.io
@ -80,6 +89,7 @@ changelog:
# - infisical
# dir: "{{ dir .ArtifactPath }}"
# cmd: curl -F package=@{{ .ArtifactName }} https://{{ .Env.FURY_TOKEN }}@push.fury.io/infisical/
brews:
- name: infisical
tap:
@ -91,31 +101,39 @@ brews:
folder: Formula
homepage: "https://infisical.com"
description: "The official Infisical CLI"
install: |-
bin.install "infisical"
bash_completion.install "completions/infisical.bash" => "infisical"
zsh_completion.install "completions/infisical.zsh" => "_infisical"
fish_completion.install "completions/infisical.fish"
man1.install "manpages/infisical.1.gz"
nfpms:
- id: infisical
package_name: infisical
builds:
- all-other-builds
vendor: Infisical, Inc
homepage: https://infisical.com/
maintainer: Infisical, Inc
description: The offical Infisical CLI
license: MIT
formats:
- rpm
- deb
- apk
- archlinux
bindir: /usr/bin
contents:
- src: ./completions/infisical.bash
dst: /etc/bash_completion.d/infisical
- src: ./completions/infisical.fish
dst: /usr/share/fish/vendor_completions.d/infisical.fish
- src: ./completions/infisical.zsh
dst: /usr/share/zsh/site-functions/_infisical
- src: ./manpages/infisical.1.gz
dst: /usr/share/man/man1/infisical.1.gz
- id: infisical
package_name: infisical
builds:
- all-other-builds
vendor: Infisical, Inc
homepage: https://infisical.com/
maintainer: Infisical, Inc
description: The offical Infisical CLI
license: MIT
formats:
- rpm
- deb
- apk
- archlinux
bindir: /usr/bin
contents:
- src: ./completions/infisical.bash
dst: /etc/bash_completion.d/infisical
- src: ./completions/infisical.fish
dst: /usr/share/fish/vendor_completions.d/infisical.fish
- src: ./completions/infisical.zsh
dst: /usr/share/zsh/site-functions/_infisical
- src: ./manpages/infisical.1.gz
dst: /usr/share/man/man1/infisical.1.gz
scoop:
bucket:
owner: Infisical
@ -126,16 +144,16 @@ scoop:
homepage: "https://infisical.com"
description: "The official Infisical CLI"
license: MIT
aurs:
-
name: infisical-bin
- name: infisical-bin
homepage: "https://infisical.com"
description: "The official Infisical CLI"
maintainers:
- Infisical, Inc <support@infisical.com>
license: MIT
private_key: '{{ .Env.AUR_KEY }}'
git_url: 'ssh://aur@aur.archlinux.org/infisical-bin.git'
private_key: "{{ .Env.AUR_KEY }}"
git_url: "ssh://aur@aur.archlinux.org/infisical-bin.git"
package: |-
# bin
install -Dm755 "./infisical" "${pkgdir}/usr/bin/infisical"
@ -150,19 +168,13 @@ aurs:
install -Dm644 "./completions/infisical.fish" "${pkgdir}/usr/share/fish/vendor_completions.d/infisical.fish"
# man pages
install -Dm644 "./manpages/infisical.1.gz" "${pkgdir}/usr/share/man/man1/infisical.1.gz"
# dockers:
# - dockerfile: goreleaser.dockerfile
# - dockerfile: cli/docker/Dockerfile
# goos: linux
# goarch: amd64
# ids:
# - infisical
# image_templates:
# - "infisical/cli:{{ .Version }}"
# - "infisical/cli:{{ .Major }}.{{ .Minor }}"
# - "infisical/cli:{{ .Major }}"
# - "infisical/cli:{{ .Version }}"
# - "infisical/cli:latest"
# build_flag_templates:
# - "--label=org.label-schema.schema-version=1.0"
# - "--label=org.label-schema.version={{.Version}}"
# - "--label=org.label-schema.name={{.ProjectName}}"
# - "--platform=linux/amd64"

View File

@ -7,6 +7,9 @@ push:
up-dev:
docker-compose -f docker-compose.dev.yml up --build
i-dev:
infisical run -- docker-compose -f docker-compose.dev.yml up --build
up-prod:
docker-compose -f docker-compose.yml up --build

158
README.md

File diff suppressed because one or more lines are too long

View File

@ -1,18 +1,27 @@
FROM node:16-bullseye-slim
# Build stage
FROM node:16-alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
# RUN npm ci --only-production --ignore-scripts
# "prepare": "cd .. && npm install"
COPY package*.json ./
RUN npm ci --only-production
COPY . .
RUN npm run build
# Production stage
FROM node:16-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only-production
COPY --from=build /app .
HEALTHCHECK --interval=10s --timeout=3s --start-period=10s \
CMD node healthcheck.js
EXPOSE 4000
CMD ["npm", "run", "start"]
CMD ["npm", "run", "start"]

View File

@ -3,8 +3,10 @@ export {};
declare global {
namespace NodeJS {
interface ProcessEnv {
PORT: string;
EMAIL_TOKEN_LIFETIME: string;
ENCRYPTION_KEY: string;
SALT_ROUNDS: string;
JWT_AUTH_LIFETIME: string;
JWT_AUTH_SECRET: string;
JWT_REFRESH_LIFETIME: string;
@ -19,23 +21,31 @@ declare global {
CLIENT_ID_HEROKU: string;
CLIENT_ID_VERCEL: string;
CLIENT_ID_NETLIFY: string;
CLIENT_ID_GITHUB: string;
CLIENT_SECRET_HEROKU: string;
CLIENT_SECRET_VERCEL: string;
CLIENT_SECRET_NETLIFY: string;
CLIENT_SECRET_GITHUB: string;
CLIENT_SLUG_VERCEL: string;
POSTHOG_HOST: string;
POSTHOG_PROJECT_API_KEY: string;
SENTRY_DSN: string;
SITE_URL: string;
SMTP_HOST: string;
SMTP_NAME: string;
SMTP_PASSWORD: string;
SMTP_SECURE: string;
SMTP_PORT: string;
SMTP_USERNAME: string;
STRIPE_PRODUCT_CARD_AUTH: string;
STRIPE_PRODUCT_PRO: string;
SMTP_PASSWORD: string;
SMTP_FROM_ADDRESS: string;
SMTP_FROM_NAME: string;
STRIPE_PRODUCT_STARTER: string;
STRIPE_PRODUCT_TEAM: string;
STRIPE_PRODUCT_PRO: string;
STRIPE_PUBLISHABLE_KEY: string;
STRIPE_SECRET_KEY: string;
STRIPE_WEBHOOK_SECRET: string;
TELEMETRY_ENABLED: string;
LICENSE_KEY: string;
}
}
}
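The ProcessEnv augmentation above is what lets the backend read the new SMTP and Stripe variables with string typings instead of string | undefined. A minimal, illustrative TypeScript snippet using names taken from the declaration (the surrounding project setup is assumed):
```ts
// With the ProcessEnv declaration above in scope, these reads are typed
// as string rather than string | undefined (illustrative usage only).
const smtpFrom: string = process.env.SMTP_FROM_ADDRESS;
const teamPlanProduct: string = process.env.STRIPE_PRODUCT_TEAM;

console.log(`mail from ${smtpFrom}, team plan product ${teamPlanProduct}`);
```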

3658
backend/package-lock.json generated

File diff suppressed because it is too large

View File

@ -1,11 +1,56 @@
{
"dependencies": {
"@aws-sdk/client-secrets-manager": "^3.267.0",
"@godaddy/terminus": "^4.11.2",
"@octokit/rest": "^19.0.5",
"@sentry/node": "^7.14.0",
"@sentry/tracing": "^7.19.0",
"@types/crypto-js": "^4.1.1",
"@types/libsodium-wrappers": "^0.7.10",
"await-to-js": "^3.0.0",
"aws-sdk": "^2.1311.0",
"axios": "^1.1.3",
"axios-retry": "^3.4.0",
"bcrypt": "^5.1.0",
"bigint-conversion": "^2.2.2",
"builder-pattern": "^2.2.0",
"cookie-parser": "^1.4.6",
"cors": "^2.8.5",
"crypto-js": "^4.1.1",
"dotenv": "^16.0.1",
"express": "^4.18.1",
"express-rate-limit": "^6.7.0",
"express-validator": "^6.14.2",
"handlebars": "^4.7.7",
"helmet": "^5.1.1",
"js-yaml": "^4.1.0",
"jsonwebtoken": "^9.0.0",
"jsrp": "^0.2.4",
"libsodium-wrappers": "^0.7.10",
"lodash": "^4.17.21",
"mongoose": "^6.7.2",
"nodemailer": "^6.8.0",
"posthog-node": "^2.2.2",
"query-string": "^7.1.3",
"request-ip": "^3.3.0",
"rimraf": "^3.0.2",
"stripe": "^10.7.0",
"swagger-autogen": "^2.22.0",
"swagger-ui-express": "^4.6.0",
"tweetnacl": "^1.0.3",
"tweetnacl-util": "^0.15.1",
"typescript": "^4.9.3",
"utility-types": "^3.10.0",
"winston": "^3.8.2",
"winston-loki": "^6.0.6"
},
"name": "infisical-api",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "npm run build && node build/index.js",
"start": "node build/index.js",
"dev": "nodemon",
"swagger-autogen": "node ./swagger.ts",
"swagger-autogen": "node ./swagger/index.ts",
"build": "rimraf ./build && tsc && cp -R ./src/templates ./build",
"lint": "eslint . --ext .ts",
"lint-and-fix": "eslint . --ext .ts --fix",
@ -36,6 +81,7 @@
"@types/express": "^4.17.14",
"@types/jest": "^29.2.4",
"@types/jsonwebtoken": "^8.5.9",
"@types/lodash": "^4.14.191",
"@types/node": "^18.11.3",
"@types/nodemailer": "^6.4.6",
"@types/supertest": "^2.0.12",
@ -73,43 +119,5 @@
"suiteNameTemplate": "{filepath}",
"classNameTemplate": "{classname}",
"titleTemplate": "{title}"
},
"dependencies": {
"@godaddy/terminus": "^4.11.2",
"@octokit/rest": "^19.0.5",
"@sentry/node": "^7.14.0",
"@sentry/tracing": "^7.19.0",
"@types/crypto-js": "^4.1.1",
"@types/libsodium-wrappers": "^0.7.10",
"await-to-js": "^3.0.0",
"axios": "^1.1.3",
"bcrypt": "^5.1.0",
"bigint-conversion": "^2.2.2",
"cookie-parser": "^1.4.6",
"cors": "^2.8.5",
"crypto-js": "^4.1.1",
"dotenv": "^16.0.1",
"express": "^4.18.1",
"express-rate-limit": "^6.7.0",
"express-validator": "^6.14.2",
"handlebars": "^4.7.7",
"helmet": "^5.1.1",
"jsonwebtoken": "^9.0.0",
"jsrp": "^0.2.4",
"libsodium-wrappers": "^0.7.10",
"mongoose": "^6.7.2",
"nodemailer": "^6.8.0",
"posthog-node": "^2.2.2",
"query-string": "^7.1.3",
"rimraf": "^3.0.2",
"stripe": "^10.7.0",
"swagger-autogen": "^2.22.0",
"swagger-ui-express": "^4.6.0",
"tweetnacl": "^1.0.3",
"tweetnacl-util": "^0.15.1",
"typescript": "^4.9.3",
"utility-types": "^3.10.0",
"winston": "^3.8.2",
"winston-loki": "^6.0.6"
}
}

File diff suppressed because it is too large

View File

@ -1,15 +1,16 @@
// eslint-disable-next-line @typescript-eslint/no-var-requires
const { patchRouterParam } = require('./utils/patchAsyncRoutes');
import express, { Request, Response } from 'express';
import express from 'express';
import helmet from 'helmet';
import cors from 'cors';
import cookieParser from 'cookie-parser';
import dotenv from 'dotenv';
import swaggerUi = require('swagger-ui-express');
// eslint-disable-next-line @typescript-eslint/no-var-requires
const swaggerFile = require('../api-documentation.json')
const swaggerFile = require('../spec.json');
// eslint-disable-next-line @typescript-eslint/no-var-requires
const requestIp = require('request-ip');
dotenv.config();
import { PORT, NODE_ENV, SITE_URL } from './config';
@ -38,21 +39,28 @@ import {
password as v1PasswordRouter,
stripe as v1StripeRouter,
integration as v1IntegrationRouter,
integrationAuth as v1IntegrationAuthRouter
integrationAuth as v1IntegrationAuthRouter,
secretApprovalRequest as v1SecretApprovalRequest
} from './routes/v1';
import {
secret as v2SecretRouter,
secrets as v2SecretsRouter,
signup as v2SignupRouter,
auth as v2AuthRouter,
users as v2UsersRouter,
organizations as v2OrganizationsRouter,
workspace as v2WorkspaceRouter,
secret as v2SecretRouter, // begin to phase out
secrets as v2SecretsRouter,
serviceTokenData as v2ServiceTokenDataRouter,
apiKeyData as v2APIKeyDataRouter,
environment as v2EnvironmentRouter,
tags as v2TagsRouter,
} from './routes/v2';
import { healthCheck } from './routes/status';
import { getLogger } from './utils/logger';
import { RouteNotFoundError } from './utils/errors';
import { requestErrorHandler } from './middleware/requestErrorHandler';
import { handleMongoInvalidDataError, requestErrorHandler } from './middleware/requestErrorHandler';
// patch async route params to handle Promise Rejections
patchRouterParam();
@ -69,6 +77,8 @@ app.use(
})
);
app.use(requestIp.mw())
if (NODE_ENV === 'production') {
// enable app-wide rate-limiting + helmet security
// in production
@ -96,18 +106,25 @@ app.use('/api/v1/membership', v1MembershipRouter);
app.use('/api/v1/key', v1KeyRouter);
app.use('/api/v1/invite-org', v1InviteOrgRouter);
app.use('/api/v1/secret', v1SecretRouter);
app.use('/api/v1/service-token', v1ServiceTokenRouter); // stop supporting
app.use('/api/v1/service-token', v1ServiceTokenRouter); // deprecated
app.use('/api/v1/password', v1PasswordRouter);
app.use('/api/v1/stripe', v1StripeRouter);
app.use('/api/v1/integration', v1IntegrationRouter);
app.use('/api/v1/integration-auth', v1IntegrationAuthRouter);
app.use('/api/v1/secrets-approval-request', v1SecretApprovalRequest)
// v2 routes
app.use('/api/v2/workspace', v2WorkspaceRouter); // TODO: turn into plural route
app.use('/api/v2/secret', v2SecretRouter); // stop supporting, TODO: revise
app.use('/api/v2/signup', v2SignupRouter);
app.use('/api/v2/auth', v2AuthRouter);
app.use('/api/v2/users', v2UsersRouter);
app.use('/api/v2/organizations', v2OrganizationsRouter);
app.use('/api/v2/workspace', v2EnvironmentRouter);
app.use('/api/v2/workspace', v2TagsRouter);
app.use('/api/v2/workspace', v2WorkspaceRouter);
app.use('/api/v2/secret', v2SecretRouter); // deprecated
app.use('/api/v2/secrets', v2SecretsRouter);
app.use('/api/v2/service-token', v2ServiceTokenDataRouter); // TODO: turn into plural route
app.use('/api/v2/api-key-data', v2APIKeyDataRouter);
app.use('/api/v2/api-key', v2APIKeyDataRouter);
// api docs
app.use('/api-docs', swaggerUi.serve, swaggerUi.setup(swaggerFile))
@ -121,10 +138,12 @@ app.use((req, res, next) => {
next(RouteNotFoundError({ message: `The requested source '(${req.method})${req.url}' was not found` }))
})
// handle mongo validation errors
app.use(handleMongoInvalidDataError);
//* Error Handling Middleware (must be after all routing logic)
app.use(requestErrorHandler)
export const server = app.listen(PORT, () => {
getLogger("backend-main").info(`Server started listening at port ${PORT}`)
});
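The diff above registers a new handleMongoInvalidDataError middleware ahead of the generic requestErrorHandler, but its implementation is not part of this changeset. A hedged sketch of what such an Express error middleware typically looks like (the real one lives in middleware/requestErrorHandler and may differ):
```ts
import { ErrorRequestHandler } from 'express';
import { Error as MongooseError } from 'mongoose';

// Hypothetical sketch: convert Mongoose validation failures into 400 responses
// and pass every other error on to the generic requestErrorHandler.
export const handleMongoInvalidDataError: ErrorRequestHandler = (err, req, res, next) => {
  if (err instanceof MongooseError.ValidationError) {
    return res.status(400).send({ message: err.message });
  }
  return next(err);
};
```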

View File

@ -1,9 +1,12 @@
const PORT = process.env.PORT || 4000;
const EMAIL_TOKEN_LIFETIME = process.env.EMAIL_TOKEN_LIFETIME! || '86400';
const EMAIL_TOKEN_LIFETIME = parseInt(process.env.EMAIL_TOKEN_LIFETIME! || '86400');
const INVITE_ONLY_SIGNUP = process.env.INVITE_ONLY_SIGNUP == undefined ? false : process.env.INVITE_ONLY_SIGNUP
const ENCRYPTION_KEY = process.env.ENCRYPTION_KEY!;
const SALT_ROUNDS = parseInt(process.env.SALT_ROUNDS!) || 10;
const JWT_AUTH_LIFETIME = process.env.JWT_AUTH_LIFETIME! || '10d';
const JWT_AUTH_SECRET = process.env.JWT_AUTH_SECRET!;
const JWT_MFA_LIFETIME = process.env.JWT_MFA_LIFETIME! || '5m';
const JWT_MFA_SECRET = process.env.JWT_MFA_SECRET!;
const JWT_REFRESH_LIFETIME = process.env.JWT_REFRESH_LIFETIME! || '90d';
const JWT_REFRESH_SECRET = process.env.JWT_REFRESH_SECRET!;
const JWT_SERVICE_SECRET = process.env.JWT_SERVICE_SECRET!;
@ -13,15 +16,18 @@ const MONGO_URL = process.env.MONGO_URL!;
const NODE_ENV = process.env.NODE_ENV! || 'production';
const VERBOSE_ERROR_OUTPUT = process.env.VERBOSE_ERROR_OUTPUT! === 'true' && true;
const LOKI_HOST = process.env.LOKI_HOST || undefined;
const CLIENT_SECRET_HEROKU = process.env.CLIENT_SECRET_HEROKU!;
const CLIENT_ID_AZURE = process.env.CLIENT_ID_AZURE!;
const TENANT_ID_AZURE = process.env.TENANT_ID_AZURE!;
const CLIENT_ID_HEROKU = process.env.CLIENT_ID_HEROKU!;
const CLIENT_ID_VERCEL = process.env.CLIENT_ID_VERCEL!;
const CLIENT_ID_NETLIFY = process.env.CLIENT_ID_NETLIFY!;
const CLIENT_ID_GITHUB = process.env.CLIENT_ID_GITHUB!;
const CLIENT_SECRET_AZURE = process.env.CLIENT_SECRET_AZURE!;
const CLIENT_SECRET_HEROKU = process.env.CLIENT_SECRET_HEROKU!;
const CLIENT_SECRET_VERCEL = process.env.CLIENT_SECRET_VERCEL!;
const CLIENT_SECRET_NETLIFY = process.env.CLIENT_SECRET_NETLIFY!;
const CLIENT_SECRET_GITHUB = process.env.CLIENT_SECRET_GITHUB!;
const CLIENT_SLUG_VERCEL= process.env.CLIENT_SLUG_VERCEL!;
const CLIENT_SLUG_VERCEL = process.env.CLIENT_SLUG_VERCEL!;
const POSTHOG_HOST = process.env.POSTHOG_HOST! || 'https://app.posthog.com';
const POSTHOG_PROJECT_API_KEY =
process.env.POSTHOG_PROJECT_API_KEY! ||
@ -35,9 +41,9 @@ const SMTP_USERNAME = process.env.SMTP_USERNAME!;
const SMTP_PASSWORD = process.env.SMTP_PASSWORD!;
const SMTP_FROM_ADDRESS = process.env.SMTP_FROM_ADDRESS!;
const SMTP_FROM_NAME = process.env.SMTP_FROM_NAME! || 'Infisical';
const STRIPE_PRODUCT_CARD_AUTH = process.env.STRIPE_PRODUCT_CARD_AUTH!;
const STRIPE_PRODUCT_PRO = process.env.STRIPE_PRODUCT_PRO!;
const STRIPE_PRODUCT_STARTER = process.env.STRIPE_PRODUCT_STARTER!;
const STRIPE_PRODUCT_PRO = process.env.STRIPE_PRODUCT_PRO!;
const STRIPE_PRODUCT_TEAM = process.env.STRIPE_PRODUCT_TEAM!;
const STRIPE_PUBLISHABLE_KEY = process.env.STRIPE_PUBLISHABLE_KEY!;
const STRIPE_SECRET_KEY = process.env.STRIPE_SECRET_KEY!;
const STRIPE_WEBHOOK_SECRET = process.env.STRIPE_WEBHOOK_SECRET!;
@ -47,10 +53,13 @@ const LICENSE_KEY = process.env.LICENSE_KEY!;
export {
PORT,
EMAIL_TOKEN_LIFETIME,
INVITE_ONLY_SIGNUP,
ENCRYPTION_KEY,
SALT_ROUNDS,
JWT_AUTH_LIFETIME,
JWT_AUTH_SECRET,
JWT_MFA_LIFETIME,
JWT_MFA_SECRET,
JWT_REFRESH_LIFETIME,
JWT_REFRESH_SECRET,
JWT_SERVICE_SECRET,
@ -60,10 +69,13 @@ export {
NODE_ENV,
VERBOSE_ERROR_OUTPUT,
LOKI_HOST,
CLIENT_ID_AZURE,
TENANT_ID_AZURE,
CLIENT_ID_HEROKU,
CLIENT_ID_VERCEL,
CLIENT_ID_NETLIFY,
CLIENT_ID_GITHUB,
CLIENT_SECRET_AZURE,
CLIENT_SECRET_HEROKU,
CLIENT_SECRET_VERCEL,
CLIENT_SECRET_NETLIFY,
@ -80,9 +92,9 @@ export {
SMTP_PASSWORD,
SMTP_FROM_ADDRESS,
SMTP_FROM_NAME,
STRIPE_PRODUCT_CARD_AUTH,
STRIPE_PRODUCT_PRO,
STRIPE_PRODUCT_STARTER,
STRIPE_PRODUCT_TEAM,
STRIPE_PRODUCT_PRO,
STRIPE_PUBLISHABLE_KEY,
STRIPE_SECRET_KEY,
STRIPE_WEBHOOK_SECRET,
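Callers are expected to import these parsed constants from the config module rather than reading process.env directly. A small usage sketch using names exported above (the relative import path depends on the caller's location):
```ts
import { SMTP_FROM_NAME, SMTP_FROM_ADDRESS, STRIPE_PRODUCT_TEAM } from './config';

// Example consumer: log which sender identity and Stripe product are configured.
console.log(`Mail sender: ${SMTP_FROM_NAME} <${SMTP_FROM_ADDRESS}>`);
console.log(`Team plan Stripe product: ${STRIPE_PRODUCT_TEAM}`);
```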

View File

@ -0,0 +1,16 @@
import axios from 'axios';
import axiosRetry from 'axios-retry';
const axiosInstance = axios.create();
// add retry functionality to the axios instance
axiosRetry(axiosInstance, {
retries: 3,
retryDelay: axiosRetry.exponentialDelay, // exponential back-off delay between retries
retryCondition: (error) => {
// only retry if the error is a network error or a 5xx server error
return axiosRetry.isNetworkError(error) || axiosRetry.isRetryableError(error);
},
});
export default axiosInstance;
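Usage sketch for the shared retrying instance (the import path is illustrative; the module simply default-exports the configured client):
```ts
import axiosInstance from './config/request'; // illustrative path to the module above

// Requests made through the shared instance are retried up to 3 times on
// network errors and retryable (5xx) responses, with exponential back-off.
const checkHealth = async (): Promise<number> => {
  const res = await axiosInstance.get('https://example.com/healthcheck');
  return res.status;
};
```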

View File

@ -4,14 +4,22 @@ import jwt from 'jsonwebtoken';
import * as Sentry from '@sentry/node';
import * as bigintConversion from 'bigint-conversion';
const jsrp = require('jsrp');
import { User } from '../../models';
import { createToken, issueTokens, clearTokens } from '../../helpers/auth';
import { User, LoginSRPDetail } from '../../models';
import { createToken, issueAuthTokens, clearTokens } from '../../helpers/auth';
import { checkUserDevice } from '../../helpers/user';
import {
ACTION_LOGIN,
ACTION_LOGOUT
} from '../../variables';
import {
NODE_ENV,
JWT_AUTH_LIFETIME,
JWT_AUTH_SECRET,
JWT_REFRESH_SECRET
} from '../../config';
import { BadRequestError } from '../../utils/errors';
import { EELogService } from '../../ee/services';
import { getChannelFromUserAgent } from '../../utils/posthog'; // TODO: move this
declare module 'jsonwebtoken' {
export interface UserIDJwtPayload extends jwt.JwtPayload {
@ -19,8 +27,6 @@ declare module 'jsonwebtoken' {
}
}
const clientPublicKeys: any = {};
/**
* Log in user step 1: Return [salt] and [serverPublicKey] as part of step 1 of SRP protocol
* @param req
@ -46,13 +52,15 @@ export const login1 = async (req: Request, res: Response) => {
salt: user.salt,
verifier: user.verifier
},
() => {
async () => {
// generate server-side public key
const serverPublicKey = server.getPublicKey();
clientPublicKeys[email] = {
clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt)
};
await LoginSRPDetail.findOneAndReplace({ email: email }, {
email: email,
clientPublicKey: clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt),
}, { upsert: true, returnNewDocument: false })
return res.status(200).send({
serverPublicKey,
@ -85,20 +93,33 @@ export const login2 = async (req: Request, res: Response) => {
if (!user) throw new Error('Failed to find user');
const loginSRPDetailFromDB = await LoginSRPDetail.findOneAndDelete({ email: email })
if (!loginSRPDetailFromDB) {
return BadRequestError(Error("It looks like some details from the first login are not found. Please try login one again"))
}
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier,
b: clientPublicKeys[email].serverBInt
b: loginSRPDetailFromDB.serverBInt
},
async () => {
server.setClientPublicKey(clientPublicKeys[email].clientPublicKey);
server.setClientPublicKey(loginSRPDetailFromDB.clientPublicKey);
// compare server and client shared keys
if (server.checkClientProof(clientProof)) {
// issue tokens
const tokens = await issueTokens({ userId: user._id.toString() });
await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});
const tokens = await issueAuthTokens({ userId: user._id.toString() });
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
@ -108,6 +129,18 @@ export const login2 = async (req: Request, res: Response) => {
secure: NODE_ENV === 'production' ? true : false
});
const loginAction = await EELogService.createAction({
name: ACTION_LOGIN,
userId: user._id
});
loginAction && await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getChannelFromUserAgent(req.headers['user-agent']),
ipAddress: req.ip
});
// return (access) token in response
return res.status(200).send({
token: tokens.token,
@ -151,6 +184,19 @@ export const logout = async (req: Request, res: Response) => {
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
const logoutAction = await EELogService.createAction({
name: ACTION_LOGOUT,
userId: req.user._id
});
logoutAction && await EELogService.createLog({
userId: req.user._id,
actions: [logoutAction],
channel: getChannelFromUserAgent(req.headers['user-agent']),
ipAddress: req.ip
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
@ -170,10 +216,11 @@ export const logout = async (req: Request, res: Response) => {
* @param res
* @returns
*/
export const checkAuth = async (req: Request, res: Response) =>
res.status(200).send({
export const checkAuth = async (req: Request, res: Response) => {
return res.status(200).send({
message: 'Authenticated'
});
}
/**
* Return new token by redeeming refresh token

View File

@ -14,6 +14,7 @@ import * as stripeController from './stripeController';
import * as userActionController from './userActionController';
import * as userController from './userController';
import * as workspaceController from './workspaceController';
import * as secretApprovalController from './secretApprovalController';
export {
authController,
@ -31,5 +32,6 @@ export {
stripeController,
userActionController,
userController,
workspaceController
workspaceController,
secretApprovalController
};

View File

@ -1,21 +1,46 @@
import { Request, Response } from 'express';
import { Types } from 'mongoose';
import * as Sentry from '@sentry/node';
import axios from 'axios';
import { readFileSync } from 'fs';
import { IntegrationAuth, Integration } from '../../models';
import { INTEGRATION_SET, INTEGRATION_OPTIONS, ENV_DEV } from '../../variables';
import {
Integration,
IntegrationAuth,
Bot
} from '../../models';
import { INTEGRATION_SET, INTEGRATION_OPTIONS } from '../../variables';
import { IntegrationService } from '../../services';
import { getApps, revokeAccess } from '../../integrations';
export const getIntegrationOptions = async (
req: Request,
res: Response
) => {
/***
* Return integration authorization with id [integrationAuthId]
*/
export const getIntegrationAuth = async (req: Request, res: Response) => {
let integrationAuth;
try {
const { integrationAuthId } = req.params;
integrationAuth = await IntegrationAuth.findById(integrationAuthId);
if (!integrationAuth) return res.status(400).send({
message: 'Failed to find integration authorization'
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get integration authorization'
});
}
return res.status(200).send({
integrationOptions: INTEGRATION_OPTIONS
integrationAuth
});
}
export const getIntegrationOptions = async (req: Request, res: Response) => {
return res.status(200).send({
integrationOptions: INTEGRATION_OPTIONS,
});
};
/**
* Perform OAuth2 code-token exchange as part of integration [integration] for workspace with id [workspaceId]
* @param req
@ -28,28 +53,100 @@ export const oAuthExchange = async (
) => {
try {
const { workspaceId, code, integration } = req.body;
if (!INTEGRATION_SET.has(integration))
throw new Error('Failed to validate integration');
const environments = req.membership.workspace?.environments || [];
if(environments.length === 0){
throw new Error("Failed to get environments")
}
await IntegrationService.handleOAuthExchange({
const integrationAuth = await IntegrationService.handleOAuthExchange({
workspaceId,
integration,
code
code,
environment: environments[0].slug,
});
return res.status(200).send({
integrationAuth
});
} catch (err) {
Sentry.setUser(null);
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get OAuth2 code-token exchange'
});
}
return res.status(200).send({
message: 'Successfully enabled integration authorization'
});
};
/**
* Save integration access token and (optionally) access id as part of integration
* [integration] for workspace with id [workspaceId]
* @param req
* @param res
*/
export const saveIntegrationAccessToken = async (
req: Request,
res: Response
) => {
// TODO: refactor
// TODO: check if access token is valid for each integration
let integrationAuth;
try {
const {
workspaceId,
accessId,
accessToken,
integration
}: {
workspaceId: string;
accessId: string | null;
accessToken: string;
integration: string;
} = req.body;
const bot = await Bot.findOne({
workspace: new Types.ObjectId(workspaceId),
isActive: true
});
if (!bot) throw new Error('Bot must be enabled to save integration access token');
integrationAuth = await IntegrationAuth.findOneAndUpdate({
workspace: new Types.ObjectId(workspaceId),
integration
}, {
workspace: new Types.ObjectId(workspaceId),
integration
}, {
new: true,
upsert: true
});
// encrypt and save integration access details
integrationAuth = await IntegrationService.setIntegrationAuthAccess({
integrationAuthId: integrationAuth._id.toString(),
accessId,
accessToken,
accessExpiresAt: undefined
});
if (!integrationAuth) throw new Error('Failed to save integration access token');
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to save access token for integration'
});
}
return res.status(200).send({
integrationAuth
});
}
/**
* Return list of applications allowed for integration with integration authorization id [integrationAuthId]
* @param req
@ -57,23 +154,23 @@ export const oAuthExchange = async (
* @returns
*/
export const getIntegrationAuthApps = async (req: Request, res: Response) => {
let apps;
try {
apps = await getApps({
integrationAuth: req.integrationAuth,
accessToken: req.accessToken
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get integration authorization applications'
});
}
let apps;
try {
apps = await getApps({
integrationAuth: req.integrationAuth,
accessToken: req.accessToken,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get integration authorization applications",
});
}
return res.status(200).send({
apps
});
return res.status(200).send({
apps,
});
};
/**
@ -83,22 +180,21 @@ export const getIntegrationAuthApps = async (req: Request, res: Response) => {
* @returns
*/
export const deleteIntegrationAuth = async (req: Request, res: Response) => {
try {
const { integrationAuthId } = req.params;
let integrationAuth;
try {
integrationAuth = await revokeAccess({
integrationAuth: req.integrationAuth,
accessToken: req.accessToken,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to delete integration authorization",
});
}
await revokeAccess({
integrationAuth: req.integrationAuth,
accessToken: req.accessToken
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to delete integration authorization'
});
}
return res.status(200).send({
message: 'Successfully deleted integration authorization'
});
}
return res.status(200).send({
integrationAuth,
});
};
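The new saveIntegrationAccessToken handler reads four fields from the request body before upserting an IntegrationAuth and encrypting the credentials via IntegrationService.setIntegrationAuthAccess. A hedged sketch of that payload, derived from the destructuring above (the ids and slug are placeholders, and the exact route path is not shown in this diff):
```ts
interface SaveIntegrationAccessTokenBody {
  workspaceId: string;
  accessId: string | null;   // optional provider access id; null when not used
  accessToken: string;       // stored encrypted, never returned in plaintext
  integration: string;       // integration slug expected by the backend
}

const exampleBody: SaveIntegrationAccessTokenBody = {
  workspaceId: '63f0a1b2c3d4e5f6a7b8c9d0', // placeholder ObjectId
  accessId: null,
  accessToken: '<provider-access-token>',
  integration: 'example-integration'       // hypothetical slug for illustration
};
```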

View File

@ -1,65 +1,52 @@
import { Request, Response } from 'express';
import { readFileSync } from 'fs';
import { Types } from 'mongoose';
import * as Sentry from '@sentry/node';
import { Integration, Bot, BotKey } from '../../models';
import {
Integration,
Workspace,
Bot,
BotKey
} from '../../models';
import { EventService } from '../../services';
import { eventPushSecrets } from '../../events';
interface Key {
encryptedKey: string;
nonce: string;
}
interface PushSecret {
ciphertextKey: string;
ivKey: string;
tagKey: string;
hashKey: string;
ciphertextValue: string;
ivValue: string;
tagValue: string;
hashValue: string;
type: 'shared' | 'personal';
}
/**
* Change environment or name of integration with id [integrationId]
* Create/initialize an (empty) integration for integration authorization
* @param req
* @param res
* @returns
*/
export const updateIntegration = async (req: Request, res: Response) => {
export const createIntegration = async (req: Request, res: Response) => {
let integration;
// TODO: add integration-specific validation to ensure that each
// integration has the correct fields populated in [Integration]
try {
const {
app,
environment,
isActive,
target, // vercel-specific integration param
context, // netlify-specific integration param
siteId // netlify-specific integration param
const {
integrationAuthId,
app,
appId,
isActive,
sourceEnvironment,
targetEnvironment,
owner,
path,
region
} = req.body;
integration = await Integration.findOneAndUpdate(
{
_id: req.integration._id
},
{
environment,
isActive,
app,
target,
context,
siteId
},
{
new: true
}
);
// TODO: validate [sourceEnvironment] and [targetEnvironment]
// initialize new integration after saving integration access token
integration = await new Integration({
workspace: req.integrationAuth.workspace._id,
environment: sourceEnvironment,
isActive,
app,
appId,
targetEnvironment,
owner,
path,
region,
integration: req.integrationAuth.integration,
integrationAuth: new Types.ObjectId(integrationAuthId)
}).save();
if (integration) {
// trigger event - push secrets
@ -69,17 +56,78 @@ export const updateIntegration = async (req: Request, res: Response) => {
})
});
}
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to update integration'
message: 'Failed to create integration'
});
}
return res.status(200).send({
integration
});
return res.status(200).send({
integration,
});
};
/**
* Change environment or name of integration with id [integrationId]
* @param req
* @param res
* @returns
*/
export const updateIntegration = async (req: Request, res: Response) => {
let integration;
// TODO: add integration-specific validation to ensure that each
// integration has the correct fields populated in [Integration]
try {
const {
environment,
isActive,
app,
appId,
targetEnvironment,
owner, // github-specific integration param
} = req.body;
integration = await Integration.findOneAndUpdate(
{
_id: req.integration._id,
},
{
environment,
isActive,
app,
appId,
targetEnvironment,
owner,
},
{
new: true,
}
);
if (integration) {
// trigger event - push secrets
EventService.handleEvent({
event: eventPushSecrets({
workspaceId: integration.workspace.toString(),
}),
});
}
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to update integration",
});
}
return res.status(200).send({
integration,
});
};
/**
@ -90,45 +138,24 @@ export const updateIntegration = async (req: Request, res: Response) => {
* @returns
*/
export const deleteIntegration = async (req: Request, res: Response) => {
let deletedIntegration;
try {
const { integrationId } = req.params;
let integration;
try {
const { integrationId } = req.params;
deletedIntegration = await Integration.findOneAndDelete({
_id: integrationId
});
if (!deletedIntegration) throw new Error('Failed to find integration');
const integrations = await Integration.find({
workspace: deletedIntegration.workspace
});
if (integrations.length === 0) {
// case: no integrations left, deactivate bot
const bot = await Bot.findOneAndUpdate({
workspace: deletedIntegration.workspace
}, {
isActive: false
}, {
new: true
});
if (bot) {
await BotKey.deleteOne({
bot: bot._id
});
}
}
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to delete integration'
});
}
integration = await Integration.findOneAndDelete({
_id: integrationId,
});
return res.status(200).send({
deletedIntegration
});
if (!integration) throw new Error("Failed to find integration");
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to delete integration",
});
}
return res.status(200).send({
integration,
});
};

View File

@ -1,6 +1,6 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { Membership, MembershipOrg, User, Key } from '../../models';
import { Membership, MembershipOrg, User, Key, IMembership, Workspace } from '../../models';
import {
findMembership,
deleteMembership as deleteMember
@ -230,4 +230,4 @@ export const inviteUserToWorkspace = async (req: Request, res: Response) => {
invitee,
latestKey
});
};
};

View File

@ -1,14 +1,13 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
import { SITE_URL, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET } from '../../config';
import { MembershipOrg, Organization, User, Token } from '../../models';
import { MembershipOrg, Organization, User } from '../../models';
import { deleteMembershipOrg as deleteMemberFromOrg } from '../../helpers/membershipOrg';
import { checkEmailVerification } from '../../helpers/signup';
import { createToken } from '../../helpers/auth';
import { updateSubscriptionOrgQuantity } from '../../helpers/organization';
import { sendMail } from '../../helpers/nodemailer';
import { OWNER, ADMIN, MEMBER, ACCEPTED, INVITED } from '../../variables';
import { TokenService } from '../../services';
import { OWNER, ADMIN, MEMBER, ACCEPTED, INVITED, TOKEN_EMAIL_ORG_INVITATION } from '../../variables';
/**
* Delete organization membership with id [membershipOrgId] from organization
@ -77,8 +76,6 @@ export const changeMembershipOrgRole = async (req: Request, res: Response) => {
// change role for (target) organization membership with id
// [membershipOrgId]
// TODO
let membershipToChangeRole;
// try {
// } catch (err) {
@ -115,14 +112,14 @@ export const inviteUserToOrganization = async (req: Request, res: Response) => {
if (!membershipOrg) {
throw new Error('Failed to validate organization membership');
}
invitee = await User.findOne({
email: inviteeEmail
}).select('+publicKey');
if (invitee) {
// case: invitee is an existing user
inviteeMembershipOrg = await MembershipOrg.findOne({
user: invitee._id,
organization: organizationId
@ -165,17 +162,11 @@ export const inviteUserToOrganization = async (req: Request, res: Response) => {
const organization = await Organization.findOne({ _id: organizationId });
if (organization) {
const token = crypto.randomBytes(16).toString('hex');
await Token.findOneAndUpdate(
{ email: inviteeEmail },
{
email: inviteeEmail,
token,
createdAt: new Date()
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_ORG_INVITATION,
email: inviteeEmail,
organizationId: organization._id
});
await sendMail({
template: 'organizationInvitation.handlebars',
@ -227,10 +218,12 @@ export const verifyUserToOrganization = async (req: Request, res: Response) => {
if (!membershipOrg)
throw new Error('Failed to find any invitations for email');
await checkEmailVerification({
await TokenService.validateToken({
type: TOKEN_EMAIL_ORG_INVITATION,
email,
code
organizationId: membershipOrg.organization,
token: code
});
if (user && user?.publicKey) {
@ -243,7 +236,7 @@ export const verifyUserToOrganization = async (req: Request, res: Response) => {
message: 'Successfully verified email',
user,
});
}
}
if (!user) {
// initialize user account

View File

@ -2,10 +2,7 @@ import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import {
SITE_URL,
STRIPE_SECRET_KEY,
STRIPE_PRODUCT_STARTER,
STRIPE_PRODUCT_PRO,
STRIPE_PRODUCT_CARD_AUTH
STRIPE_SECRET_KEY
} from '../../config';
import Stripe from 'stripe';
@ -17,24 +14,14 @@ import {
MembershipOrg,
Organization,
Workspace,
IncidentContactOrg
IncidentContactOrg,
IMembershipOrg
} from '../../models';
import { createOrganization as create } from '../../helpers/organization';
import { addMembershipsOrg } from '../../helpers/membershipOrg';
import { OWNER, ACCEPTED } from '../../variables';
import _ from 'lodash';
const productToPriceMap = {
starter: STRIPE_PRODUCT_STARTER,
pro: STRIPE_PRODUCT_PRO,
cardAuth: STRIPE_PRODUCT_CARD_AUTH
};
/**
* Return organizations that user is part of
* @param req
* @param res
* @returns
*/
export const getOrganizations = async (req: Request, res: Response) => {
let organizations;
try {
@ -346,7 +333,6 @@ export const createOrganizationPortalSession = async (
if (paymentMethods.data.length < 1) {
// case: no payment method on file
productToPriceMap['cardAuth'];
session = await stripe.checkout.sessions.create({
customer: req.membershipOrg.organization.customerId,
mode: 'setup',
@ -398,3 +384,44 @@ export const getOrganizationSubscriptions = async (
subscriptions
});
};
/**
* Given a org id, return the projects each member of the org belongs to
* @param req
* @param res
* @returns
*/
export const getOrganizationMembersAndTheirWorkspaces = async (
req: Request,
res: Response
) => {
const { organizationId } = req.params;
const workspacesSet = (
await Workspace.find(
{
organization: organizationId
},
'_id'
)
).map((w) => w._id.toString());
const memberships = (
await Membership.find({
workspace: { $in: workspacesSet }
}).populate('workspace')
);
const userToWorkspaceIds: any = {};
memberships.forEach(membership => {
const user = membership.user.toString();
if (userToWorkspaceIds[user]) {
userToWorkspaceIds[user].push(membership.workspace);
} else {
userToWorkspaceIds[user] = [membership.workspace];
}
});
return res.json(userToWorkspaceIds);
};
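getOrganizationMembersAndTheirWorkspaces returns a plain object keyed by user id, with each value being the populated workspaces that user belongs to. A hedged TypeScript view of that response shape (WorkspaceSummary is a stand-in; the real values are full Mongoose workspace documents):
```ts
// Assumed minimal shape for illustration only.
interface WorkspaceSummary {
  _id: string;
  name: string;
}

// userId -> workspaces the user is a member of
type OrgMembersWorkspacesResponse = Record<string, WorkspaceSummary[]>;

const example: OrgMembersWorkspacesResponse = {
  '63f0a1b2c3d4e5f6a7b8c9d0': [{ _id: '63f0d1e2f3a4b5c6d7e8f9a0', name: 'Example Project' }]
};
```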

View File

@ -1,16 +1,15 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
// eslint-disable-next-line @typescript-eslint/no-var-requires
const jsrp = require('jsrp');
import * as bigintConversion from 'bigint-conversion';
import { User, Token, BackupPrivateKey } from '../../models';
import { checkEmailVerification } from '../../helpers/signup';
import { User, BackupPrivateKey, LoginSRPDetail } from '../../models';
import { createToken } from '../../helpers/auth';
import { sendMail } from '../../helpers/nodemailer';
import { TokenService } from '../../services';
import { JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, SITE_URL } from '../../config';
const clientPublicKeys: any = {};
import { TOKEN_EMAIL_PASSWORD_RESET } from '../../variables';
import { BadRequestError } from '../../utils/errors';
/**
* Password reset step 1: Send email verification link to email [email]
@ -33,17 +32,10 @@ export const emailPasswordReset = async (req: Request, res: Response) => {
});
}
const token = crypto.randomBytes(16).toString('hex');
await Token.findOneAndUpdate(
{ email },
{
email,
token,
createdAt: new Date()
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_PASSWORD_RESET,
email
});
await sendMail({
template: 'passwordReset.handlebars',
@ -55,15 +47,14 @@ export const emailPasswordReset = async (req: Request, res: Response) => {
callback_url: SITE_URL + '/password-reset'
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to send email for account recovery'
});
});
}
return res.status(200).send({
message: `Sent an email for account recovery to ${email}`
});
@ -79,7 +70,7 @@ export const emailPasswordResetVerify = async (req: Request, res: Response) => {
let user, token;
try {
const { email, code } = req.body;
user = await User.findOne({ email }).select('+publicKey');
if (!user || !user?.publicKey) {
// case: user doesn't exist with email [email] or
@ -88,12 +79,13 @@ export const emailPasswordResetVerify = async (req: Request, res: Response) => {
error: 'Failed email verification for password reset'
});
}
await checkEmailVerification({
email,
code
});
await TokenService.validateToken({
type: TOKEN_EMAIL_PASSWORD_RESET,
email,
token: code
});
// generate temporary password-reset token
token = createToken({
payload: {
@ -107,7 +99,7 @@ export const emailPasswordResetVerify = async (req: Request, res: Response) => {
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed email verification for password reset'
});
});
}
return res.status(200).send({
@ -130,7 +122,7 @@ export const srp1 = async (req: Request, res: Response) => {
const user = await User.findOne({
email: req.user.email
}).select('+salt +verifier');
if (!user) throw new Error('Failed to find user');
const server = new jsrp.server();
@ -139,13 +131,15 @@ export const srp1 = async (req: Request, res: Response) => {
salt: user.salt,
verifier: user.verifier
},
() => {
async () => {
// generate server-side public key
const serverPublicKey = server.getPublicKey();
clientPublicKeys[req.user.email] = {
clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt)
};
await LoginSRPDetail.findOneAndReplace({ email: req.user.email }, {
email: req.user.email,
clientPublicKey: clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt),
}, { upsert: true, returnNewDocument: false })
return res.status(200).send({
serverPublicKey,
@ -172,25 +166,39 @@ export const srp1 = async (req: Request, res: Response) => {
*/
export const changePassword = async (req: Request, res: Response) => {
try {
const { clientProof, encryptedPrivateKey, iv, tag, salt, verifier } =
req.body;
const {
clientProof,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
} = req.body;
const user = await User.findOne({
email: req.user.email
}).select('+salt +verifier');
if (!user) throw new Error('Failed to find user');
const loginSRPDetailFromDB = await LoginSRPDetail.findOneAndDelete({ email: req.user.email })
if (!loginSRPDetailFromDB) {
return BadRequestError(Error("It looks like some details from the first login are not found. Please try login one again"))
}
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier,
b: clientPublicKeys[req.user.email].serverBInt
b: loginSRPDetailFromDB.serverBInt
},
async () => {
server.setClientPublicKey(
clientPublicKeys[req.user.email].clientPublicKey
);
server.setClientPublicKey(loginSRPDetailFromDB.clientPublicKey);
// compare server and client shared keys
if (server.checkClientProof(clientProof)) {
@ -199,9 +207,13 @@ export const changePassword = async (req: Request, res: Response) => {
await User.findByIdAndUpdate(
req.user._id.toString(),
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
@ -249,16 +261,22 @@ export const createBackupPrivateKey = async (req: Request, res: Response) => {
if (!user) throw new Error('Failed to find user');
const loginSRPDetailFromDB = await LoginSRPDetail.findOneAndDelete({ email: req.user.email })
if (!loginSRPDetailFromDB) {
return BadRequestError(Error("It looks like some details from the first login are not found. Please try login one again"))
}
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier,
b: clientPublicKeys[req.user.email].serverBInt
b: loginSRPDetailFromDB.serverBInt
},
async () => {
server.setClientPublicKey(
clientPublicKeys[req.user.email].clientPublicKey
loginSRPDetailFromDB.clientPublicKey
);
// compare server and client shared keys
@ -311,16 +329,16 @@ export const getBackupPrivateKey = async (req: Request, res: Response) => {
backupPrivateKey = await BackupPrivateKey.findOne({
user: req.user._id
}).select('+encryptedPrivateKey +iv +tag');
if (!backupPrivateKey) throw new Error('Failed to find backup private key');
} catch (err) {
Sentry.setUser({ email: req.user.email});
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get backup private key'
});
}
return res.status(200).send({
backupPrivateKey
});
@ -329,9 +347,12 @@ export const getBackupPrivateKey = async (req: Request, res: Response) => {
export const resetPassword = async (req: Request, res: Response) => {
try {
const {
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier,
} = req.body;
@ -339,24 +360,28 @@ export const resetPassword = async (req: Request, res: Response) => {
await User.findByIdAndUpdate(
req.user._id.toString(),
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
{
new: true
}
);
);
} catch (err) {
Sentry.setUser({ email: req.user.email});
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get backup private key'
});
});
}
return res.status(200).send({
message: 'Successfully reset password'
});
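changePassword and resetPassword now accept the encryption-v2 key material shown in the destructuring above; the IV and tag of the encrypted private key are persisted under the existing iv and tag fields on the user document. A sketch of the changePassword body, mirroring those field names:
```ts
// Field names taken from the handler above; values are generated client-side.
interface ChangePasswordBody {
  clientProof: string;            // SRP client proof
  protectedKey: string;           // v2 scheme: key protecting the private key
  protectedKeyIV: string;
  protectedKeyTag: string;
  encryptedPrivateKey: string;
  encryptedPrivateKeyIV: string;  // saved as `iv` on the user document
  encryptedPrivateKeyTag: string; // saved as `tag` on the user document
  salt: string;
  verifier: string;
}
```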

View File

@ -0,0 +1,320 @@
import { Request, Response } from 'express';
import SecretApprovalRequest, { ApprovalStatus, ChangeType, IApprover, IRequestedChange } from '../../models/secretApprovalRequest';
import { Builder, IBuilder } from "builder-pattern"
import { secretObjectHasRequiredFields, validateSecrets } from '../../helpers/secret';
import _ from 'lodash';
import { SECRET_PERSONAL, SECRET_SHARED } from '../../variables';
import { BadRequestError, ResourceNotFound, UnauthorizedRequestError } from '../../utils/errors';
import { ISecret, Membership, Secret, Workspace } from '../../models';
import mongoose from 'mongoose';
export const createApprovalRequest = async (req: Request, res: Response) => {
const { workspaceId, environment, requestedChanges } = req.body;
// validate workspace
const workspaceFromDB = await Workspace.findById(workspaceId)
if (!workspaceFromDB) {
throw ResourceNotFound()
}
const environmentBelongsToWorkspace = _.some(workspaceFromDB.environments, { slug: environment })
if (!environmentBelongsToWorkspace) {
throw ResourceNotFound()
}
// check for secret duplicates
const hasSecretIdDuplicates = requestedChanges.length !== _.uniqBy(requestedChanges, 'modifiedSecretParentId').length;
if (hasSecretIdDuplicates) {
throw BadRequestError({ message: "Request cannot contain changes for duplicate secrets" })
}
// ensure the workspace has approvers set
if (!workspaceFromDB.approvers.length) {
throw BadRequestError({ message: "There are no designated approvers for this project, you must set approvers first before making a request" })
}
const approverIds = _.compact(_.map(workspaceFromDB.approvers, "userId"))
const approversFormatted: IApprover[] = approverIds.map(id => {
return { "userId": id, status: ApprovalStatus.PENDING }
})
const listOfSecretIdsToModify = _.compact(_.map(requestedChanges, "modifiedSecretParentId"))
// Ensure that the user requesting changes for the set of secrets can indeed interact with said secrets
if (listOfSecretIdsToModify.length > 0) {
await validateSecrets({
userId: req.user._id.toString(),
secretIds: listOfSecretIdsToModify
});
}
const sanitizedRequestedChangesList: IRequestedChange[] = []
requestedChanges.forEach((requestedChange: IRequestedChange) => {
const secretDetailsIsValid = secretObjectHasRequiredFields(requestedChange.modifiedSecretDetails)
if (!secretDetailsIsValid) {
throw BadRequestError({ message: "One or more required fields are missing from your modified secret" })
}
if (!requestedChange.modifiedSecretParentId && (requestedChange.type != ChangeType.DELETE.toString() && requestedChange.type != ChangeType.CREATE.toString())) {
throw BadRequestError({ message: "modifiedSecretParentId can only be empty when secret change type is DELETE or CREATE" })
}
sanitizedRequestedChangesList.push(Builder<IRequestedChange>()
.modifiedSecretParentId(requestedChange.modifiedSecretParentId)
.modifiedSecretDetails(requestedChange.modifiedSecretDetails)
.approvers(approversFormatted)
.type(requestedChange.type).build())
});
const newApprovalRequest = await SecretApprovalRequest.create({
workspace: workspaceId,
requestedByUserId: req.user._id.toString(),
environment: environment,
requestedChanges: sanitizedRequestedChangesList
})
const populatedNewApprovalRequest = await newApprovalRequest.populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
return res.send({ approvalRequest: populatedNewApprovalRequest });
};
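For orientation, here is a minimal sketch of the request body this controller expects, inferred from the fields it destructures and validates above; the route path is not shown in this diff, and the ids are placeholders rather than real values.

// Hypothetical payload for createApprovalRequest (shape inferred from the validation above).
interface RequestedChangePayload {
  modifiedSecretParentId?: string; // may be omitted only when type is DELETE or CREATE, and must be unique per request
  modifiedSecretDetails: Record<string, unknown>; // must pass secretObjectHasRequiredFields
  type: 'CREATE' | 'UPDATE' | 'DELETE'; // ChangeType values assumed to serialize to these strings
}

const createApprovalRequestBody: {
  workspaceId: string;
  environment: string;
  requestedChanges: RequestedChangePayload[];
} = {
  workspaceId: '<workspace id>', // placeholder
  environment: 'dev',
  requestedChanges: [
    {
      modifiedSecretParentId: '<secret id>', // placeholder
      modifiedSecretDetails: { secretKeyCiphertext: '...', secretValueCiphertext: '...' },
      type: 'UPDATE'
    }
  ]
};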
export const getAllApprovalRequestsForUser = async (req: Request, res: Response) => {
const approvalRequests = await SecretApprovalRequest.find({
requestedByUserId: req.user._id.toString()
}).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
.sort({ updatedAt: -1 })
res.send({ approvalRequests: approvalRequests })
}
export const getAllApprovalRequestsThatRequireUserApproval = async (req: Request, res: Response) => {
const approvalRequests = await SecretApprovalRequest.find({
'requestedChanges.approvers.userId': req.user._id.toString()
}).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
.sort({ updatedAt: -1 })
res.send({ approvalRequests: approvalRequests })
}
export const approveApprovalRequest = async (req: Request, res: Response) => {
const { requestedChangeIds } = req.body;
const { reviewId } = req.params
const approvalRequestFromDB = await SecretApprovalRequest.findById(reviewId)
if (!approvalRequestFromDB) {
throw ResourceNotFound()
}
const requestedChangesFromDB: IRequestedChange[] = approvalRequestFromDB.requestedChanges
const filteredChangesByIds = requestedChangesFromDB.filter(change => requestedChangeIds.includes(change._id.toString()))
if (filteredChangesByIds.length != requestedChangeIds.length) {
throw BadRequestError({ message: "All requestedChangeIds should exist in this approval request" })
}
const changesThatRequireUserApproval = _.filter(filteredChangesByIds, change => {
return _.some(change.approvers, approver => {
return approver.userId.toString() == req.user._id.toString();
});
});
if (!changesThatRequireUserApproval.length) {
throw UnauthorizedRequestError({ message: "Your approval is not required for this review" })
}
if (changesThatRequireUserApproval.length != filteredChangesByIds.length) {
throw BadRequestError({ message: "You may only request to approve changes that require your approval" })
}
changesThatRequireUserApproval.forEach((requestedChange) => {
const overallChangeStatus = requestedChange.status
const currentLoggedInUserId = req.user._id.toString()
if (overallChangeStatus == ApprovalStatus.PENDING.toString()) {
requestedChange.approvers.forEach((approver) => {
if (approver.userId.toString() == currentLoggedInUserId && approver.status == ApprovalStatus.PENDING.toString()) {
approver.status = ApprovalStatus.APPROVED
}
})
let updateOverallStatusToApproved = true
requestedChange.approvers.forEach((approver) => {
if (approver.status != ApprovalStatus.APPROVED.toString()) {
updateOverallStatusToApproved = false
}
})
if (updateOverallStatusToApproved) {
requestedChange.status = ApprovalStatus.APPROVED
}
}
})
const updatedApprovalRequest = await SecretApprovalRequest.findByIdAndUpdate(reviewId, {
requestedChanges: requestedChangesFromDB
}, { new: true }).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
res.send({ approvalRequest: updatedApprovalRequest })
}
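The per-change bookkeeping above reduces to a simple rule: a requested change flips to APPROVED only once every listed approver has approved it, while a single rejection (handled in rejectApprovalRequest below) marks it REJECTED. A small standalone restatement of that rule, with a local status type standing in for ApprovalStatus:

type ChangeStatus = 'PENDING' | 'APPROVED' | 'REJECTED'; // stands in for ApprovalStatus

interface ApproverVote {
  userId: string;
  status: ChangeStatus;
}

// One reading of the combined approve/reject logic: any rejection wins,
// otherwise the change is approved only when nobody is still pending.
const rollUpChangeStatus = (approvers: ApproverVote[]): ChangeStatus => {
  if (approvers.some((a) => a.status === 'REJECTED')) return 'REJECTED';
  return approvers.every((a) => a.status === 'APPROVED') ? 'APPROVED' : 'PENDING';
};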
export const rejectApprovalRequest = async (req: Request, res: Response) => {
const { requestedChangeIds } = req.body;
const { reviewId } = req.params
const approvalRequestFromDB = await SecretApprovalRequest.findById(reviewId)
if (!approvalRequestFromDB) {
throw ResourceNotFound()
}
const requestedChangesFromDB: IRequestedChange[] = approvalRequestFromDB.requestedChanges
const filteredChangesByIds = requestedChangesFromDB.filter(change => requestedChangeIds.includes(change._id.toString()))
if (filteredChangesByIds.length != requestedChangeIds.length) {
throw BadRequestError({ message: "All requestedChangeIds should exist in this approval request" })
}
const changesThatRequireUserApproval = _.filter(filteredChangesByIds, change => {
return _.some(change.approvers, approver => {
return approver.userId.toString() == req.user._id.toString();
});
});
if (!changesThatRequireUserApproval.length) {
throw UnauthorizedRequestError({ message: "Your approval is not required for this review" })
}
if (changesThatRequireUserApproval.length != filteredChangesByIds.length) {
throw BadRequestError({ message: "You may only request to reject changes that require your approval" })
}
changesThatRequireUserApproval.forEach((requestedChange) => {
const overallChangeStatus = requestedChange.status
const currentLoggedInUserId = req.user._id.toString()
if (overallChangeStatus == ApprovalStatus.PENDING.toString()) {
requestedChange.approvers.forEach((approver) => {
if (approver.userId.toString() == currentLoggedInUserId && approver.status == ApprovalStatus.PENDING.toString()) {
approver.status = ApprovalStatus.REJECTED
requestedChange.status = ApprovalStatus.REJECTED
}
})
}
})
const updatedApprovalRequest = await SecretApprovalRequest.findByIdAndUpdate(reviewId, {
requestedChanges: requestedChangesFromDB
}, { new: true }).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
res.send({ approvalRequest: updatedApprovalRequest })
};
export const mergeApprovalRequestSecrets = async (req: Request, res: Response) => {
const { requestedChangeIds } = req.body;
const { reviewId } = req.params
// only the user who requested the set of changes can merge it
const approvalRequestFromDB = await SecretApprovalRequest.findOne({ _id: reviewId, requestedByUserId: req.user._id })
if (!approvalRequestFromDB) {
throw ResourceNotFound()
}
// ensure that this user is a member of this workspace
const membershipDetails = await Membership.find({ user: req.user._id, workspace: approvalRequestFromDB.workspace })
if (!membershipDetails) {
throw UnauthorizedRequestError()
}
// filter not merged, approved, and change ids specified in this request
const filteredChangesToMerge: IRequestedChange[] = approvalRequestFromDB.requestedChanges.filter(change => change.merged == false && change.status == ApprovalStatus.APPROVED && requestedChangeIds.includes(change._id.toString()))
if (filteredChangesToMerge.length != requestedChangeIds.length) {
throw BadRequestError({ message: "One or more changes in this approval is either already merged/not approved or do not exist" })
}
const secretsToCreate: ISecret[] = []
const secretsToUpdate: any[] = []
const secretsIdsToDelete: any[] = []
const secretIdsToModify: any[] = []
filteredChangesToMerge.forEach((requestedChange: any) => {
const overallChangeStatus = requestedChange.status
const currentLoggedInUserId = req.user._id.toString()
if (overallChangeStatus == ApprovalStatus.APPROVED.toString()) {
if (ChangeType.CREATE.toString() == requestedChange.type) {
const modifiedSecret = requestedChange.modifiedSecretDetails.toObject()
secretsToCreate.push({
...modifiedSecret,
user: requestedChange.modifiedSecretDetails.type === SECRET_PERSONAL ? currentLoggedInUserId : undefined,
})
}
if (ChangeType.UPDATE.toString() == requestedChange.type) {
const modifiedSecret = requestedChange.modifiedSecretDetails.toObject()
secretIdsToModify.push(requestedChange.modifiedSecretParentId)
secretsToUpdate.push({
filter: { _id: requestedChange.modifiedSecretParentId },
update: {
$set: {
...modifiedSecret,
user: requestedChange.modifiedSecretDetails.type === SECRET_PERSONAL ? currentLoggedInUserId : undefined,
},
$inc: {
version: 1
}
}
})
}
if (ChangeType.DELETE.toString() == requestedChange.type) {
// store the raw id so the existence check and the deleteOne filter below receive plain ids
secretsIdsToDelete.push(requestedChange.modifiedSecretParentId.toString())
}
requestedChange.merged = true
}
})
// ensure all secrets that are to be updated exist
const numSecretsFromDBThatRequireUpdate = await Secret.countDocuments({ _id: { $in: secretIdsToModify } });
const numSecretsFromDBThatRequireDelete = await Secret.countDocuments({ _id: { $in: secretsIdsToDelete } });
if (numSecretsFromDBThatRequireUpdate != secretIdsToModify.length || numSecretsFromDBThatRequireDelete != secretsIdsToDelete.length) {
throw BadRequestError({ message: "You cannot merge changes for secrets that no longer exist" })
}
// Add all CRUD operations into a single list of operations
const allOperationsForBulkWrite: any[] = [];
for (const updateStatement of secretsToUpdate) {
allOperationsForBulkWrite.push({ updateOne: updateStatement });
}
for (const secretId of secretsIdsToDelete) {
allOperationsForBulkWrite.push({ deleteOne: { filter: { _id: secretId } } });
}
for (const createStatement of secretsToCreate) {
allOperationsForBulkWrite.push({ insertOne: { document: createStatement } });
}
// start transaction so the secret writes and the approval update succeed or fail together
const session = await mongoose.startSession();
session.startTransaction();
try {
await Secret.bulkWrite(allOperationsForBulkWrite, { session });
// note: the positional $ operator below marks only the first matched requested change as merged
await SecretApprovalRequest.updateOne({ _id: reviewId, 'requestedChanges._id': { $in: requestedChangeIds } },
{ $set: { 'requestedChanges.$.merged': true } }, { session })
await session.commitTransaction();
} catch (error) {
await session.abortTransaction();
throw error
} finally {
session.endSession();
}
const updatedApproval = await SecretApprovalRequest.findById(reviewId).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
res.send(updatedApproval)
};
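For reference, a minimal sketch of the three Mongoose bulkWrite operation shapes assembled into allOperationsForBulkWrite above; the document fields are placeholders and the commented call mirrors how the list is executed inside the transaction.

import { Types } from 'mongoose';

const exampleOps = [
  // CREATE changes become inserts of the modified secret details
  { insertOne: { document: { secretKeyCiphertext: '...', secretValueCiphertext: '...', version: 1 } } },
  // UPDATE changes overwrite the parent secret and bump its version
  {
    updateOne: {
      filter: { _id: new Types.ObjectId() },
      update: { $set: { secretValueCiphertext: '...' }, $inc: { version: 1 } },
    },
  },
  // DELETE changes remove the parent secret by id
  { deleteOne: { filter: { _id: new Types.ObjectId() } } },
];
// await Secret.bulkWrite(exampleOps, { session });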

View File

@ -9,7 +9,6 @@ import {
import { pushKeys } from '../../helpers/key';
import { eventPushSecrets } from '../../events';
import { EventService } from '../../services';
import { ENV_SET } from '../../variables';
import { postHogClient } from '../../services';
interface PushSecret {
@ -44,7 +43,8 @@ export const pushSecrets = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
// validate environment
if (!ENV_SET.has(environment)) {
const workspaceEnvs = req.membership.workspace.environments;
if (!workspaceEnvs.find(({ slug }: { slug: string }) => slug === environment)) {
throw new Error('Failed to validate environment');
}
@ -116,7 +116,8 @@ export const pullSecrets = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
// validate environment
if (!ENV_SET.has(environment)) {
const workspaceEnvs = req.membership.workspace.environments;
if (!workspaceEnvs.find(({ slug }: { slug: string }) => slug === environment)) {
throw new Error('Failed to validate environment');
}
@ -183,7 +184,8 @@ export const pullSecretsServiceToken = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
// validate environment
if (!ENV_SET.has(environment)) {
const workspaceEnvs = req.membership.workspace.environments;
if (!workspaceEnvs.find(({ slug }: { slug: string }) => slug === environment)) {
throw new Error('Failed to validate environment');
}
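The same membership-scoped check replaces the old global ENV_SET lookup in all three handlers of this file; a minimal standalone sketch of the pattern (the type name is an assumption):

interface WorkspaceEnvironment {
  name: string;
  slug: string;
}

// True when the requested environment slug exists on the caller's workspace.
const isWorkspaceEnvironment = (
  workspaceEnvs: WorkspaceEnvironment[],
  environment: string
): boolean => workspaceEnvs.some(({ slug }) => slug === environment);

// usage, mirroring the handlers above:
// if (!isWorkspaceEnvironment(req.membership.workspace.environments, environment)) {
//   throw new Error('Failed to validate environment');
// }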

View File

@ -1,7 +1,6 @@
import { Request, Response } from 'express';
import { ServiceToken } from '../../models';
import { createToken } from '../../helpers/auth';
import { ENV_SET } from '../../variables';
import { JWT_SERVICE_SECRET } from '../../config';
/**
@ -36,7 +35,8 @@ export const createServiceToken = async (req: Request, res: Response) => {
} = req.body;
// validate environment
if (!ENV_SET.has(environment)) {
const workspaceEnvs = req.membership.workspace.environments;
if (!workspaceEnvs.find(({ slug }: { slug: string }) => slug === environment)) {
throw new Error('Failed to validate environment');
}

View File

@ -1,16 +1,13 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { NODE_ENV, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET } from '../../config';
import { User, MembershipOrg } from '../../models';
import { completeAccount } from '../../helpers/user';
import { User } from '../../models';
import { JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, INVITE_ONLY_SIGNUP } from '../../config';
import {
sendEmailVerification,
checkEmailVerification,
initializeDefaultOrg
} from '../../helpers/signup';
import { issueTokens, createToken } from '../../helpers/auth';
import { INVITED, ACCEPTED } from '../../variables';
import axios from 'axios';
import { createToken } from '../../helpers/auth';
import { BadRequestError } from '../../utils/errors';
/**
* Signup step 1: Initialize account for user under email [email] and send a verification code
@ -24,6 +21,14 @@ export const beginEmailSignup = async (req: Request, res: Response) => {
try {
email = req.body.email;
if (INVITE_ONLY_SIGNUP) {
// Only one user can create an account without being invited. The rest need to be invited in order to make an account
const userCount = await User.countDocuments({})
if (userCount != 0) {
throw BadRequestError({ message: "New user sign ups are not allowed at this time. You must be invited to sign up." })
}
}
const user = await User.findOne({ email }).select('+publicKey');
if (user && user?.publicKey) {
// case: user has already completed account
@ -103,201 +108,3 @@ export const verifyEmailSignup = async (req: Request, res: Response) => {
token
});
};
/**
* Complete setting up user by adding their personal and auth information as part of the
* signup flow
* @param req
* @param res
* @returns
*/
export const completeAccountSignup = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier,
organizationName
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || (user && user?.publicKey)) {
// case 1: user doesn't exist.
// case 2: user has already completed account
return res.status(403).send({
error: 'Failed to complete account for complete user'
});
}
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user'); // ensure user is non-null
// initialize default organization and workspace
await initializeDefaultOrg({
organizationName,
user
});
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueTokens({
userId: user._id.toString()
});
token = tokens.token;
refreshToken = tokens.refreshToken;
// sending a welcome email to new users
if (process.env.LOOPS_API_KEY) {
await axios.post("https://app.loops.so/api/v1/events/send", {
"email": email,
"eventName": "Sign Up",
"firstName": firstName,
"lastName": lastName
}, {
headers: {
"Accept": "application/json",
"Authorization": "Bearer " + process.env.LOOPS_API_KEY
},
});
}
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token,
refreshToken
});
};
/**
* Complete setting up user by adding their personal and auth information as part of the
* invite flow
* @param req
* @param res
* @returns
*/
export const completeAccountInvite = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || (user && user?.publicKey)) {
// case 1: user doesn't exist.
// case 2: user has already completed account
return res.status(403).send({
error: 'Failed to complete account for complete user'
});
}
const membershipOrg = await MembershipOrg.findOne({
inviteEmail: email,
status: INVITED
});
if (!membershipOrg) throw new Error('Failed to find invitations for email');
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user');
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueTokens({
userId: user._id.toString()
});
token = tokens.token;
refreshToken = tokens.refreshToken;
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token,
refreshToken
});
};

View File

@ -1,22 +1,23 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { Request, Response } from "express";
import * as Sentry from "@sentry/node";
import {
Workspace,
Membership,
MembershipOrg,
Integration,
IntegrationAuth,
IUser,
ServiceToken,
ServiceTokenData
} from '../../models';
Workspace,
Membership,
MembershipOrg,
Integration,
IntegrationAuth,
IUser,
ServiceToken,
ServiceTokenData,
} from "../../models";
import {
createWorkspace as create,
deleteWorkspace as deleteWork
} from '../../helpers/workspace';
import { addMemberships } from '../../helpers/membership';
import { ADMIN } from '../../variables';
createWorkspace as create,
deleteWorkspace as deleteWork,
} from "../../helpers/workspace";
import { addMemberships } from "../../helpers/membership";
import { ADMIN } from "../../variables";
import { BadRequestError, ResourceNotFound, UnauthorizedRequestError } from "../../utils/errors";
import _ from "lodash";
/**
* Return public keys of members of workspace with id [workspaceId]
* @param req
@ -24,32 +25,31 @@ import { ADMIN } from '../../variables';
* @returns
*/
export const getWorkspacePublicKeys = async (req: Request, res: Response) => {
let publicKeys;
try {
const { workspaceId } = req.params;
let publicKeys;
try {
const { workspaceId } = req.params;
publicKeys = (
await Membership.find({
workspace: workspaceId
}).populate<{ user: IUser }>('user', 'publicKey')
)
.map((member) => {
return {
publicKey: member.user.publicKey,
userId: member.user._id
};
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace member public keys'
});
}
publicKeys = (
await Membership.find({
workspace: workspaceId,
}).populate<{ user: IUser }>("user", "publicKey")
).map((member) => {
return {
publicKey: member.user.publicKey,
userId: member.user._id,
};
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspace member public keys",
});
}
return res.status(200).send({
publicKeys
});
return res.status(200).send({
publicKeys,
});
};
/**
@ -59,24 +59,24 @@ export const getWorkspacePublicKeys = async (req: Request, res: Response) => {
* @returns
*/
export const getWorkspaceMemberships = async (req: Request, res: Response) => {
let users;
try {
const { workspaceId } = req.params;
let users;
try {
const { workspaceId } = req.params;
users = await Membership.find({
workspace: workspaceId
}).populate('user', '+publicKey');
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace members'
});
}
users = await Membership.find({
workspace: workspaceId,
}).populate("user", "+publicKey");
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspace members",
});
}
return res.status(200).send({
users
});
return res.status(200).send({
users,
});
};
/**
@ -86,24 +86,24 @@ export const getWorkspaceMemberships = async (req: Request, res: Response) => {
* @returns
*/
export const getWorkspaces = async (req: Request, res: Response) => {
let workspaces;
try {
workspaces = (
await Membership.find({
user: req.user._id
}).populate('workspace')
).map((m) => m.workspace);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspaces'
});
}
let workspaces;
try {
workspaces = (
await Membership.find({
user: req.user._id,
}).populate("workspace")
).map((m) => m.workspace);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspaces",
});
}
return res.status(200).send({
workspaces
});
return res.status(200).send({
workspaces,
});
};
/**
@ -113,24 +113,24 @@ export const getWorkspaces = async (req: Request, res: Response) => {
* @returns
*/
export const getWorkspace = async (req: Request, res: Response) => {
let workspace;
try {
const { workspaceId } = req.params;
let workspace;
try {
const { workspaceId } = req.params;
workspace = await Workspace.findOne({
_id: workspaceId
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace'
});
}
workspace = await Workspace.findOne({
_id: workspaceId,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspace",
});
}
return res.status(200).send({
workspace
});
return res.status(200).send({
workspace,
});
};
/**
@ -141,46 +141,46 @@ export const getWorkspace = async (req: Request, res: Response) => {
* @returns
*/
export const createWorkspace = async (req: Request, res: Response) => {
let workspace;
try {
const { workspaceName, organizationId } = req.body;
let workspace;
try {
const { workspaceName, organizationId } = req.body;
// validate organization membership
const membershipOrg = await MembershipOrg.findOne({
user: req.user._id,
organization: organizationId
});
// validate organization membership
const membershipOrg = await MembershipOrg.findOne({
user: req.user._id,
organization: organizationId,
});
if (!membershipOrg) {
throw new Error('Failed to validate organization membership');
}
if (!membershipOrg) {
throw new Error("Failed to validate organization membership");
}
if (workspaceName.length < 1) {
throw new Error('Workspace names must be at least 1-character long');
}
if (workspaceName.length < 1) {
throw new Error("Workspace names must be at least 1-character long");
}
// create workspace and add user as member
workspace = await create({
name: workspaceName,
organizationId
});
// create workspace and add user as member
workspace = await create({
name: workspaceName,
organizationId,
});
await addMemberships({
userIds: [req.user._id],
workspaceId: workspace._id.toString(),
roles: [ADMIN]
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to create workspace'
});
}
await addMemberships({
userIds: [req.user._id],
workspaceId: workspace._id.toString(),
roles: [ADMIN],
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to create workspace",
});
}
return res.status(200).send({
workspace
});
return res.status(200).send({
workspace,
});
};
/**
@ -190,24 +190,24 @@ export const createWorkspace = async (req: Request, res: Response) => {
* @returns
*/
export const deleteWorkspace = async (req: Request, res: Response) => {
try {
const { workspaceId } = req.params;
try {
const { workspaceId } = req.params;
// delete workspace
await deleteWork({
id: workspaceId
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to delete workspace'
});
}
// delete workspace
await deleteWork({
id: workspaceId,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to delete workspace",
});
}
return res.status(200).send({
message: 'Successfully deleted workspace'
});
return res.status(200).send({
message: "Successfully deleted workspace",
});
};
/**
@ -217,34 +217,34 @@ export const deleteWorkspace = async (req: Request, res: Response) => {
* @returns
*/
export const changeWorkspaceName = async (req: Request, res: Response) => {
let workspace;
try {
const { workspaceId } = req.params;
const { name } = req.body;
let workspace;
try {
const { workspaceId } = req.params;
const { name } = req.body;
workspace = await Workspace.findOneAndUpdate(
{
_id: workspaceId
},
{
name
},
{
new: true
}
);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to change workspace name'
});
}
workspace = await Workspace.findOneAndUpdate(
{
_id: workspaceId,
},
{
name,
},
{
new: true,
}
);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to change workspace name",
});
}
return res.status(200).send({
message: 'Successfully changed workspace name',
workspace
});
return res.status(200).send({
message: "Successfully changed workspace name",
workspace,
});
};
/**
@ -254,24 +254,24 @@ export const changeWorkspaceName = async (req: Request, res: Response) => {
* @returns
*/
export const getWorkspaceIntegrations = async (req: Request, res: Response) => {
let integrations;
try {
const { workspaceId } = req.params;
let integrations;
try {
const { workspaceId } = req.params;
integrations = await Integration.find({
workspace: workspaceId
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace integrations'
});
}
integrations = await Integration.find({
workspace: workspaceId,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspace integrations",
});
}
return res.status(200).send({
integrations
});
return res.status(200).send({
integrations,
});
};
/**
@ -281,56 +281,162 @@ export const getWorkspaceIntegrations = async (req: Request, res: Response) => {
* @returns
*/
export const getWorkspaceIntegrationAuthorizations = async (
req: Request,
res: Response
req: Request,
res: Response
) => {
let authorizations;
try {
const { workspaceId } = req.params;
let authorizations;
try {
const { workspaceId } = req.params;
authorizations = await IntegrationAuth.find({
workspace: workspaceId
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace integration authorizations'
});
}
authorizations = await IntegrationAuth.find({
workspace: workspaceId,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspace integration authorizations",
});
}
return res.status(200).send({
authorizations
});
return res.status(200).send({
authorizations,
});
};
export const addApproverForWorkspaceAndEnvironment = async (
req: Request,
res: Response
) => {
interface Approver {
environment: string;
userId: string;
}
const { workspaceId } = req.params;
const { approvers }: { approvers: Approver[] } = req.body;
const workspaceFromDB = await Workspace.findById(workspaceId)
if (!workspaceFromDB) {
throw ResourceNotFound()
}
const allAvailableWorkspaceEnvironments = _.map(workspaceFromDB.environments, 'slug');
const environmentsFromApprovers = _.map(approvers, "environment")
// keep only environments that actually exist on this workspace
const validApproverEnvironments = environmentsFromApprovers.filter(environment => allAvailableWorkspaceEnvironments.includes(environment))
// validate environments
if (validApproverEnvironments.length != environmentsFromApprovers.length) {
const err = `One or more environments set for approver(s) are invalid`
throw BadRequestError({ message: err })
}
const approverIds = _.map(approvers, "userId")
// validate approvers membership
const approversMemberships = await Membership.find({
workspace: workspaceId,
user: { $in: approverIds }
})
if (!approversMemberships) {
throw ResourceNotFound()
}
if (approversMemberships.length != approverIds.length) {
throw UnauthorizedRequestError({ message: "Approvers must be apart of the workspace they are being added to" })
}
const updatedWorkspace = await Workspace.findByIdAndUpdate(workspaceId,
{
$addToSet: {
approvers: {
$each: approvers,
}
}
}, { new: true })
return res.json(updatedWorkspace)
};
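A hedged sketch of the body this endpoint consumes (the route path is not part of this diff and the ids are placeholders). Because the update uses $addToSet with $each, re-submitting an identical environment/user pair is intended not to create duplicate approver entries.

// Hypothetical body for addApproverForWorkspaceAndEnvironment.
const addApproversBody: { approvers: { environment: string; userId: string }[] } = {
  approvers: [
    { environment: 'prod', userId: '<member user id>' },    // placeholder id of a workspace member
    { environment: 'staging', userId: '<member user id>' }, // each environment must exist on the workspace
  ],
};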
export const removeApproverForWorkspaceAndEnvironment = async (
req: Request,
res: Response
) => {
interface Approver {
environment: string;
userId: string;
}
const { workspaceId } = req.params;
const { approvers }: { approvers: Approver[] } = req.body;
const workspaceFromDB = await Workspace.findById(workspaceId)
if (!workspaceFromDB) {
throw ResourceNotFound()
}
const allAvailableWorkspaceEnvironments = _.map(workspaceFromDB.environments, 'slug');
const environmentsFromApprovers = _.map(approvers, "environment")
// keep only environments that actually exist on this workspace
const validApproverEnvironments = environmentsFromApprovers.filter(environment => allAvailableWorkspaceEnvironments.includes(environment))
// validate environments
if (validApproverEnvironments.length != environmentsFromApprovers.length) {
const err = `One or more environments set for approver(s) are invalid`
throw BadRequestError({ message: err })
}
const approverIds = _.map(approvers, "userId")
// validate approvers membership
const approversMemberships = await Membership.find({
workspace: workspaceId,
user: { $in: approverIds }
})
if (!approversMemberships) {
throw ResourceNotFound()
}
if (approversMemberships.length != approverIds.length) {
throw UnauthorizedRequestError({ message: "Approvers must be apart of the workspace they are being added to" })
}
const updatedWorkspace = await Workspace.findByIdAndUpdate(workspaceId, { $pullAll: { approvers: approvers } }, { new: true })
return res.json(updatedWorkspace)
};
/**
* Return service tokens for workspace [workspaceId] belonging to user
* @param req
* @param res
* @returns
* @param req
* @param res
* @returns
*/
export const getWorkspaceServiceTokens = async (
req: Request,
res: Response
req: Request,
res: Response
) => {
let serviceTokens;
try {
const { workspaceId } = req.params;
let serviceTokens;
try {
const { workspaceId } = req.params;
// ?? FIX.
serviceTokens = await ServiceToken.find({
user: req.user._id,
workspace: workspaceId,
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get workspace service tokens",
});
}
serviceTokens = await ServiceToken.find({
user: req.user._id,
workspace: workspaceId
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace service tokens'
});
}
return res.status(200).send({
serviceTokens
});
}
return res.status(200).send({
serviceTokens,
});
};

View File

@ -0,0 +1,351 @@
/* eslint-disable @typescript-eslint/no-var-requires */
import { Request, Response } from 'express';
import jwt from 'jsonwebtoken';
import * as Sentry from '@sentry/node';
import * as bigintConversion from 'bigint-conversion';
const jsrp = require('jsrp');
import { User, LoginSRPDetail } from '../../models';
import { issueAuthTokens, createToken } from '../../helpers/auth';
import { checkUserDevice } from '../../helpers/user';
import { sendMail } from '../../helpers/nodemailer';
import { TokenService } from '../../services';
import { EELogService } from '../../ee/services';
import {
NODE_ENV,
JWT_MFA_LIFETIME,
JWT_MFA_SECRET
} from '../../config';
import { BadRequestError, InternalServerError } from '../../utils/errors';
import {
TOKEN_EMAIL_MFA,
ACTION_LOGIN
} from '../../variables';
import { getChannelFromUserAgent } from '../../utils/posthog'; // TODO: move this
declare module 'jsonwebtoken' {
export interface UserIDJwtPayload extends jwt.JwtPayload {
userId: string;
}
}
const clientPublicKeys: any = {};
/**
* Log in user step 1: Return [salt] and [serverPublicKey] as part of step 1 of SRP protocol
* @param req
* @param res
* @returns
*/
export const login1 = async (req: Request, res: Response) => {
try {
const {
email,
clientPublicKey
}: { email: string; clientPublicKey: string } = req.body;
const user = await User.findOne({
email
}).select('+salt +verifier');
if (!user) throw new Error('Failed to find user');
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier
},
async () => {
// generate server-side public key
const serverPublicKey = server.getPublicKey();
await LoginSRPDetail.findOneAndReplace({ email: email }, {
email: email,
clientPublicKey: clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt),
}, { upsert: true, returnNewDocument: false });
return res.status(200).send({
serverPublicKey,
salt: user.salt
});
}
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to start authentication process'
});
}
};
/**
* Log in user step 2: complete step 2 of SRP protocol and return token and their (encrypted)
* private key
* @param req
* @param res
* @returns
*/
export const login2 = async (req: Request, res: Response) => {
try {
if (!req.headers['user-agent']) throw InternalServerError({ message: 'User-Agent header is required' });
const { email, clientProof } = req.body;
const user = await User.findOne({
email
}).select('+salt +verifier +encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +publicKey +encryptedPrivateKey +iv +tag');
if (!user) throw new Error('Failed to find user');
const loginSRPDetail = await LoginSRPDetail.findOneAndDelete({ email: email })
if (!loginSRPDetail) {
throw BadRequestError({ message: "Failed to find login details for SRP" })
}
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier,
b: loginSRPDetail.serverBInt
},
async () => {
server.setClientPublicKey(loginSRPDetail.clientPublicKey);
// compare server and client shared keys
if (server.checkClientProof(clientProof)) {
if (user.isMfaEnabled) {
// case: user has MFA enabled
// generate temporary MFA token
const token = createToken({
payload: {
userId: user._id.toString()
},
expiresIn: JWT_MFA_LIFETIME,
secret: JWT_MFA_SECRET
});
const code = await TokenService.createToken({
type: TOKEN_EMAIL_MFA,
email
});
// send MFA code [code] to [email]
await sendMail({
template: 'emailMfa.handlebars',
subjectLine: 'Infisical MFA code',
recipients: [email],
substitutions: {
code
}
});
return res.status(200).send({
mfaEnabled: true,
token
});
}
await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});
// issue tokens
const tokens = await issueAuthTokens({ userId: user._id.toString() });
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
// case: user does not have MFA enabled
// return (access) token in response
interface ResponseData {
mfaEnabled: boolean;
encryptionVersion: any;
protectedKey?: string;
protectedKeyIV?: string;
protectedKeyTag?: string;
token: string;
publicKey?: string;
encryptedPrivateKey?: string;
iv?: string;
tag?: string;
}
const response: ResponseData = {
mfaEnabled: false,
encryptionVersion: user.encryptionVersion,
token: tokens.token,
publicKey: user.publicKey,
encryptedPrivateKey: user.encryptedPrivateKey,
iv: user.iv,
tag: user.tag
}
if (
user?.protectedKey &&
user?.protectedKeyIV &&
user?.protectedKeyTag
) {
response.protectedKey = user.protectedKey;
response.protectedKeyIV = user.protectedKeyIV
response.protectedKeyTag = user.protectedKeyTag;
}
const loginAction = await EELogService.createAction({
name: ACTION_LOGIN,
userId: user._id
});
loginAction && await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getChannelFromUserAgent(req.headers['user-agent']),
ipAddress: req.ip
});
return res.status(200).send(response);
}
return res.status(400).send({
message: 'Failed to authenticate. Try again?'
});
}
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to authenticate. Try again?'
});
}
};
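For context, a rough sketch of how a client might consume the two branches of login2: when the response carries mfaEnabled, the emailed code plus the temporary token are sent to a verify-MFA endpoint; otherwise the key material in the response is used directly. The endpoint paths and the Authorization header usage are assumptions, not taken from this diff.

// Hypothetical client flow around login2 / verifyMfaToken (paths assumed).
async function finishLogin(email: string, clientProof: string, promptForCode: () => Promise<string>) {
  const loginRes = await fetch('/api/v1/auth/login2', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email, clientProof }),
  });
  const loginData = await loginRes.json();

  if (loginData.mfaEnabled) {
    // MFA branch: the server e-mailed a code and returned a short-lived MFA token.
    const mfaToken = await promptForCode();
    const verifyRes = await fetch('/api/v2/auth/mfa/verify', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${loginData.token}`, // temporary MFA JWT issued by login2
      },
      body: JSON.stringify({ email, mfaToken }),
    });
    return verifyRes.json(); // same key-material shape as the non-MFA branch
  }
  return loginData; // token, publicKey, encryptedPrivateKey, iv, tag, ...
}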
/**
* Send MFA token to email [email]
* @param req
* @param res
*/
export const sendMfaToken = async (req: Request, res: Response) => {
try {
const { email } = req.body;
const code = await TokenService.createToken({
type: TOKEN_EMAIL_MFA,
email
});
// send MFA code [code] to [email]
await sendMail({
template: 'emailMfa.handlebars',
subjectLine: 'Infisical MFA code',
recipients: [email],
substitutions: {
code
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to send MFA code'
});
}
return res.status(200).send({
message: 'Successfully sent new MFA code'
});
}
/**
* Verify MFA token [mfaToken] and issue JWT and refresh tokens if the
* MFA token [mfaToken] is valid
* @param req
* @param res
*/
export const verifyMfaToken = async (req: Request, res: Response) => {
const { email, mfaToken } = req.body;
await TokenService.validateToken({
type: TOKEN_EMAIL_MFA,
email,
token: mfaToken
});
const user = await User.findOne({
email
}).select('+salt +verifier +encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +publicKey +encryptedPrivateKey +iv +tag');
if (!user) throw new Error('Failed to find user');
await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});
// issue tokens
const tokens = await issueAuthTokens({ userId: user._id.toString() });
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
interface VerifyMfaTokenRes {
encryptionVersion: number;
protectedKey?: string;
protectedKeyIV?: string;
protectedKeyTag?: string;
token: string;
publicKey: string;
encryptedPrivateKey: string;
iv: string;
tag: string;
}
const resObj: VerifyMfaTokenRes = {
encryptionVersion: user.encryptionVersion,
token: tokens.token,
publicKey: user.publicKey as string,
encryptedPrivateKey: user.encryptedPrivateKey as string,
iv: user.iv as string,
tag: user.tag as string
}
if (user?.protectedKey && user?.protectedKeyIV && user?.protectedKeyTag) {
resObj.protectedKey = user.protectedKey;
resObj.protectedKeyIV = user.protectedKeyIV;
resObj.protectedKeyTag = user.protectedKeyTag;
}
const loginAction = await EELogService.createAction({
name: ACTION_LOGIN,
userId: user._id
});
loginAction && await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getChannelFromUserAgent(req.headers['user-agent']),
ipAddress: req.ip
});
return res.status(200).send(resObj);
}

View File

@ -0,0 +1,262 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import {
Secret,
ServiceToken,
Workspace,
Integration,
ServiceTokenData,
Membership,
} from '../../models';
import { SecretVersion } from '../../ee/models';
import { BadRequestError } from '../../utils/errors';
import _ from 'lodash';
import { ABILITY_READ, ABILITY_WRITE } from '../../variables/organization';
/**
* Create new workspace environment named [environmentName] under workspace with id
* @param req
* @param res
* @returns
*/
export const createWorkspaceEnvironment = async (
req: Request,
res: Response
) => {
const { workspaceId } = req.params;
const { environmentName, environmentSlug } = req.body;
try {
const workspace = await Workspace.findById(workspaceId).exec();
if (
!workspace ||
workspace?.environments.find(
({ name, slug }) => slug === environmentSlug || environmentName === name
)
) {
throw new Error('Failed to create workspace environment');
}
workspace?.environments.push({
name: environmentName,
slug: environmentSlug.toLowerCase(),
});
await workspace.save();
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to create new workspace environment',
});
}
return res.status(200).send({
message: 'Successfully created new environment',
workspace: workspaceId,
environment: {
name: environmentName,
slug: environmentSlug,
},
});
};
/**
* Rename workspace environment with new name and slug of a workspace with [workspaceId]
* Old slug [oldEnvironmentSlug] must be provided
* @param req
* @param res
* @returns
*/
export const renameWorkspaceEnvironment = async (
req: Request,
res: Response
) => {
const { workspaceId } = req.params;
const { environmentName, environmentSlug, oldEnvironmentSlug } = req.body;
try {
// user should pass both new slug and env name
if (!environmentSlug || !environmentName) {
throw new Error('Invalid environment given.');
}
// atomic update the env to avoid conflict
const workspace = await Workspace.findById(workspaceId).exec();
if (!workspace) {
throw new Error('Failed to find workspace');
}
const isEnvExist = workspace.environments.some(
({ name, slug }) =>
slug !== oldEnvironmentSlug &&
(name === environmentName || slug === environmentSlug)
);
if (isEnvExist) {
throw new Error('Invalid environment given');
}
const envIndex = workspace?.environments.findIndex(
({ slug }) => slug === oldEnvironmentSlug
);
if (envIndex === -1) {
throw new Error('Invalid environment given');
}
workspace.environments[envIndex].name = environmentName;
workspace.environments[envIndex].slug = environmentSlug.toLowerCase();
await workspace.save();
await Secret.updateMany(
{ workspace: workspaceId, environment: oldEnvironmentSlug },
{ environment: environmentSlug }
);
await SecretVersion.updateMany(
{ workspace: workspaceId, environment: oldEnvironmentSlug },
{ environment: environmentSlug }
);
await ServiceToken.updateMany(
{ workspace: workspaceId, environment: oldEnvironmentSlug },
{ environment: environmentSlug }
);
await ServiceTokenData.updateMany(
{ workspace: workspaceId, environment: oldEnvironmentSlug },
{ environment: environmentSlug }
);
await Integration.updateMany(
{ workspace: workspaceId, environment: oldEnvironmentSlug },
{ environment: environmentSlug }
);
await Membership.updateMany(
{
workspace: workspaceId,
"deniedPermissions.environmentSlug": oldEnvironmentSlug
},
{ $set: { "deniedPermissions.$[element].environmentSlug": environmentSlug } },
{ arrayFilters: [{ "element.environmentSlug": oldEnvironmentSlug }] }
)
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to update workspace environment',
});
}
return res.status(200).send({
message: 'Successfully updated environment',
workspace: workspaceId,
environment: {
name: environmentName,
slug: environmentSlug,
},
});
};
/**
* Delete workspace environment by [environmentSlug] of workspace [workspaceId] and do the clean up
* @param req
* @param res
* @returns
*/
export const deleteWorkspaceEnvironment = async (
req: Request,
res: Response
) => {
const { workspaceId } = req.params;
const { environmentSlug } = req.body;
try {
// atomic update the env to avoid conflict
const workspace = await Workspace.findById(workspaceId).exec();
if (!workspace) {
throw new Error('Failed to find workspace');
}
const envIndex = workspace?.environments.findIndex(
({ slug }) => slug === environmentSlug
);
if (envIndex === -1) {
throw new Error('Invalid environment given');
}
workspace.environments.splice(envIndex, 1);
await workspace.save();
// clean up
await Secret.deleteMany({
workspace: workspaceId,
environment: environmentSlug,
});
await SecretVersion.deleteMany({
workspace: workspaceId,
environment: environmentSlug,
});
await ServiceToken.deleteMany({
workspace: workspaceId,
environment: environmentSlug,
});
await ServiceTokenData.deleteMany({
workspace: workspaceId,
environment: environmentSlug,
});
await Integration.deleteMany({
workspace: workspaceId,
environment: environmentSlug,
});
await Membership.updateMany(
{ workspace: workspaceId },
{ $pull: { deniedPermissions: { environmentSlug: environmentSlug } } }
)
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to delete workspace environment',
});
}
return res.status(200).send({
message: 'Successfully deleted environment',
workspace: workspaceId,
environment: environmentSlug,
});
};
export const getAllAccessibleEnvironmentsOfWorkspace = async (
req: Request,
res: Response
) => {
const { workspaceId } = req.params;
const workspacesUserIsMemberOf = await Membership.findOne({
workspace: workspaceId,
user: req.user
})
if (!workspacesUserIsMemberOf) {
throw BadRequestError()
}
const accessibleEnvironments: any = []
const deniedPermission = workspacesUserIsMemberOf.deniedPermissions
const relatedWorkspace = await Workspace.findById(workspaceId)
if (!relatedWorkspace) {
throw BadRequestError()
}
relatedWorkspace.environments.forEach(environment => {
const isReadBlocked = _.some(deniedPermission, { environmentSlug: environment.slug, ability: ABILITY_READ })
const isWriteBlocked = _.some(deniedPermission, { environmentSlug: environment.slug, ability: ABILITY_WRITE })
if (isReadBlocked && isWriteBlocked) {
return
} else {
accessibleEnvironments.push({
name: environment.name,
slug: environment.slug,
isWriteDenied: isWriteBlocked,
isReadDenied: isReadBlocked
})
}
})
res.json({ accessibleEnvironments })
};
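The accessibility rule above comes down to: an environment is hidden only when both read and write are denied, otherwise it is returned with per-ability flags. A small standalone restatement (local types and the 'read'/'write' literals stand in for the models and the ABILITY_READ/ABILITY_WRITE constants):

import _ from 'lodash';

type Ability = 'read' | 'write'; // stand-ins for ABILITY_READ / ABILITY_WRITE

interface DeniedPermission {
  environmentSlug: string;
  ability: Ability;
}

interface WorkspaceEnvironment {
  name: string;
  slug: string;
}

const toAccessibleEnvironments = (environments: WorkspaceEnvironment[], denied: DeniedPermission[]) =>
  environments.flatMap((env) => {
    const isReadDenied = _.some(denied, { environmentSlug: env.slug, ability: 'read' });
    const isWriteDenied = _.some(denied, { environmentSlug: env.slug, ability: 'write' });
    // drop the environment only when the user can neither read nor write it
    if (isReadDenied && isWriteDenied) return [];
    return [{ name: env.name, slug: env.slug, isReadDenied, isWriteDenied }];
  });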

View File

@ -1,13 +1,25 @@
import * as authController from './authController';
import * as signupController from './signupController';
import * as usersController from './usersController';
import * as organizationsController from './organizationsController';
import * as workspaceController from './workspaceController';
import * as serviceTokenDataController from './serviceTokenDataController';
import * as apiKeyDataController from './apiKeyDataController';
import * as secretController from './secretController';
import * as secretsController from './secretsController';
import * as environmentController from './environmentController';
import * as tagController from './tagController';
export {
authController,
signupController,
usersController,
organizationsController,
workspaceController,
serviceTokenDataController,
apiKeyDataController,
secretController,
secretsController
secretsController,
environmentController,
tagController
}

View File

@ -0,0 +1,296 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import {
MembershipOrg,
Membership,
Workspace
} from '../../models';
import { deleteMembershipOrg } from '../../helpers/membershipOrg';
import { updateSubscriptionOrgQuantity } from '../../helpers/organization';
/**
* Return memberships for organization with id [organizationId]
* @param req
* @param res
*/
export const getOrganizationMemberships = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return organization memberships'
#swagger.description = 'Return organization memberships'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['organizationId'] = {
"description": "ID of organization",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"memberships": {
"type": "array",
"items": {
$ref: "#/components/schemas/MembershipOrg"
},
"description": "Memberships of organization"
}
}
}
}
}
}
*/
let memberships;
try {
const { organizationId } = req.params;
memberships = await MembershipOrg.find({
organization: organizationId
}).populate('user', '+publicKey');
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get organization memberships'
});
}
return res.status(200).send({
memberships
});
}
/**
* Update role of membership with id [membershipId] to role [role]
* @param req
* @param res
*/
export const updateOrganizationMembership = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Update organization membership'
#swagger.description = 'Update organization membership'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['organizationId'] = {
"description": "ID of organization",
"required": true,
"type": "string"
}
#swagger.parameters['membershipId'] = {
"description": "ID of organization membership to update",
"required": true,
"type": "string"
}
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"role": {
"type": "string",
"description": "Role of organization membership - either owner, admin, or member",
}
}
}
}
}
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"membership": {
$ref: "#/components/schemas/MembershipOrg",
"description": "Updated organization membership"
}
}
}
}
}
}
*/
let membership;
try {
const { membershipId } = req.params;
const { role } = req.body;
membership = await MembershipOrg.findByIdAndUpdate(
membershipId,
{
role
}, {
new: true
}
);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to update organization membership'
});
}
return res.status(200).send({
membership
});
}
/**
* Delete organization membership with id [membershipId]
* @param req
* @param res
* @returns
*/
export const deleteOrganizationMembership = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Delete organization membership'
#swagger.description = 'Delete organization membership'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['organizationId'] = {
"description": "ID of organization",
"required": true,
"type": "string"
}
#swagger.parameters['membershipId'] = {
"description": "ID of organization membership to delete",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"membership": {
$ref: "#/components/schemas/MembershipOrg",
"description": "Deleted organization membership"
}
}
}
}
}
}
*/
let membership;
try {
const { membershipId } = req.params;
// delete organization membership
membership = await deleteMembershipOrg({
membershipOrgId: membershipId
});
await updateSubscriptionOrgQuantity({
organizationId: membership.organization.toString()
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to delete organization membership'
});
}
return res.status(200).send({
membership
});
}
/**
* Return workspaces for organization with id [organizationId] that user has
* access to
* @param req
* @param res
*/
export const getOrganizationWorkspaces = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return projects in organization that user is part of'
#swagger.description = 'Return projects in organization that user is part of'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['organizationId'] = {
"description": "ID of organization",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"workspaces": {
"type": "array",
"items": {
$ref: "#/components/schemas/Project"
},
"description": "Projects of organization"
}
}
}
}
}
}
*/
let workspaces;
try {
const { organizationId } = req.params;
const workspacesSet = new Set(
(
await Workspace.find(
{
organization: organizationId
},
'_id'
)
).map((w) => w._id.toString())
);
workspaces = (
await Membership.find({
user: req.user._id
}).populate('workspace')
)
.filter((m) => workspacesSet.has(m.workspace._id.toString()))
.map((m) => m.workspace);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get organization workspaces'
});
}
return res.status(200).send({
workspaces
});
}
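The workspace filtering above is a straightforward set intersection between the organization's workspaces and the caller's memberships; restated in isolation:

// Keep only the caller's workspaces that belong to the organization.
const intersectWorkspaceIds = (orgWorkspaceIds: string[], membershipWorkspaceIds: string[]): string[] => {
  const orgSet = new Set(orgWorkspaceIds);
  return membershipWorkspaceIds.filter((id) => orgSet.has(id));
};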

View File

@ -2,6 +2,7 @@ import to from 'await-to-js';
import { Types } from 'mongoose';
import { Request, Response } from 'express';
import { ISecret, Secret } from '../../models';
import { IAction } from '../../ee/models';
import {
SECRET_PERSONAL,
SECRET_SHARED,
@ -10,12 +11,262 @@ import {
ACTION_UPDATE_SECRETS,
ACTION_DELETE_SECRETS
} from '../../variables';
import { ValidationError } from '../../utils/errors';
import { UnauthorizedRequestError, ValidationError } from '../../utils/errors';
import { EventService } from '../../services';
import { eventPushSecrets } from '../../events';
import { EESecretService, EELogService } from '../../ee/services';
import { postHogClient } from '../../services';
import { BadRequestError } from '../../utils/errors';
import { getChannelFromUserAgent } from '../../utils/posthog';
import { ABILITY_READ, ABILITY_WRITE } from '../../variables/organization';
import { userHasNoAbility, userHasWorkspaceAccess, userHasWriteOnlyAbility } from '../../ee/helpers/checkMembershipPermissions';
import Tag from '../../models/tag';
import _ from 'lodash';
import {
BatchSecretRequest,
BatchSecret
} from '../../types/secret';
/**
* Perform a batch of any specified CUD secret operations
* @param req
* @param res
*/
export const batchSecrets = async (req: Request, res: Response) => {
const channel = getChannelFromUserAgent(req.headers['user-agent']);
const {
workspaceId,
environment,
requests
}: {
workspaceId: string;
environment: string;
requests: BatchSecretRequest[];
} = req.body;
const createSecrets: BatchSecret[] = [];
const updateSecrets: BatchSecret[] = [];
const deleteSecrets: Types.ObjectId[] = [];
const actions: IAction[] = [];
requests.forEach((request) => {
switch (request.method) {
case 'POST':
createSecrets.push({
...request.secret,
version: 1,
user: request.secret.type === SECRET_PERSONAL ? req.user : undefined,
environment,
workspace: new Types.ObjectId(workspaceId)
});
break;
case 'PATCH':
updateSecrets.push({
...request.secret,
_id: new Types.ObjectId(request.secret._id)
});
break;
case 'DELETE':
deleteSecrets.push(new Types.ObjectId(request.secret._id));
break;
}
});
// handle create secrets
let createdSecrets: ISecret[] = [];
if (createSecrets.length > 0) {
createdSecrets = await Secret.insertMany(createSecrets);
// (EE) add secret versions for new secrets
await EESecretService.addSecretVersions({
secretVersions: createdSecrets.map((n: any) => {
return ({
...n._doc,
_id: new Types.ObjectId(),
secret: n._id,
isDeleted: false
});
})
});
const addAction = await EELogService.createAction({
name: ACTION_ADD_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: createdSecrets.map((n) => n._id)
}) as IAction;
actions.push(addAction);
if (postHogClient) {
postHogClient.capture({
event: 'secrets added',
distinctId: req.user.email,
properties: {
numberOfSecrets: createdSecrets.length,
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
}
});
}
}
// handle update secrets
let updatedSecrets: ISecret[] = [];
if (updateSecrets.length > 0 && req.secrets) {
// construct object containing all secrets
let listedSecretsObj: {
[key: string]: {
version: number;
type: string;
}
} = {};
listedSecretsObj = req.secrets.reduce((obj: any, secret: ISecret) => ({
...obj,
[secret._id.toString()]: secret
}), {});
const updateOperations = updateSecrets.map((u) => ({
updateOne: {
filter: { _id: new Types.ObjectId(u._id) },
update: {
$inc: {
version: 1
},
...u,
_id: new Types.ObjectId(u._id)
}
}
}));
await Secret.bulkWrite(updateOperations);
const secretVersions = updateSecrets.map((u) => ({
secret: new Types.ObjectId(u._id),
version: listedSecretsObj[u._id.toString()].version,
workspace: new Types.ObjectId(workspaceId),
type: listedSecretsObj[u._id.toString()].type,
environment,
isDeleted: false,
secretKeyCiphertext: u.secretKeyCiphertext,
secretKeyIV: u.secretKeyIV,
secretKeyTag: u.secretKeyTag,
secretValueCiphertext: u.secretValueCiphertext,
secretValueIV: u.secretValueIV,
secretValueTag: u.secretValueTag,
secretCommentCiphertext: u.secretCommentCiphertext,
secretCommentIV: u.secretCommentIV,
secretCommentTag: u.secretCommentTag,
tags: u.tags
}));
await EESecretService.addSecretVersions({
secretVersions
});
updatedSecrets = await Secret.find({
_id: {
$in: updateSecrets.map((u) => new Types.ObjectId(u._id))
}
});
const updateAction = await EELogService.createAction({
name: ACTION_UPDATE_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: updatedSecrets.map((u) => u._id)
}) as IAction;
actions.push(updateAction);
if (postHogClient) {
postHogClient.capture({
event: 'secrets modified',
distinctId: req.user.email,
properties: {
numberOfSecrets: updateSecrets.length,
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
}
});
}
}
// handle delete secrets
if (deleteSecrets.length > 0) {
await Secret.deleteMany({
_id: {
$in: deleteSecrets
}
});
await EESecretService.markDeletedSecretVersions({
secretIds: deleteSecrets
});
const deleteAction = await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: deleteSecrets
}) as IAction;
actions.push(deleteAction);
if (postHogClient) {
postHogClient.capture({
event: 'secrets deleted',
distinctId: req.user.email,
properties: {
numberOfSecrets: deleteSecrets.length,
environment,
workspaceId,
channel: channel,
userAgent: req.headers?.['user-agent']
}
});
}
}
if (actions.length > 0) {
// (EE) create (audit) log
await EELogService.createLog({
userId: req.user._id.toString(),
workspaceId: new Types.ObjectId(workspaceId),
actions,
channel,
ipAddress: req.ip
});
}
    // trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId
})
});
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId
});
const resObj: { [key: string]: ISecret[] | string[] } = {}
if (createSecrets.length > 0) {
resObj['createdSecrets'] = createdSecrets;
}
if (updateSecrets.length > 0) {
resObj['updatedSecrets'] = updatedSecrets;
}
if (deleteSecrets.length > 0) {
resObj['deletedSecrets'] = deleteSecrets.map((d) => d.toString());
}
return res.status(200).send(resObj);
}
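For orientation, a minimal sketch of the request body this batch handler consumes, with one entry per CUD method; the ObjectId strings and ciphertext fields below are placeholders, not values taken from the diff:

const batchPayload = {
  workspaceId: '63f0b9f3e1c2a4d5b6a7c8d9', // placeholder ObjectId string
  environment: 'dev',
  requests: [
    {
      method: 'POST',
      secret: {
        type: 'shared', // a personal secret would additionally attach req.user above
        secretKeyCiphertext: '<ciphertext>', secretKeyIV: '<iv>', secretKeyTag: '<tag>',
        secretValueCiphertext: '<ciphertext>', secretValueIV: '<iv>', secretValueTag: '<tag>',
        tags: []
      }
    },
    {
      method: 'PATCH',
      secret: {
        _id: '63f0b9f3e1c2a4d5b6a7c8da', // existing secret whose version gets bumped
        secretValueCiphertext: '<ciphertext>', secretValueIV: '<iv>', secretValueTag: '<tag>'
      }
    },
    {
      method: 'DELETE',
      secret: { _id: '63f0b9f3e1c2a4d5b6a7c8db' } // only the id is needed for deletion
    }
  ]
};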
/**
* Create secret(s) for workspace with id [workspaceId] and environment [environment]
@ -23,20 +274,92 @@ import { BadRequestError } from '../../utils/errors';
* @param res
*/
export const createSecrets = async (req: Request, res: Response) => {
const channel = req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli';
const { workspaceId, environment } = req.body;
/*
#swagger.summary = 'Create new secret(s)'
#swagger.description = 'Create one or many secrets for a given project and environment.'
#swagger.security = [{
"apiKeyAuth": []
}]
let toAdd;
if (Array.isArray(req.body.secrets)) {
// case: create multiple secrets
toAdd = req.body.secrets;
} else if (typeof req.body.secrets === 'object') {
// case: create 1 secret
toAdd = [req.body.secrets];
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"workspaceId": {
"type": "string",
"description": "ID of project",
},
"environment": {
"type": "string",
"description": "Environment within project"
},
"secrets": {
$ref: "#/components/schemas/CreateSecret",
"description": "Secret(s) to create - object or array of objects"
}
}
}
}
}
}
const newSecrets = await Secret.insertMany(
toAdd.map(({
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"secrets": {
"type": "array",
"items": {
$ref: "#/components/schemas/Secret"
},
"description": "Newly-created secrets for the given project and environment"
}
}
}
}
}
}
*/
const channel = getChannelFromUserAgent(req.headers['user-agent'])
const { workspaceId, environment }: { workspaceId: string, environment: string } = req.body;
const hasAccess = await userHasWorkspaceAccess(req.user, workspaceId, environment, ABILITY_WRITE)
if (!hasAccess) {
    throw UnauthorizedRequestError({ message: "You do not have the necessary permission(s) to perform this action" })
}
let listOfSecretsToCreate;
if (Array.isArray(req.body.secrets)) {
// case: create multiple secrets
listOfSecretsToCreate = req.body.secrets;
} else if (typeof req.body.secrets === 'object') {
// case: create 1 secret
listOfSecretsToCreate = [req.body.secrets];
}
type secretsToCreateType = {
type: string;
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretCommentCiphertext: string;
secretCommentIV: string;
secretCommentTag: string;
tags: string[]
}
const newlyCreatedSecrets = await Secret.insertMany(
listOfSecretsToCreate.map(({
type,
secretKeyCiphertext,
secretKeyIV,
@ -44,32 +367,43 @@ export const createSecrets = async (req: Request, res: Response) => {
secretValueCiphertext,
secretValueIV,
secretValueTag,
}: {
type: string;
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
}) => ({
version: 1,
workspace: new Types.ObjectId(workspaceId),
type,
user: type === SECRET_PERSONAL ? req.user : undefined,
environment,
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag
}))
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags
}: secretsToCreateType) => {
return ({
version: 1,
workspace: new Types.ObjectId(workspaceId),
type,
user: type === SECRET_PERSONAL ? req.user : undefined,
environment,
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags
});
})
);
setTimeout(async () => {
// trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId
})
});
}, 5000);
// (EE) add secret versions for new secrets
EESecretService.addSecretVersions({
secretVersions: newSecrets.map(({
await EESecretService.addSecretVersions({
secretVersions: newlyCreatedSecrets.map(({
_id,
version,
workspace,
@ -79,11 +413,13 @@ export const createSecrets = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags
}) => ({
_id: new Types.ObjectId(),
secret: _id,
@ -96,32 +432,27 @@ export const createSecrets = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags
}))
});
// trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId
})
});
const addAction = await EELogService.createActionSecret({
const addAction = await EELogService.createAction({
name: ACTION_ADD_SECRETS,
userId: req.user._id.toString(),
workspaceId,
secretIds: newSecrets.map((n) => n._id)
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: newlyCreatedSecrets.map((n) => n._id)
});
// (EE) create (audit) log
addAction && await EELogService.createLog({
userId: req.user._id.toString(),
workspaceId,
workspaceId: new Types.ObjectId(workspaceId),
actions: [addAction],
channel,
ipAddress: req.ip
@ -137,17 +468,17 @@ export const createSecrets = async (req: Request, res: Response) => {
event: 'secrets added',
distinctId: req.user.email,
properties: {
numberOfSecrets: toAdd.length,
numberOfSecrets: listOfSecretsToCreate.length,
environment,
workspaceId,
channel: req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli',
channel: channel,
userAgent: req.headers?.['user-agent']
}
});
}
return res.status(200).send({
secrets: newSecrets
secrets: newlyCreatedSecrets
});
}
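As the swagger block above notes, `secrets` may be a single object or an array; a hedged sketch of both accepted shapes (all ciphertext values are placeholders):

// The handler wraps a lone object into a one-element array before insertMany.
const singleSecretBody = {
  workspaceId: '63f0b9f3e1c2a4d5b6a7c8d9',
  environment: 'dev',
  secrets: {
    type: 'shared',
    secretKeyCiphertext: '<ciphertext>', secretKeyIV: '<iv>', secretKeyTag: '<tag>',
    secretValueCiphertext: '<ciphertext>', secretValueIV: '<iv>', secretValueTag: '<tag>',
    secretCommentCiphertext: '', secretCommentIV: '', secretCommentTag: '',
    tags: []
  }
};

// Equivalent multi-secret form: pass an array under the same key.
const multiSecretBody = {
  ...singleSecretBody,
  secrets: [singleSecretBody.secrets, { ...singleSecretBody.secrets }]
};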
@ -159,10 +490,51 @@ export const createSecrets = async (req: Request, res: Response) => {
* @returns
*/
export const getSecrets = async (req: Request, res: Response) => {
const { workspaceId, environment } = req.query;
/*
#swagger.summary = 'Read secrets'
#swagger.description = 'Read secrets from a project and environment'
#swagger.security = [{
"apiKeyAuth": []
}]
let userId: Types.ObjectId | undefined = undefined // used for getting personal secrets for user
  let userEmail: string | undefined = undefined // used for posthog
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.parameters['environment'] = {
"description": "Environment within project",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"secrets": {
"type": "array",
"items": {
$ref: "#/components/schemas/Secret"
},
"description": "Secrets for the given project and environment"
}
}
}
}
}
}
*/
const { workspaceId, environment, tagSlugs } = req.query;
const tagNamesList = typeof tagSlugs === 'string' && tagSlugs !== '' ? tagSlugs.split(',') : [];
let userId = "" // used for getting personal secrets for user
let userEmail = "" // used for posthog
if (req.user) {
userId = req.user._id;
userEmail = req.user.email;
@ -173,8 +545,38 @@ export const getSecrets = async (req: Request, res: Response) => {
userEmail = req.serviceTokenData.user.email;
}
const [err, secrets] = await to(Secret.find(
{
  // non-service-token case, as service tokens are already scoped to an environment and project
let hasWriteOnlyAccess
if (!req.serviceTokenData) {
hasWriteOnlyAccess = await userHasWriteOnlyAbility(userId, workspaceId, environment)
const hasNoAccess = await userHasNoAbility(userId, workspaceId, environment)
if (hasNoAccess) {
      throw UnauthorizedRequestError({ message: "You do not have the necessary permission(s) to perform this action" })
}
}
let secrets: any
let secretQuery: any
  if (tagNamesList.length > 0) {
const workspaceFromDB = await Tag.find({ workspace: workspaceId })
const tagIds = _.map(tagNamesList, (tagName) => {
const tag = _.find(workspaceFromDB, { slug: tagName });
return tag ? tag.id : null;
});
secretQuery = {
workspace: workspaceId,
environment,
$or: [
{ user: userId },
{ user: { $exists: false } }
],
tags: { $in: tagIds },
type: { $in: [SECRET_SHARED, SECRET_PERSONAL] }
}
} else {
secretQuery = {
workspace: workspaceId,
environment,
$or: [
@ -183,22 +585,26 @@ export const getSecrets = async (req: Request, res: Response) => {
],
type: { $in: [SECRET_SHARED, SECRET_PERSONAL] }
}
).then())
}
if (err) throw ValidationError({ message: 'Failed to get secrets', stack: err.stack });
if (hasWriteOnlyAccess) {
secrets = await Secret.find(secretQuery).select("secretKeyCiphertext secretKeyIV secretKeyTag")
} else {
secrets = await Secret.find(secretQuery).populate("tags")
}
const channel = req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli';
const channel = getChannelFromUserAgent(req.headers['user-agent'])
const readAction = await EELogService.createActionSecret({
const readAction = await EELogService.createAction({
name: ACTION_READ_SECRETS,
userId: req.user._id.toString(),
workspaceId: workspaceId as string,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId as string),
secretIds: secrets.map((n: any) => n._id)
});
readAction && await EELogService.createLog({
userId: req.user._id.toString(),
workspaceId: workspaceId as string,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId as string),
actions: [readAction],
channel,
ipAddress: req.ip
@ -206,7 +612,7 @@ export const getSecrets = async (req: Request, res: Response) => {
if (postHogClient) {
postHogClient.capture({
event: 'secrets added',
event: 'secrets pulled',
distinctId: userEmail,
properties: {
numberOfSecrets: secrets.length,
@ -223,12 +629,109 @@ export const getSecrets = async (req: Request, res: Response) => {
});
}
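A hedged sketch of how a client might exercise the tagSlugs filter above; the host, route prefix, and auth header are illustrative assumptions:

async function readSecretsByTag(): Promise<void> {
  const params = new URLSearchParams({
    workspaceId: '63f0b9f3e1c2a4d5b6a7c8d9',
    environment: 'dev',
    tagSlugs: 'backend,infra' // comma-separated; split into tagNamesList server-side
  });
  const res = await fetch(`https://<infisical-host>/api/v2/secrets?${params}`, {
    headers: { Authorization: 'Bearer <jwt-or-api-key>' }
  });
  const { secrets } = await res.json();
  console.log(`received ${secrets.length} secrets`);
}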
export const getOnlySecretKeys = async (req: Request, res: Response) => {
const { workspaceId, environment } = req.query;
let userId = "" // used for getting personal secrets for user
let userEmail = "" // used for posthog
if (req.user) {
userId = req.user._id;
userEmail = req.user.email;
}
if (req.serviceTokenData) {
userId = req.serviceTokenData.user._id
userEmail = req.serviceTokenData.user.email;
}
  // non-service-token case, as service tokens are already scoped
if (!req.serviceTokenData) {
const hasAccess = await userHasWorkspaceAccess(userId, workspaceId, environment, ABILITY_READ)
if (!hasAccess) {
      throw UnauthorizedRequestError({ message: "You do not have the necessary permission(s) to perform this action" })
}
}
const [err, secretKeys] = await to(Secret.find(
{
workspace: workspaceId,
environment,
$or: [
{ user: userId },
{ user: { $exists: false } }
],
type: { $in: [SECRET_SHARED, SECRET_PERSONAL] }
}
)
.select("secretKeyIV secretKeyTag secretKeyCiphertext")
.then())
if (err) throw ValidationError({ message: 'Failed to get secrets', stack: err.stack });
// readAction && await EELogService.createLog({
// userId: new Types.ObjectId(userId),
// workspaceId: new Types.ObjectId(workspaceId as string),
// actions: [readAction],
// channel,
// ipAddress: req.ip
// });
return res.status(200).send({
secretKeys
});
}
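The keys-only responses above rely on a Mongoose field projection; a small self-contained sketch of the same pattern (the model here is a stand-in, not the real Secret model):

import { Schema, model } from 'mongoose';

// Stand-in model for illustration only.
const DemoSecret = model('DemoSecret', new Schema({
  workspace: Schema.Types.ObjectId,
  secretKeyCiphertext: String,
  secretKeyIV: String,
  secretKeyTag: String,
  secretValueCiphertext: String
}));

// Returns documents containing only the key fields (plus _id),
// mirroring the write-only and keys-only code paths above.
async function findKeyFieldsOnly(workspaceId: string) {
  return DemoSecret.find({ workspace: workspaceId })
    .select('secretKeyCiphertext secretKeyIV secretKeyTag');
}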
/**
* Update secret(s)
* @param req
* @param res
*/
export const updateSecrets = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Update secret(s)'
#swagger.description = 'Update secret(s)'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"secrets": {
$ref: "#/components/schemas/UpdateSecret",
"description": "Secret(s) to update - object or array of objects"
}
}
}
}
}
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"secrets": {
"type": "array",
"items": {
$ref: "#/components/schemas/Secret"
},
"description": "Updated secrets"
}
}
}
}
}
}
*/
const channel = req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli';
// TODO: move type
@ -243,6 +746,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretCommentCiphertext: string;
secretCommentIV: string;
secretCommentTag: string;
tags: string[]
}
const updateOperationsToPerform = req.body.secrets.map((secret: PatchSecret) => {
@ -255,7 +759,8 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretValueTag,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag
secretCommentTag,
tags
} = secret;
return ({
@ -271,8 +776,9 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretValueCiphertext,
secretValueIV,
secretValueTag,
tags,
...((
secretCommentCiphertext &&
secretCommentCiphertext !== undefined &&
secretCommentIV &&
secretCommentTag
) ? {
@ -305,6 +811,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
tags
} = secretModificationsBySecretId[secret._id.toString()]
return ({
@ -322,6 +829,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
secretCommentCiphertext: secretCommentCiphertext ? secretCommentCiphertext : secret.secretCommentCiphertext,
secretCommentIV: secretCommentIV ? secretCommentIV : secret.secretCommentIV,
secretCommentTag: secretCommentTag ? secretCommentTag : secret.secretCommentTag,
tags: tags ? tags : secret.tags
});
})
}
@ -342,23 +850,25 @@ export const updateSecrets = async (req: Request, res: Response) => {
Object.keys(workspaceSecretObj).forEach(async (key) => {
// trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: key
})
});
setTimeout(async () => {
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: key
})
});
}, 10000);
const updateAction = await EELogService.createActionSecret({
const updateAction = await EELogService.createAction({
name: ACTION_UPDATE_SECRETS,
userId: req.user._id.toString(),
workspaceId: key,
userId: req.user._id,
workspaceId: new Types.ObjectId(key),
secretIds: workspaceSecretObj[key].map((secret: ISecret) => secret._id)
});
// (EE) create (audit) log
updateAction && await EELogService.createLog({
userId: req.user._id.toString(),
workspaceId: key,
workspaceId: new Types.ObjectId(key),
actions: [updateAction],
channel,
ipAddress: req.ip
@ -377,7 +887,7 @@ export const updateSecrets = async (req: Request, res: Response) => {
numberOfSecrets: workspaceSecretObj[key].length,
environment: workspaceSecretObj[key][0].environment,
workspaceId: key,
channel: req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli',
channel: channel,
userAgent: req.headers?.['user-agent']
}
});
@ -399,7 +909,51 @@ export const updateSecrets = async (req: Request, res: Response) => {
* @param res
*/
export const deleteSecrets = async (req: Request, res: Response) => {
const channel = req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli';
/*
#swagger.summary = 'Delete secret(s)'
#swagger.description = 'Delete one or many secrets by their ID(s)'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"secretIds": {
"type": "string",
"description": "ID(s) of secrets - string or array of strings"
},
}
}
}
}
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"secrets": {
"type": "array",
"items": {
$ref: "#/components/schemas/Secret"
},
"description": "Deleted secrets"
}
}
}
}
}
}
*/
const channel = getChannelFromUserAgent(req.headers['user-agent'])
const toDelete = req.secrets.map((s: any) => s._id);
await Secret.deleteMany({
@ -430,17 +984,17 @@ export const deleteSecrets = async (req: Request, res: Response) => {
workspaceId: key
})
});
const deleteAction = await EELogService.createActionSecret({
const deleteAction = await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
userId: req.user._id.toString(),
workspaceId: key,
userId: req.user._id,
workspaceId: new Types.ObjectId(key),
secretIds: workspaceSecretObj[key].map((secret: ISecret) => secret._id)
});
// (EE) create (audit) log
deleteAction && await EELogService.createLog({
userId: req.user._id.toString(),
workspaceId: key,
workspaceId: new Types.ObjectId(key),
actions: [deleteAction],
channel,
ipAddress: req.ip
@ -459,7 +1013,7 @@ export const deleteSecrets = async (req: Request, res: Response) => {
numberOfSecrets: workspaceSecretObj[key].length,
environment: workspaceSecretObj[key][0].environment,
workspaceId: key,
channel: req.headers?.['user-agent']?.toLowerCase().includes('mozilla') ? 'web' : 'cli',
channel: channel,
userAgent: req.headers?.['user-agent']
}
});
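Per the swagger block above, `secretIds` accepts either a single ID string or an array of IDs; both shapes sketched below with placeholder values:

const deleteOneBody = { secretIds: '63f0b9f3e1c2a4d5b6a7c8d9' };
const deleteManyBody = {
  secretIds: ['63f0b9f3e1c2a4d5b6a7c8d9', '63f0b9f3e1c2a4d5b6a7c8da']
};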

View File

@ -8,6 +8,8 @@ import {
import {
SALT_ROUNDS
} from '../../config';
import { userHasWorkspaceAccess } from '../../ee/helpers/checkMembershipPermissions';
import { ABILITY_READ } from '../../variables/organization';
import { UnauthorizedRequestError } from '../../utils/errors';
/**
* Return service token data associated with service token on request
@ -37,6 +39,11 @@ export const createServiceTokenData = async (req: Request, res: Response) => {
expiresIn
} = req.body;
const hasAccess = await userHasWorkspaceAccess(req.user, workspaceId, environment, ABILITY_READ)
if (!hasAccess) {
    throw UnauthorizedRequestError({ message: "You do not have the necessary permission(s) to perform this action" })
}
const secret = crypto.randomBytes(16).toString('hex');
const secretHash = await bcrypt.hash(secret, SALT_ROUNDS);
@ -100,4 +107,8 @@ export const deleteServiceTokenData = async (req: Request, res: Response) => {
return res.status(200).send({
serviceTokenData
});
}
}
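The token material above is a random hex secret paired with a bcrypt hash; a self-contained sketch of that pattern (the salt-round count is an assumed stand-in for the configured SALT_ROUNDS):

import { randomBytes } from 'crypto';
import bcrypt from 'bcrypt';

async function generateServiceTokenMaterial() {
  const secret = randomBytes(16).toString('hex');                  // returned to the caller once
  const assumedSaltRounds = 10;                                    // stand-in for SALT_ROUNDS from config
  const secretHash = await bcrypt.hash(secret, assumedSaltRounds); // persisted for later comparison
  return { secret, secretHash };
}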

View File

@ -0,0 +1,250 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { User, MembershipOrg } from '../../models';
import { completeAccount } from '../../helpers/user';
import {
initializeDefaultOrg
} from '../../helpers/signup';
import { issueAuthTokens } from '../../helpers/auth';
import { INVITED, ACCEPTED } from '../../variables';
import { NODE_ENV } from '../../config';
import request from '../../config/request';
/**
* Complete setting up user by adding their personal and auth information as part of the
* signup flow
* @param req
* @param res
* @returns
*/
export const completeAccountSignup = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier,
organizationName
}: {
email: string;
firstName: string;
lastName: string;
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
publicKey: string;
encryptedPrivateKey: string;
encryptedPrivateKeyIV: string;
encryptedPrivateKeyTag: string;
salt: string;
verifier: string;
organizationName: string;
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || (user && user?.publicKey)) {
// case 1: user doesn't exist.
// case 2: user has already completed account
return res.status(403).send({
error: 'Failed to complete account for complete user'
});
}
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user'); // ensure user is non-null
// initialize default organization and workspace
await initializeDefaultOrg({
organizationName,
user
});
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueAuthTokens({
userId: user._id.toString()
});
token = tokens.token;
// sending a welcome email to new users
if (process.env.LOOPS_API_KEY) {
await request.post("https://app.loops.so/api/v1/events/send", {
"email": email,
"eventName": "Sign Up",
"firstName": firstName,
"lastName": lastName
}, {
headers: {
"Accept": "application/json",
"Authorization": "Bearer " + process.env.LOOPS_API_KEY
},
});
}
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token
});
};
/**
* Complete setting up user by adding their personal and auth information as part of the
* invite flow
* @param req
* @param res
* @returns
*/
export const completeAccountInvite = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || (user && user?.publicKey)) {
// case 1: user doesn't exist.
// case 2: user has already completed account
return res.status(403).send({
error: 'Failed to complete account for complete user'
});
}
const membershipOrg = await MembershipOrg.findOne({
inviteEmail: email,
status: INVITED
});
if (!membershipOrg) throw new Error('Failed to find invitations for email');
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user');
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueAuthTokens({
userId: user._id.toString()
});
token = tokens.token;
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token
});
};
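A hedged client-side sketch of consuming this flow: the access token arrives in the JSON body while the refresh token is set as the httpOnly 'jid' cookie; the endpoint URL is passed in because the exact route is not shown in this diff:

async function completeAccountSetup(endpointUrl: string, body: Record<string, unknown>) {
  const res = await fetch(endpointUrl, {
    method: 'POST',
    credentials: 'include', // lets the browser store the httpOnly refresh cookie
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body)
  });
  if (!res.ok) throw new Error(`signup completion failed with status ${res.status}`);
  const { token, user } = await res.json();
  return { token, user }; // keep the access token in memory; the browser handles the cookie
}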

View File

@ -0,0 +1,72 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import {
Membership, Secret,
} from '../../models';
import Tag, { ITag } from '../../models/tag';
import { Builder } from "builder-pattern"
import to from 'await-to-js';
import { BadRequestError, UnauthorizedRequestError } from '../../utils/errors';
import { MongoError } from 'mongodb';
import { userHasWorkspaceAccess } from '../../ee/helpers/checkMembershipPermissions';
export const createWorkspaceTag = async (req: Request, res: Response) => {
const { workspaceId } = req.params
const { name, slug } = req.body
const sanitizedTagToCreate = Builder<ITag>()
.name(name)
.workspace(new Types.ObjectId(workspaceId))
.slug(slug)
.user(new Types.ObjectId(req.user._id))
.build();
const [err, createdTag] = await to(Tag.create(sanitizedTagToCreate))
if (err) {
if ((err as MongoError).code === 11000) {
throw BadRequestError({ message: "Tags must be unique in a workspace" })
}
throw err
}
res.json(createdTag)
}
export const deleteWorkspaceTag = async (req: Request, res: Response) => {
const { tagId } = req.params
const tagFromDB = await Tag.findById(tagId)
if (!tagFromDB) {
throw BadRequestError()
}
  // only allow deletion if the requesting user belongs to the same workspace as the tag
const membership = await Membership.findOne({
user: req.user,
workspace: tagFromDB.workspace
});
if (!membership) {
    throw UnauthorizedRequestError({ message: 'Failed to validate membership' });
}
const result = await Tag.findByIdAndDelete(tagId);
// remove the tag from secrets
await Secret.updateMany(
{ tags: { $in: [tagId] } },
{ $pull: { tags: tagId } }
);
res.json(result);
}
export const getWorkspaceTags = async (req: Request, res: Response) => {
const { workspaceId } = req.params
const workspaceTags = await Tag.find({ workspace: workspaceId })
return res.json({
workspaceTags
})
}
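The duplicate-key branch above (MongoError code 11000) implies a unique constraint on tags; a hedged sketch of a schema that would produce it, assuming uniqueness is enforced per workspace on the slug (the real Tag model lives in ../../models/tag):

import { Schema, model } from 'mongoose';

const demoTagSchema = new Schema({
  name: { type: String, required: true },
  slug: { type: String, required: true },
  workspace: { type: Schema.Types.ObjectId, ref: 'Workspace', required: true },
  user: { type: Schema.Types.ObjectId, ref: 'User' }
});

// Assumed compound unique index: the same slug may exist in different
// workspaces, but not twice within one workspace.
demoTagSchema.index({ slug: 1, workspace: 1 }, { unique: true });

export const DemoTag = model('DemoTag', demoTagSchema);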

View File

@ -0,0 +1,147 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import {
User,
MembershipOrg
} from '../../models';
/**
* Return the current user.
* @param req
* @param res
* @returns
*/
export const getMe = async (req: Request, res: Response) => {
/*
#swagger.summary = "Retrieve the current user on the request"
#swagger.description = "Retrieve the current user on the request"
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"user": {
"type": "object",
$ref: "#/components/schemas/CurrentUser",
"description": "Current user on request"
}
}
}
}
}
}
*/
let user;
try {
user = await User
.findById(req.user._id)
.select('+publicKey +encryptedPrivateKey +iv +tag');
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get current user'
});
}
return res.status(200).send({
user
});
}
/**
* Update the current user's MFA-enabled status [isMfaEnabled].
 * Note: Infisical currently supports email-based 2FA only; this will expand to
* include SMS and authenticator app modes of authentication in the future.
* @param req
* @param res
* @returns
*/
export const updateMyMfaEnabled = async (req: Request, res: Response) => {
let user;
try {
const { isMfaEnabled }: { isMfaEnabled: boolean } = req.body;
req.user.isMfaEnabled = isMfaEnabled;
if (isMfaEnabled) {
// TODO: adapt this route/controller
// to work for different forms of MFA
req.user.mfaMethods = ['email'];
} else {
req.user.mfaMethods = [];
}
await req.user.save();
user = req.user;
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to update current user's MFA status"
});
}
return res.status(200).send({
user
});
}
/**
* Return organizations that the current user is part of.
* @param req
* @param res
*/
export const getMyOrganizations = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return organizations that current user is part of'
#swagger.description = 'Return organizations that current user is part of'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"organizations": {
"type": "array",
"items": {
$ref: "#/components/schemas/Organization"
},
"description": "Organizations that user is part of"
}
}
}
}
}
}
*/
let organizations;
try {
organizations = (
await MembershipOrg.find({
user: req.user._id
}).populate('organization')
).map((m) => m.organization);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to get current user's organizations"
});
}
return res.status(200).send({
organizations
});
}
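A hedged sketch of calling the current-user endpoint with an API key; the path is an assumption, while the x-api-key header matches the auth middleware shown later in this diff:

async function getCurrentUser(baseUrl: string, apiKey: string) {
  // '/api/v2/users/me' is an assumed path for the getMe handler above.
  const res = await fetch(`${baseUrl}/api/v2/users/me`, {
    headers: { 'X-API-KEY': apiKey }
  });
  if (!res.ok) throw new Error(`request failed with status ${res.status}`);
  const { user } = await res.json();
  return user;
}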

View File

@ -1,7 +1,9 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import {
Workspace,
Secret,
Membership,
MembershipOrg,
Integration,
@ -19,7 +21,6 @@ import {
import { pushKeys } from '../../helpers/key';
import { postHogClient, EventService } from '../../services';
import { eventPushSecrets } from '../../events';
import { ENV_SET } from '../../variables';
interface V2PushSecret {
type: string; // personal or shared
@ -52,7 +53,8 @@ export const pushWorkspaceSecrets = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
// validate environment
if (!ENV_SET.has(environment)) {
const workspaceEnvs = req.membership.workspace.environments;
if (!workspaceEnvs.find(({ slug }: { slug: string }) => slug === environment)) {
throw new Error('Failed to validate environment');
}
@ -129,6 +131,11 @@ export const pullSecrets = async (req: Request, res: Response) => {
} else if (req.serviceTokenData) {
userId = req.serviceTokenData.user._id
}
// validate environment
const workspaceEnvs = req.membership.workspace.environments;
if (!workspaceEnvs.find(({ slug }: { slug: string }) => slug === environment)) {
throw new Error('Failed to validate environment');
}
secrets = await pull({
userId,
@ -169,6 +176,34 @@ export const pullSecrets = async (req: Request, res: Response) => {
};
export const getWorkspaceKey = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return encrypted project key'
#swagger.description = 'Return encrypted project key'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "array",
"items": {
$ref: "#/components/schemas/ProjectKey"
},
"description": "Encrypted project key for the given project"
}
}
}
}
*/
let key;
try {
const { workspaceId } = req.params;
@ -214,4 +249,260 @@ export const getWorkspaceServiceTokenData = async (
return res.status(200).send({
serviceTokenData
});
}
}
/**
* Return memberships for workspace with id [workspaceId]
* @param req
* @param res
* @returns
*/
export const getWorkspaceMemberships = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return project memberships'
#swagger.description = 'Return project memberships'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"memberships": {
"type": "array",
"items": {
$ref: "#/components/schemas/Membership"
},
"description": "Memberships of project"
}
}
}
}
}
}
*/
let memberships;
try {
const { workspaceId } = req.params;
memberships = await Membership.find({
workspace: workspaceId
}).populate('user', '+publicKey');
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to get workspace memberships'
});
}
return res.status(200).send({
memberships
});
}
/**
* Update role of membership with id [membershipId] to role [role]
* @param req
* @param res
* @returns
*/
export const updateWorkspaceMembership = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Update project membership'
#swagger.description = 'Update project membership'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.parameters['membershipId'] = {
"description": "ID of project membership to update",
"required": true,
"type": "string"
}
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"role": {
"type": "string",
"description": "Role of membership - either admin or member",
}
}
}
}
}
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"membership": {
$ref: "#/components/schemas/Membership",
"description": "Updated membership"
}
}
}
}
}
}
*/
let membership;
try {
const {
membershipId
} = req.params;
const { role } = req.body;
membership = await Membership.findByIdAndUpdate(
membershipId,
{
role
}, {
new: true
}
);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to update workspace membership'
});
}
return res.status(200).send({
membership
});
}
/**
* Delete workspace membership with id [membershipId]
* @param req
* @param res
* @returns
*/
export const deleteWorkspaceMembership = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Delete project membership'
#swagger.description = 'Delete project membership'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.parameters['membershipId'] = {
"description": "ID of project membership to delete",
"required": true,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
"schema": {
"type": "object",
"properties": {
"membership": {
$ref: "#/components/schemas/Membership",
"description": "Deleted membership"
}
}
}
}
}
}
*/
let membership;
try {
const {
membershipId
} = req.params;
membership = await Membership.findByIdAndDelete(membershipId);
if (!membership) throw new Error('Failed to delete workspace membership');
await Key.deleteMany({
receiver: membership.user,
workspace: membership.workspace
});
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to delete workspace membership'
});
}
return res.status(200).send({
membership
});
}
/**
 * Change the autoCapitalization rule of a workspace
* @param req
* @param res
* @returns
*/
export const toggleAutoCapitalization = async (req: Request, res: Response) => {
let workspace;
try {
const { workspaceId } = req.params;
const { autoCapitalization } = req.body;
workspace = await Workspace.findOneAndUpdate(
{
_id: workspaceId
},
{
autoCapitalization
},
{
new: true
}
);
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to change autoCapitalization setting'
});
}
return res.status(200).send({
message: 'Successfully changed autoCapitalization setting',
workspace
});
};

View File

@ -3,11 +3,13 @@ import * as secretController from './secretController';
import * as secretSnapshotController from './secretSnapshotController';
import * as workspaceController from './workspaceController';
import * as actionController from './actionController';
import * as membershipController from './membershipController';
export {
stripeController,
secretController,
secretSnapshotController,
workspaceController,
actionController
actionController,
membershipController
}

View File

@ -0,0 +1,63 @@
import { Request, Response } from "express";
import { Membership, Workspace } from "../../../models";
import { IMembershipPermission } from "../../../models/membership";
import { BadRequestError, UnauthorizedRequestError } from "../../../utils/errors";
import { ABILITY_READ, ABILITY_WRITE, ADMIN, MEMBER } from "../../../variables/organization";
import { Builder } from "builder-pattern"
import _ from "lodash";
export const denyMembershipPermissions = async (req: Request, res: Response) => {
const { membershipId } = req.params;
const { permissions } = req.body;
const sanitizedMembershipPermissions: IMembershipPermission[] = permissions.map((permission: IMembershipPermission) => {
if (!permission.ability || !permission.environmentSlug || ![ABILITY_READ, ABILITY_WRITE].includes(permission.ability)) {
throw BadRequestError({ message: "One or more required fields are missing from the request or have incorrect type" })
}
return Builder<IMembershipPermission>()
.environmentSlug(permission.environmentSlug)
.ability(permission.ability)
.build();
})
const sanitizedMembershipPermissionsUnique = _.uniqWith(sanitizedMembershipPermissions, _.isEqual)
const membershipToModify = await Membership.findById(membershipId)
if (!membershipToModify) {
throw BadRequestError({ message: "Unable to locate resource" })
}
  // check if the user making the request is an admin of this project
if (![ADMIN, MEMBER].includes(membershipToModify.role)) {
throw UnauthorizedRequestError()
}
// check if the requested slugs are indeed a part of this related workspace
const relatedWorkspace = await Workspace.findById(membershipToModify.workspace)
if (!relatedWorkspace) {
throw BadRequestError({ message: "Something went wrong when locating the related workspace" })
}
const uniqueEnvironmentSlugs = new Set(_.uniq(_.map(relatedWorkspace.environments, 'slug')));
sanitizedMembershipPermissionsUnique.forEach(permission => {
if (!uniqueEnvironmentSlugs.has(permission.environmentSlug)) {
throw BadRequestError({ message: "Unknown environment slug reference" })
}
})
// update the permissions
const updatedMembershipWithPermissions = await Membership.findByIdAndUpdate(
{ _id: membershipToModify._id },
{ $set: { deniedPermissions: sanitizedMembershipPermissionsUnique } },
{ new: true }
)
if (!updatedMembershipWithPermissions) {
throw BadRequestError({ message: "The resource has been removed before it can be modified" })
}
res.send({
permissionsDenied: updatedMembershipWithPermissions.deniedPermissions
})
}
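A hedged sketch of the permissions payload this handler validates; the environment slugs are placeholders and the literal strings assume ABILITY_READ === 'read' and ABILITY_WRITE === 'write':

const denyPermissionsBody = {
  permissions: [
    { environmentSlug: 'prod', ability: 'read' },     // block reads in prod
    { environmentSlug: 'staging', ability: 'write' }, // block writes in staging
    { environmentSlug: 'staging', ability: 'write' }  // duplicates are collapsed by _.uniqWith
  ]
};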

View File

@ -10,6 +10,51 @@ import { EESecretService } from '../../services';
* @param res
*/
export const getSecretVersions = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return secret versions'
#swagger.description = 'Return secret versions'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['secretId'] = {
"description": "ID of secret",
"required": true,
"type": "string"
}
#swagger.parameters['offset'] = {
"description": "Number of versions to skip",
"required": false,
"type": "string"
}
#swagger.parameters['limit'] = {
"description": "Maximum number of versions to return",
"required": false,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
schema: {
"type": "object",
"properties": {
"secretVersions": {
"type": "array",
"items": {
$ref: "#/components/schemas/SecretVersion"
},
"description": "Secret versions"
}
}
}
}
}
}
*/
let secretVersions;
try {
const { secretId } = req.params;
@ -44,6 +89,54 @@ import { EESecretService } from '../../services';
* @returns
*/
export const rollbackSecretVersion = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Roll back secret to a version.'
#swagger.description = 'Roll back secret to a version.'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['secretId'] = {
"description": "ID of secret",
"required": true,
"type": "string"
}
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"version": {
"type": "integer",
"description": "Version of secret to roll back to"
}
}
}
}
}
}
#swagger.responses[200] = {
content: {
"application/json": {
schema: {
"type": "object",
"properties": {
"secret": {
"type": "object",
$ref: "#/components/schemas/Secret",
"description": "Secret rolled back to"
}
}
}
}
}
}
*/
let secret;
try {
const { secretId } = req.params;
@ -65,11 +158,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
} = oldSecretVersion;
// update secret
@ -86,11 +177,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
},
{
new: true
@ -111,11 +200,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
secretValueTag
}).save();
// take secret snapshot

View File

@ -15,7 +15,13 @@ export const getSecretSnapshot = async (req: Request, res: Response) => {
secretSnapshot = await SecretSnapshot
.findById(secretSnapshotId)
.populate('secretVersions');
.populate({
path: 'secretVersions',
populate: {
path: 'tags',
model: 'Tag',
}
});
if (!secretSnapshot) throw new Error('Failed to find secret snapshot');

View File

@ -19,6 +19,51 @@ import { getLatestSecretVersionIds } from '../../helpers/secretVersion';
* @param res
*/
export const getWorkspaceSecretSnapshots = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return project secret snapshot ids'
  #swagger.description = 'Return project secret snapshot ids'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.parameters['offset'] = {
"description": "Number of secret snapshots to skip",
"required": false,
"type": "string"
}
#swagger.parameters['limit'] = {
"description": "Maximum number of secret snapshots to return",
"required": false,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
schema: {
"type": "object",
"properties": {
"secretSnapshots": {
"type": "array",
"items": {
$ref: "#/components/schemas/SecretSnapshot"
},
"description": "Project secret snapshots"
}
}
}
}
}
}
*/
let secretSnapshots;
try {
const { workspaceId } = req.params;
@ -78,16 +123,66 @@ export const getWorkspaceSecretSnapshotsCount = async (req: Request, res: Respon
* @returns
*/
export const rollbackWorkspaceSecretSnapshot = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Roll back project secrets to those captured in a secret snapshot version.'
#swagger.description = 'Roll back project secrets to those captured in a secret snapshot version.'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.requestBody = {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"version": {
"type": "integer",
"description": "Version of secret snapshot to roll back to",
}
}
}
}
}
}
#swagger.responses[200] = {
content: {
"application/json": {
schema: {
"type": "object",
"properties": {
"secrets": {
"type": "array",
"items": {
$ref: "#/components/schemas/Secret"
},
"description": "Secrets rolled back to"
}
}
}
}
}
}
*/
let secrets;
try {
const { workspaceId } = req.params;
const { version } = req.body;
// validate secret snapshot
const secretSnapshot = await SecretSnapshot.findOne({
workspace: workspaceId,
version
}).populate<{ secretVersions: ISecretVersion[]}>('secretVersions');
const secretSnapshot = await SecretSnapshot.findOne({
workspace: workspaceId,
version
}).populate<{ secretVersions: ISecretVersion[]}>('secretVersions');
if (!secretSnapshot) throw new Error('Failed to find secret snapshot');
@ -231,6 +326,72 @@ export const rollbackWorkspaceSecretSnapshot = async (req: Request, res: Respons
* @returns
*/
export const getWorkspaceLogs = async (req: Request, res: Response) => {
/*
#swagger.summary = 'Return project (audit) logs'
#swagger.description = 'Return project (audit) logs'
#swagger.security = [{
"apiKeyAuth": []
}]
#swagger.parameters['workspaceId'] = {
"description": "ID of project",
"required": true,
"type": "string"
}
#swagger.parameters['userId'] = {
"description": "ID of project member",
"required": false,
"type": "string"
}
#swagger.parameters['offset'] = {
"description": "Number of logs to skip",
"required": false,
"type": "string"
}
#swagger.parameters['limit'] = {
"description": "Maximum number of logs to return",
"required": false,
"type": "string"
}
#swagger.parameters['sortBy'] = {
"description": "Order to sort the logs by",
"schema": {
"type": "string",
"@enum": ["oldest", "recent"]
},
"required": false
}
#swagger.parameters['actionNames'] = {
"description": "Names of log actions (comma-separated)",
"required": false,
"type": "string"
}
#swagger.responses[200] = {
content: {
"application/json": {
schema: {
"type": "object",
"properties": {
"logs": {
"type": "array",
"items": {
$ref: "#/components/schemas/Log"
},
"description": "Project logs"
}
}
}
}
}
}
*/
let logs
try {
const { workspaceId } = req.params;

View File

@ -1,39 +1,40 @@
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import { SecretVersion, Action } from '../models';
import { Action } from '../models';
import {
getLatestSecretVersionIds,
getLatestNSecretSecretVersionIds
} from '../helpers/secretVersion';
import { ACTION_UPDATE_SECRETS } from '../../variables';
import {
ACTION_LOGIN,
ACTION_LOGOUT,
ACTION_ADD_SECRETS,
ACTION_READ_SECRETS,
ACTION_DELETE_SECRETS,
ACTION_UPDATE_SECRETS,
} from '../../variables';
/**
* Create an (audit) action for secrets including
* add, delete, update, and read actions.
* Create an (audit) action for updating secrets
* @param {Object} obj
* @param {String} obj.name - name of action
* @param {ObjectId[]} obj.secretIds - ids of relevant secrets
 * @param {Types.ObjectId[]} obj.secretIds - ids of relevant secrets
* @returns {Action} action - new action
*/
const createActionSecretHelper = async ({
const createActionUpdateSecret = async ({
name,
userId,
workspaceId,
secretIds
}: {
name: string;
userId: string;
workspaceId: string;
userId: Types.ObjectId;
workspaceId: Types.ObjectId;
secretIds: Types.ObjectId[];
}) => {
let action;
let latestSecretVersions;
try {
if (name === ACTION_UPDATE_SECRETS) {
// case: action is updating secrets
// -> add old and new secret versions
latestSecretVersions = (await getLatestNSecretSecretVersionIds({
const latestSecretVersions = (await getLatestNSecretSecretVersionIds({
secretIds,
n: 2
}))
@ -41,17 +42,7 @@ const createActionSecretHelper = async ({
oldSecretVersion: s.versions[0]._id,
newSecretVersion: s.versions[1]._id
}));
} else {
// case: action is adding, deleting, or reading secrets
// -> add new secret versions
latestSecretVersions = (await getLatestSecretVersionIds({
secretIds
}))
.map((s) => ({
newSecretVersion: s.versionId
}));
}
action = await new Action({
name,
user: userId,
@ -64,10 +55,148 @@ const createActionSecretHelper = async ({
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to create update secret action');
}
return action;
}
/**
* Create an (audit) action for creating, reading, and deleting
* secrets
* @param {Object} obj
* @param {String} obj.name - name of action
 * @param {Types.ObjectId[]} obj.secretIds - ids of relevant secrets
* @returns {Action} action - new action
*/
const createActionSecret = async ({
name,
userId,
workspaceId,
secretIds
}: {
name: string;
userId: Types.ObjectId;
workspaceId: Types.ObjectId;
secretIds: Types.ObjectId[];
}) => {
let action;
try {
// case: action is adding, deleting, or reading secrets
// -> add new secret versions
const latestSecretVersions = (await getLatestSecretVersionIds({
secretIds
}))
.map((s) => ({
newSecretVersion: s.versionId
}));
action = await new Action({
name,
user: userId,
workspace: workspaceId,
payload: {
secretVersions: latestSecretVersions
}
}).save();
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
    throw new Error('Failed to create create/read/delete secret action');
}
return action;
}
/**
* Create an (audit) action for user with id [userId]
* @param {Object} obj
* @param {String} obj.name - name of action
* @param {String} obj.userId - id of user associated with action
* @returns
*/
const createActionUser = ({
name,
userId
}: {
name: string;
userId: Types.ObjectId;
}) => {
let action;
try {
action = new Action({
name,
user: userId
}).save();
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to create user action');
}
return action;
}
/**
* Create an (audit) action.
* @param {Object} obj
 * @param {String} obj.name - name of action
* @param {Types.ObjectId} obj.userId - id of user associated with action
* @param {Types.ObjectId} obj.workspaceId - id of workspace associated with action
* @param {Types.ObjectId[]} obj.secretIds - ids of secrets associated with action
*/
const createActionHelper = async ({
name,
userId,
workspaceId,
secretIds,
}: {
name: string;
userId: Types.ObjectId;
workspaceId?: Types.ObjectId;
secretIds?: Types.ObjectId[];
}) => {
let action;
try {
switch (name) {
case ACTION_LOGIN:
case ACTION_LOGOUT:
action = await createActionUser({
name,
userId
});
break;
case ACTION_ADD_SECRETS:
case ACTION_READ_SECRETS:
case ACTION_DELETE_SECRETS:
        if (!workspaceId || !secretIds) throw new Error('Missing required params workspaceId or secretIds to create secret action');
action = await createActionSecret({
name,
userId,
workspaceId,
secretIds
});
break;
case ACTION_UPDATE_SECRETS:
        if (!workspaceId || !secretIds) throw new Error('Missing required params workspaceId or secretIds to create secret action');
action = await createActionUpdateSecret({
name,
userId,
workspaceId,
secretIds
});
break;
}
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to create action');
}
return action;
}
export { createActionSecretHelper };
export {
createActionHelper
};

View File

@ -0,0 +1,54 @@
import _ from "lodash";
import { Membership } from "../../models";
import { ABILITY_READ, ABILITY_WRITE } from "../../variables/organization";
export const userHasWorkspaceAccess = async (userId: any, workspaceId: any, environment: any, action: any) => {
const membershipForWorkspace = await Membership.findOne({ workspace: workspaceId, user: userId })
if (!membershipForWorkspace) {
return false
}
const deniedMembershipPermissions = membershipForWorkspace.deniedPermissions;
const isDisallowed = _.some(deniedMembershipPermissions, { environmentSlug: environment, ability: action });
if (isDisallowed) {
return false
}
return true
}
export const userHasWriteOnlyAbility = async (userId: any, workspaceId: any, environment: any) => {
const membershipForWorkspace = await Membership.findOne({ workspace: workspaceId, user: userId })
if (!membershipForWorkspace) {
return false
}
const deniedMembershipPermissions = membershipForWorkspace.deniedPermissions;
const isWriteDisallowed = _.some(deniedMembershipPermissions, { environmentSlug: environment, ability: ABILITY_WRITE });
const isReadDisallowed = _.some(deniedMembershipPermissions, { environmentSlug: environment, ability: ABILITY_READ });
// case: you have write only if read is blocked and write is not
if (isReadDisallowed && !isWriteDisallowed) {
return true
}
return false
}
export const userHasNoAbility = async (userId: any, workspaceId: any, environment: any) => {
const membershipForWorkspace = await Membership.findOne({ workspace: workspaceId, user: userId })
if (!membershipForWorkspace) {
return true
}
const deniedMembershipPermissions = membershipForWorkspace.deniedPermissions;
const isWriteDisallowed = _.some(deniedMembershipPermissions, { environmentSlug: environment, ability: ABILITY_WRITE });
const isReadBlocked = _.some(deniedMembershipPermissions, { environmentSlug: environment, ability: ABILITY_READ });
if (isReadBlocked && isWriteDisallowed) {
return true
}
return false
}
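A hedged usage sketch tying these helpers together as a controller might (the import path mirrors the controllers above; adjust it for the caller's location):

import {
  userHasNoAbility,
  userHasWriteOnlyAbility
} from '../../ee/helpers/checkMembershipPermissions';

// Summarizes a user's effective access to one environment of a workspace.
async function describeAccess(userId: string, workspaceId: string, environment: string): Promise<string> {
  if (await userHasNoAbility(userId, workspaceId, environment)) return 'none';
  if (await userHasWriteOnlyAbility(userId, workspaceId, environment)) return 'write-only';
  return 'read (write unless explicitly denied)';
}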

View File

@ -1,9 +1,19 @@
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import {
Log,
IAction
} from '../models';
/**
* Create an (audit) log
* @param {Object} obj
* @param {Types.ObjectId} obj.userId - id of user associated with the log
* @param {Types.ObjectId} obj.workspaceId - id of workspace associated with the log
* @param {IAction[]} obj.actions - actions to include in log
* @param {String} obj.channel - channel (web/cli/auto) associated with the log
* @param {String} obj.ipAddress - ip address associated with the log
* @returns {Log} log - new audit log
*/
const createLogHelper = async ({
userId,
workspaceId,
@ -11,8 +21,8 @@ const createLogHelper = async ({
channel,
ipAddress
}: {
userId: string;
workspaceId: string;
userId: Types.ObjectId;
workspaceId?: Types.ObjectId;
actions: IAction[];
channel: string;
ipAddress: string;
@ -21,7 +31,7 @@ const createLogHelper = async ({
try {
log = await new Log({
user: userId,
workspace: workspaceId,
workspace: workspaceId ?? undefined,
actionNames: actions.map((a) => a.name),
actions,
channel,

View File

@ -1,10 +1,18 @@
import { Schema, model, Types } from 'mongoose';
import {
ACTION_LOGIN,
ACTION_LOGOUT,
ACTION_ADD_SECRETS,
ACTION_UPDATE_SECRETS,
ACTION_READ_SECRETS,
ACTION_DELETE_SECRETS
} from '../../variables';
export interface IAction {
name: string;
user?: Types.ObjectId,
workspace?: Types.ObjectId,
payload: {
payload?: {
secretVersions?: Types.ObjectId[]
}
}
@ -13,7 +21,15 @@ const actionSchema = new Schema<IAction>(
{
name: {
type: String,
required: true
required: true,
enum: [
ACTION_LOGIN,
ACTION_LOGOUT,
ACTION_ADD_SECRETS,
ACTION_UPDATE_SECRETS,
ACTION_READ_SECRETS,
ACTION_DELETE_SECRETS
]
},
user: {
type: Schema.Types.ObjectId,

View File

@ -1,5 +1,7 @@
import { Schema, model, Types } from 'mongoose';
import {
ACTION_LOGIN,
ACTION_LOGOUT,
ACTION_ADD_SECRETS,
ACTION_UPDATE_SECRETS,
ACTION_READ_SECRETS,
@ -29,6 +31,8 @@ const logSchema = new Schema<ILog>(
actionNames: {
type: [String],
enum: [
ACTION_LOGIN,
ACTION_LOGOUT,
ACTION_ADD_SECRETS,
ACTION_UPDATE_SECRETS,
ACTION_READ_SECRETS,
@ -41,17 +45,17 @@ const logSchema = new Schema<ILog>(
ref: 'Action',
required: true
}],
channel: {
channel: {
type: String,
enum: ['web', 'cli', 'auto'],
enum: ['web', 'cli', 'auto', 'k8-operator', 'other'],
required: true
},
ipAddress: {
type: String
}
}, {
timestamps: true
}
timestamps: true
}
);
const Log = model<ILog>('Log', logSchema);

View File

@ -2,29 +2,23 @@ import { Schema, model, Types } from 'mongoose';
import {
SECRET_SHARED,
SECRET_PERSONAL,
ENV_DEV,
ENV_TESTING,
ENV_STAGING,
ENV_PROD
} from '../../variables';
export interface ISecretVersion {
_id: Types.ObjectId;
secret: Types.ObjectId;
version: number;
workspace: Types.ObjectId; // new
type: string; // new
user: Types.ObjectId; // new
user?: Types.ObjectId; // new
environment: string; // new
isDeleted: boolean;
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretKeyHash: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretValueHash: string;
tags?: string[];
}
const secretVersionSchema = new Schema<ISecretVersion>(
@ -56,7 +50,6 @@ const secretVersionSchema = new Schema<ISecretVersion>(
},
environment: {
type: String,
enum: [ENV_DEV, ENV_TESTING, ENV_STAGING, ENV_PROD],
required: true
},
isDeleted: { // consider removing field
@ -76,9 +69,6 @@ const secretVersionSchema = new Schema<ISecretVersion>(
type: String, // symmetric
required: true
},
secretKeyHash: {
type: String
},
secretValueCiphertext: {
type: String,
required: true
@ -91,9 +81,11 @@ const secretVersionSchema = new Schema<ISecretVersion>(
type: String, // symmetric
required: true
},
secretValueHash: {
type: String
}
tags: {
ref: 'Tag',
type: [Schema.Types.ObjectId],
default: []
},
},
{
timestamps: true

View File

@ -12,7 +12,7 @@ import { ADMIN, MEMBER } from '../../../variables';
router.get(
'/:secretId/secret-versions',
requireAuth({
acceptedAuthModes: ['jwt']
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireSecretAuth({
acceptedRoles: [ADMIN, MEMBER]
@ -27,7 +27,7 @@ router.get(
router.post(
'/:secretId/secret-versions/rollback',
requireAuth({
acceptedAuthModes: ['jwt']
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireSecretAuth({
acceptedRoles: [ADMIN, MEMBER]

View File

@ -12,7 +12,7 @@ import { workspaceController } from '../../controllers/v1';
router.get(
'/:workspaceId/secret-snapshots',
requireAuth({
acceptedAuthModes: ['jwt']
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER]
@ -40,7 +40,7 @@ router.get(
router.post(
'/:workspaceId/secret-snapshots/rollback',
requireAuth({
acceptedAuthModes: ['jwt']
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER]
@ -54,7 +54,7 @@ router.post(
router.get(
'/:workspaceId/logs',
requireAuth({
acceptedAuthModes: ['jwt']
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER]

View File

@ -1,14 +1,12 @@
import { Types } from 'mongoose';
import {
Log,
Action,
IAction
} from '../models';
import {
createLogHelper
} from '../helpers/log';
import {
createActionSecretHelper
createActionHelper
} from '../helpers/action';
import EELicenseService from './EELicenseService';
@ -33,8 +31,8 @@ class EELogService {
channel,
ipAddress
}: {
userId: string;
workspaceId: string;
userId: Types.ObjectId;
workspaceId?: Types.ObjectId;
actions: IAction[];
channel: string;
ipAddress: string;
@ -50,26 +48,26 @@ class EELogService {
}
/**
* Create an (audit) action for secrets including
* add, delete, update, and read actions.
* Create an (audit) action
* @param {Object} obj
* @param {String} obj.name - name of action
* @param {ObjectId[]} obj.secretIds - secret ids
* @param {Types.ObjectId} obj.userId - id of user associated with the action
* @param {Types.ObjectId} obj.workspaceId - id of workspace associated with the action
* @param {ObjectId[]} obj.secretIds - ids of secrets associated with the action
* @returns {Action} action - new action
*/
static async createActionSecret({
static async createAction({
name,
userId,
workspaceId,
secretIds
}: {
name: string;
userId: string;
workspaceId: string;
secretIds: Types.ObjectId[];
userId: Types.ObjectId;
workspaceId?: Types.ObjectId;
secretIds?: Types.ObjectId[];
}) {
if (!EELicenseService.isLicenseValid) return null;
return await createActionSecretHelper({
return await createActionHelper({
name,
userId,
workspaceId,

View File

@ -16,49 +16,66 @@ import {
AccountNotFoundError,
ServiceTokenDataNotFoundError,
APIKeyDataNotFoundError,
UnauthorizedRequestError
UnauthorizedRequestError,
BadRequestError
} from '../utils/errors';
// TODO 1: check if API key works
// TODO 2: optimize middleware
/**
* Validate that auth token value [authTokenValue] falls under one of
* accepted auth modes [acceptedAuthModes].
*
* @param {Object} obj
* @param {String} obj.authTokenValue - auth token value (e.g. JWT or service token value)
* @param {String[]} obj.acceptedAuthModes - accepted auth modes (e.g. jwt, serviceToken)
* @returns {String} authMode - auth mode
* @param {Object} obj.headers - HTTP request headers object
*/
const validateAuthMode = ({
authTokenValue,
headers,
acceptedAuthModes
}: {
authTokenValue: string;
acceptedAuthModes: string[];
headers: { [key: string]: string | string[] | undefined },
acceptedAuthModes: string[]
}) => {
let authMode;
try {
switch (authTokenValue.split('.', 1)[0]) {
// TODO: refactor middleware
const apiKey = headers['x-api-key'];
const authHeader = headers['authorization'];
let authTokenType, authTokenValue;
if (apiKey === undefined && authHeader === undefined) {
// case: neither Authorization nor X-API-KEY header present
throw BadRequestError({ message: 'Missing Authorization or X-API-KEY in request header.' });
}
if (typeof apiKey === 'string') {
// case: treat request authentication type as via X-API-KEY (i.e. API Key)
authTokenType = 'apiKey';
authTokenValue = apiKey;
}
if (typeof authHeader === 'string') {
// case: treat request authentication type as via Authorization header (i.e. either JWT or service token)
const [tokenType, tokenValue] = <[string, string]>authHeader.split(' ', 2) ?? [null, null]
if (tokenType === null)
throw BadRequestError({ message: `Missing Authorization Header in the request header.` });
if (tokenType.toLowerCase() !== 'bearer')
throw BadRequestError({ message: `The provided authentication type '${tokenType}' is not supported.` });
if (tokenValue === null)
throw BadRequestError({ message: 'Missing Authorization Body in the request header.' });
switch (tokenValue.split('.', 1)[0]) {
case 'st':
authMode = 'serviceToken';
break;
case 'ak':
authMode = 'apiKey';
authTokenType = 'serviceToken';
break;
default:
authMode = 'jwt';
break;
authTokenType = 'jwt';
}
if (!acceptedAuthModes.includes(authMode))
throw UnauthorizedRequestError({ message: 'Failed to authenticate auth mode' });
} catch (err) {
throw UnauthorizedRequestError({ message: 'Failed to authenticate auth mode' });
authTokenValue = tokenValue;
}
return authMode;
if (!authTokenType || !authTokenValue) throw BadRequestError({ message: 'Missing valid Authorization or X-API-KEY in request header.' });
if (!acceptedAuthModes.includes(authTokenType)) throw BadRequestError({ message: 'The provided authentication type is not supported.' });
return ({
authTokenType,
authTokenValue
});
}
/**
@ -91,7 +108,7 @@ const getAuthUserPayload = async ({
message: 'Failed to authenticate JWT token'
});
}
return user;
}
@ -113,7 +130,7 @@ const getAuthSTDPayload = async ({
// TODO: optimize double query
serviceTokenData = await ServiceTokenData
.findById(TOKEN_IDENTIFIER, '+secretHash +expiresAt');
if (!serviceTokenData) {
throw ServiceTokenDataNotFoundError({ message: 'Failed to find service token data' });
} else if (serviceTokenData?.expiresAt && new Date(serviceTokenData.expiresAt) < new Date()) {
@ -131,14 +148,14 @@ const getAuthSTDPayload = async ({
serviceTokenData = await ServiceTokenData
.findById(TOKEN_IDENTIFIER)
.select('+encryptedKey +iv +tag');
.select('+encryptedKey +iv +tag').populate('user');
} catch (err) {
throw UnauthorizedRequestError({
message: 'Failed to authenticate service token'
});
}
return serviceTokenData;
}
@ -156,11 +173,11 @@ const getAuthAPIKeyPayload = async ({
let user;
try {
const [_, TOKEN_IDENTIFIER, TOKEN_SECRET] = <[string, string, string]>authTokenValue.split('.', 3);
const apiKeyData = await APIKeyData
.findById(TOKEN_IDENTIFIER, '+secretHash +expiresAt')
.populate('user', '+publicKey');
if (!apiKeyData) {
throw APIKeyDataNotFoundError({ message: 'Failed to find API key data' });
} else if (apiKeyData?.expiresAt && new Date(apiKeyData.expiresAt) < new Date()) {
@ -175,14 +192,14 @@ const getAuthAPIKeyPayload = async ({
if (!isMatch) throw UnauthorizedRequestError({
message: 'Failed to authenticate API key'
});
user = apiKeyData.user;
} catch (err) {
throw UnauthorizedRequestError({
message: 'Failed to authenticate API key'
});
}
return user;
}
@ -194,7 +211,7 @@ const getAuthAPIKeyPayload = async ({
* @return {String} obj.token - issued JWT token
* @return {String} obj.refreshToken - issued refresh token
*/
const issueTokens = async ({ userId }: { userId: string }) => {
const issueAuthTokens = async ({ userId }: { userId: string }) => {
let token: string;
let refreshToken: string;
try {
@ -275,12 +292,12 @@ const createToken = ({
}
};
export {
export {
validateAuthMode,
getAuthUserPayload,
getAuthSTDPayload,
getAuthAPIKeyPayload,
createToken,
issueTokens,
clearTokens
createToken,
issueAuthTokens,
clearTokens
};
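For orientation, here is a minimal usage sketch of the reworked validateAuthMode helper; the header values are placeholders and the import path is an assumption, not taken from this diff.

// Hypothetical usage of the reworked validateAuthMode; values and import path are assumptions.
import { validateAuthMode } from '../helpers/auth';

// Request authenticated via the X-API-KEY header
const viaApiKey = validateAuthMode({
  headers: { 'x-api-key': 'ak.example.token' },
  acceptedAuthModes: ['jwt', 'apiKey']
});
// -> { authTokenType: 'apiKey', authTokenValue: 'ak.example.token' }

// Request authenticated via a Bearer service token in the Authorization header
const viaServiceToken = validateAuthMode({
  headers: { authorization: 'Bearer st.example.token' },
  acceptedAuthModes: ['serviceToken']
});
// -> { authTokenType: 'serviceToken', authTokenValue: 'st.example.token' }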

View File

@ -1,5 +1,4 @@
import mongoose from 'mongoose';
import { ISecret, Secret } from '../models';
import { EESecretService } from '../ee/services';
import { getLogger } from '../utils/logger';
@ -16,6 +15,10 @@ const initDatabaseHelper = async ({
}) => {
try {
await mongoose.connect(mongoURL);
// allow empty strings to pass the required validator
mongoose.Schema.Types.String.checkRequired(v => typeof v === 'string');
getLogger("database").info("Database connection established");
await EESecretService.initSecretVersioning();
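Since the checkRequired override above changes string validation globally, a small illustrative sketch may help; the schema and field names below are hypothetical.

// Illustrative only: 'Example' and 'comment' are hypothetical names.
// With the override, an empty string now satisfies a required: true string path.
import mongoose from 'mongoose';

mongoose.Schema.Types.String.checkRequired((v) => typeof v === 'string');

const exampleSchema = new mongoose.Schema({
  comment: { type: String, required: true }
});
const Example = mongoose.model('Example', exampleSchema);

const doc = new Example({ comment: '' });
console.log(doc.validateSync()); // undefined (no validation error) despite the empty string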

View File

@ -7,8 +7,6 @@ import {
import { exchangeCode, exchangeRefresh, syncSecrets } from '../integrations';
import { BotService } from '../services';
import {
ENV_DEV,
EVENT_PUSH_SECRETS,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY
} from '../variables';
@ -32,17 +30,19 @@ interface Update {
* @param {String} obj.workspaceId - id of workspace
* @param {String} obj.integration - name of integration
* @param {String} obj.code - code
* @returns {IntegrationAuth} integrationAuth - integration auth after OAuth2 code-token exchange
*/
const handleOAuthExchangeHelper = async ({
workspaceId,
integration,
code
code,
environment
}: {
workspaceId: string;
integration: string;
code: string;
environment: string;
}) => {
let action;
let integrationAuth;
try {
const bot = await Bot.findOne({
@ -94,25 +94,18 @@ const handleOAuthExchangeHelper = async ({
// set integration auth access token
await setIntegrationAuthAccessHelper({
integrationAuthId: integrationAuth._id.toString(),
accessId: null,
accessToken: res.accessToken,
accessExpiresAt: res.accessExpiresAt
});
}
// initialize new integration after exchange
await new Integration({
workspace: workspaceId,
environment: ENV_DEV,
isActive: false,
app: null,
integration,
integrationAuth: integrationAuth._id
}).save();
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to handle OAuth2 code-token exchange')
}
return integrationAuth;
}
/**
* Sync/push environment variables in workspace with id [workspaceId] to
@ -127,7 +120,6 @@ const syncIntegrationsHelper = async ({
}) => {
let integrations;
try {
integrations = await Integration.find({
workspace: workspaceId,
isActive: true,
@ -142,12 +134,12 @@ const syncIntegrationsHelper = async ({
workspaceId: integration.workspace.toString(),
environment: integration.environment
});
const integrationAuth = await IntegrationAuth.findById(integration.integrationAuth);
if (!integrationAuth) throw new Error('Failed to find integration auth');
// get integration auth access token
const accessToken = await getIntegrationAuthAccessHelper({
const access = await getIntegrationAuthAccessHelper({
integrationAuthId: integration.integrationAuth.toString()
});
@ -156,7 +148,8 @@ const syncIntegrationsHelper = async ({
integration,
integrationAuth,
secrets,
accessToken
accessId: access.accessId,
accessToken: access.accessToken
});
}
} catch (err) {
@ -212,12 +205,12 @@ const syncIntegrationsHelper = async ({
* @returns {Object} obj - decrypted access id (if present) and access token
*/
const getIntegrationAuthAccessHelper = async ({ integrationAuthId }: { integrationAuthId: string }) => {
let accessId;
let accessToken;
try {
const integrationAuth = await IntegrationAuth
.findById(integrationAuthId)
.select('workspace integration +accessCiphertext +accessIV +accessTag +accessExpiresAt + refreshCiphertext');
.select('workspace integration +accessCiphertext +accessIV +accessTag +accessExpiresAt + refreshCiphertext +accessIdCiphertext +accessIdIV +accessIdTag');
if (!integrationAuth) throw UnauthorizedRequestError({message: 'Failed to locate Integration Authentication credentials'});
@ -241,6 +234,15 @@ const getIntegrationAuthAccessHelper = async ({ integrationAuthId }: { integrati
});
}
}
if (integrationAuth?.accessIdCiphertext && integrationAuth?.accessIdIV && integrationAuth?.accessIdTag) {
accessId = await BotService.decryptSymmetric({
workspaceId: integrationAuth.workspace.toString(),
ciphertext: integrationAuth.accessIdCiphertext as string,
iv: integrationAuth.accessIdIV as string,
tag: integrationAuth.accessIdTag as string
});
}
} catch (err) {
Sentry.setUser(null);
@ -251,7 +253,10 @@ const getIntegrationAuthAccessHelper = async ({ integrationAuthId }: { integrati
throw new Error('Failed to get integration access token');
}
return accessToken;
return ({
accessId,
accessToken
});
}
/**
@ -301,9 +306,9 @@ const setIntegrationAuthRefreshHelper = async ({
}
/**
* Encrypt access token [accessToken] using the bot's copy
* of the workspace key for workspace belonging to integration auth
* with id [integrationAuthId] and store it along with [accessExpiresAt]
* Encrypt access token [accessToken] and (optionally) access id [accessId]
* using the bot's copy of the workspace key for workspace belonging to
* integration auth with id [integrationAuthId] and store it along with [accessExpiresAt]
* @param {Object} obj
* @param {String} obj.integrationAuthId - id of integration auth
* @param {String} obj.accessToken - access token
@ -311,12 +316,14 @@ const setIntegrationAuthRefreshHelper = async ({
*/
const setIntegrationAuthAccessHelper = async ({
integrationAuthId,
accessId,
accessToken,
accessExpiresAt
}: {
integrationAuthId: string;
accessId: string | null;
accessToken: string;
accessExpiresAt: Date;
accessExpiresAt: Date | undefined;
}) => {
let integrationAuth;
try {
@ -324,17 +331,28 @@ const setIntegrationAuthAccessHelper = async ({
if (!integrationAuth) throw new Error('Failed to find integration auth');
const obj = await BotService.encryptSymmetric({
const encryptedAccessTokenObj = await BotService.encryptSymmetric({
workspaceId: integrationAuth.workspace.toString(),
plaintext: accessToken
});
let encryptedAccessIdObj;
if (accessId) {
encryptedAccessIdObj = await BotService.encryptSymmetric({
workspaceId: integrationAuth.workspace.toString(),
plaintext: accessId
});
}
integrationAuth = await IntegrationAuth.findOneAndUpdate({
_id: integrationAuthId
}, {
accessCiphertext: obj.ciphertext,
accessIV: obj.iv,
accessTag: obj.tag,
accessIdCiphertext: encryptedAccessIdObj?.ciphertext ?? undefined,
accessIdIV: encryptedAccessIdObj?.iv ?? undefined,
accessIdTag: encryptedAccessIdObj?.tag ?? undefined,
accessCiphertext: encryptedAccessTokenObj.ciphertext,
accessIV: encryptedAccessTokenObj.iv,
accessTag: encryptedAccessTokenObj.tag,
accessExpiresAt
}, {
new: true

View File

@ -7,6 +7,7 @@ import { Membership, Key } from '../models';
* @param {Object} obj
* @param {String} obj.userId - id of user to validate
* @param {String} obj.workspaceId - id of workspace
* @returns {Membership} membership - membership of user with id [userId] for workspace with id [workspaceId]
*/
const validateMembership = async ({
userId,

View File

@ -1,6 +1,42 @@
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import { MembershipOrg, Workspace, Membership, Key } from '../models';
/**
* Validate that user with id [userId] is a member of organization with id [organizationId]
* and has at least one of the roles in [acceptedRoles]
*
*/
const validateMembership = async ({
userId,
organizationId,
acceptedRoles
}: {
userId: string;
organizationId: string;
acceptedRoles: string[];
}) => {
let membership;
try {
membership = await MembershipOrg.findOne({
user: new Types.ObjectId(userId),
organization: new Types.ObjectId(organizationId)
});
if (!membership) throw new Error('Failed to find organization membership');
if (!acceptedRoles.includes(membership.role)) {
throw new Error('Failed to validate organization membership role');
}
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to validate organization membership');
}
return membership;
}
/**
* Return organization membership matching criteria specified in
* query [queryObj]
@ -84,6 +120,8 @@ const deleteMembershipOrg = async ({
_id: membershipOrgId
});
if (!deletedMembershipOrg) throw new Error('Failed to delete organization membership');
// delete keys associated with organization membership
if (deletedMembershipOrg?.user) {
// case: organization membership had a registered user
@ -117,4 +155,9 @@ const deleteMembershipOrg = async ({
return deletedMembershipOrg;
};
export { findMembershipOrg, addMembershipsOrg, deleteMembershipOrg };
export {
validateMembership,
findMembershipOrg,
addMembershipsOrg,
deleteMembershipOrg
};
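A minimal sketch of calling the new organization-level validateMembership; the ids are placeholders and the import path is assumed.

// Hypothetical usage of the organization membership validator; ids are placeholders.
import { validateMembership } from '../helpers/membershipOrg';
import { OWNER, ADMIN } from '../variables';

// Throws if the user is not an OWNER or ADMIN of the organization
const membershipOrg = await validateMembership({
  userId: '63f2b1c4e8a1f2d3c4b5a697',
  organizationId: '63f2b1c4e8a1f2d3c4b5a698',
  acceptedRoles: [OWNER, ADMIN]
});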

View File

@ -3,6 +3,7 @@ import Stripe from 'stripe';
import {
STRIPE_SECRET_KEY,
STRIPE_PRODUCT_STARTER,
STRIPE_PRODUCT_TEAM,
STRIPE_PRODUCT_PRO
} from '../config';
const stripe = new Stripe(STRIPE_SECRET_KEY, {
@ -14,6 +15,7 @@ import { Organization, MembershipOrg } from '../models';
const productToPriceMap = {
starter: STRIPE_PRODUCT_STARTER,
team: STRIPE_PRODUCT_TEAM,
pro: STRIPE_PRODUCT_PRO
};
@ -55,7 +57,7 @@ const createOrganization = async ({
} catch (err) {
Sentry.setUser({ email });
Sentry.captureException(err);
throw new Error('Failed to create organization');
throw new Error(`Failed to create organization [err=${err}]`);
}
return organization;

View File

@ -1,38 +1,43 @@
import rateLimit from 'express-rate-limit';
// 300 requests per 15 minutes
// 240 requests per minute
const apiLimiter = rateLimit({
windowMs: 15 * 60 * 1000,
max: 450,
windowMs: 60 * 1000,
max: 240,
standardHeaders: true,
legacyHeaders: false,
skip: (request) => {
return request.path === '/healthcheck' || request.path === '/api/status'
},
keyGenerator: (req, res) => {
return req.clientIp
}
});
// 5 requests per hour
const signupLimiter = rateLimit({
windowMs: 60 * 60 * 1000,
// 10 requests per minute
const authLimiter = rateLimit({
windowMs: 60 * 1000,
max: 10,
standardHeaders: true,
legacyHeaders: false
legacyHeaders: false,
keyGenerator: (req, res) => {
return req.clientIp
}
});
// 10 requests per hour
const loginLimiter = rateLimit({
windowMs: 60 * 60 * 1000,
max: 25,
standardHeaders: true,
legacyHeaders: false
});
// 10 requests per hour
const passwordLimiter = rateLimit({
windowMs: 60 * 60 * 1000,
max: 10,
standardHeaders: true,
legacyHeaders: false
legacyHeaders: false,
keyGenerator: (req, res) => {
return req.clientIp
}
});
export { apiLimiter, signupLimiter, loginLimiter, passwordLimiter };
export {
apiLimiter,
authLimiter,
passwordLimiter
};
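For reference, a hedged sketch of how these limiters might be mounted on the Express app; the route paths are assumptions, and the req.clientIp key generator presumes an IP-resolution middleware (e.g. request-ip) runs beforehand.

// Illustrative wiring only; route paths and import path are assumptions.
import express from 'express';
import { apiLimiter, authLimiter, passwordLimiter } from './helpers/rateLimiter';

const app = express();

app.set('trust proxy', true);                 // resolve real client IPs behind a proxy
app.use('/api', apiLimiter);                  // 240 requests per minute per client IP
app.use('/api/v1/auth', authLimiter);         // 10 requests per minute per client IP
app.use('/api/v1/password', passwordLimiter); // 10 requests per hour per client IP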

View File

@ -12,14 +12,17 @@ import {
import {
IAction
} from '../ee/models';
import {
SECRET_SHARED,
import {
SECRET_SHARED,
SECRET_PERSONAL,
ACTION_ADD_SECRETS,
ACTION_UPDATE_SECRETS,
ACTION_DELETE_SECRETS,
ACTION_READ_SECRETS
} from '../variables';
import _ from 'lodash';
import { ABILITY_WRITE } from '../variables/organization';
import { BadRequestError, UnauthorizedRequestError } from '../utils/errors';
/**
* Validate that user with id [userId] can modify secrets with ids [secretIds]
@ -34,28 +37,39 @@ const validateSecrets = async ({
}: {
userId: string;
secretIds: string[];
}) =>{
}) => {
let secrets;
try {
secrets = await Secret.find({
_id: {
$in: secretIds
$in: secretIds.map((secretId: string) => new Types.ObjectId(secretId))
}
});
const workspaceIdsSet = new Set((await Membership.find({
user: userId
}, 'workspace'))
.map((m) => m.workspace.toString()));
if (secrets.length != secretIds.length) {
throw BadRequestError({ message: 'Unable to validate some secrets' })
}
const userMemberships = await Membership.find({ user: userId })
const userMembershipById = _.keyBy(userMemberships, 'workspace');
const workspaceIdsSet = new Set(userMemberships.map((m) => m.workspace.toString()));
// for each secret check if the secret belongs to a workspace the user is a member of
secrets.forEach((secret: ISecret) => {
if (!workspaceIdsSet.has(secret.workspace.toString())) {
throw new Error('Failed to validate secret');
if (workspaceIdsSet.has(secret.workspace.toString())) {
const deniedMembershipPermissions = userMembershipById[secret.workspace.toString()].deniedPermissions;
const isDisallowed = _.some(deniedMembershipPermissions, { environmentSlug: secret.environment, ability: ABILITY_WRITE });
if (isDisallowed) {
throw UnauthorizedRequestError({ message: 'You do not have the required permissions to perform this action' });
}
} else {
throw BadRequestError({ message: 'You cannot edit secrets of a workspace you are not a member of' });
}
});
} catch (err) {
throw new Error('Failed to validate secrets');
throw BadRequestError({ message: 'Unable to validate secrets' })
}
return secrets;
@ -127,13 +141,13 @@ const v1PushSecrets = async ({
workspaceId,
environment
});
const oldSecretsObj: any = oldSecrets.reduce((accumulator, s: any) =>
const oldSecretsObj: any = oldSecrets.reduce((accumulator, s: any) =>
({ ...accumulator, [`${s.type}-${s.secretKeyHash}`]: s })
, {});
const newSecretsObj: any = secrets.reduce((accumulator, s) =>
, {});
const newSecretsObj: any = secrets.reduce((accumulator, s) =>
({ ...accumulator, [`${s.type}-${s.hashKey}`]: s })
, {});
, {});
// handle deleting secrets
const toDelete = oldSecrets
@ -150,12 +164,12 @@ const v1PushSecrets = async ({
secretIds: toDelete
});
}
const toUpdate = oldSecrets
.filter((s) => {
if (`${s.type}-${s.secretKeyHash}` in newSecretsObj) {
if (s.secretValueHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].hashValue
|| s.secretCommentHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].hashComment) {
if (s.secretValueHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].hashValue
|| s.secretCommentHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].hashComment) {
// case: filter secrets where value or comment changed
return true;
}
@ -165,7 +179,7 @@ const v1PushSecrets = async ({
return true;
}
}
return false;
});
@ -217,7 +231,7 @@ const v1PushSecrets = async ({
};
});
await Secret.bulkWrite(operations as any);
// (EE) add secret versions for updated secrets
await EESecretService.addSecretVersions({
secretVersions: toUpdate.map(({
@ -245,7 +259,7 @@ const v1PushSecrets = async ({
secretValueTag: newSecret.tagValue,
secretValueHash: newSecret.hashValue
})
})
})
});
// handle adding new secrets
@ -319,7 +333,7 @@ const v1PushSecrets = async ({
}))
});
}
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId
@ -344,7 +358,7 @@ const v1PushSecrets = async ({
* @param {String} obj.channel - channel (web/cli/auto)
* @param {String} obj.ipAddress - ip address of request to push secrets
*/
const v2PushSecrets = async ({
const v2PushSecrets = async ({
userId,
workspaceId,
environment,
@ -362,20 +376,20 @@ const v1PushSecrets = async ({
// TODO: clean up function and fix up types
try {
const actions: IAction[] = [];
// construct useful data structures
const oldSecrets = await getSecrets({
userId,
workspaceId,
environment
});
const oldSecretsObj: any = oldSecrets.reduce((accumulator, s: any) =>
const oldSecretsObj: any = oldSecrets.reduce((accumulator, s: any) =>
({ ...accumulator, [`${s.type}-${s.secretKeyHash}`]: s })
, {});
const newSecretsObj: any = secrets.reduce((accumulator, s) =>
, {});
const newSecretsObj: any = secrets.reduce((accumulator, s) =>
({ ...accumulator, [`${s.type}-${s.secretKeyHash}`]: s })
, {});
, {});
// handle deleting secrets
const toDelete = oldSecrets
@ -391,22 +405,22 @@ const v1PushSecrets = async ({
await EESecretService.markDeletedSecretVersions({
secretIds: toDelete
});
const deleteAction = await EELogService.createActionSecret({
const deleteAction = await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
userId,
workspaceId,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId),
secretIds: toDelete
});
deleteAction && actions.push(deleteAction);
}
const toUpdate = oldSecrets
.filter((s) => {
if (`${s.type}-${s.secretKeyHash}` in newSecretsObj) {
if (s.secretValueHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].secretValueHash
|| s.secretCommentHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].secretCommentHash) {
if (s.secretValueHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].secretValueHash
|| s.secretCommentHash !== newSecretsObj[`${s.type}-${s.secretKeyHash}`].secretCommentHash) {
// case: filter secrets where value or comment changed
return true;
}
@ -416,7 +430,7 @@ const v1PushSecrets = async ({
return true;
}
}
return false;
});
@ -469,7 +483,7 @@ const v1PushSecrets = async ({
};
});
await Secret.bulkWrite(operations as any);
// (EE) add secret versions for updated secrets
await EESecretService.addSecretVersions({
secretVersions: toUpdate.map((s) => {
@ -482,13 +496,13 @@ const v1PushSecrets = async ({
environment: s.environment,
isDeleted: false
})
})
})
});
const updateAction = await EELogService.createActionSecret({
const updateAction = await EELogService.createAction({
name: ACTION_UPDATE_SECRETS,
userId,
workspaceId,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId),
secretIds: toUpdate.map((u) => u._id)
});
@ -507,29 +521,30 @@ const v1PushSecrets = async ({
workspace: workspaceId,
type: toAdd[idx].type,
environment,
...( toAdd[idx].type === 'personal' ? { user: userId } : {})
...(toAdd[idx].type === 'personal' ? { user: userId } : {})
}))
);
// (EE) add secret versions for new secrets
EESecretService.addSecretVersions({
secretVersions: newSecrets.map((secretDocument) => {
secretVersions: newSecrets.map((secretDocument) => {
return {
...secretDocument.toObject(),
secret: secretDocument._id,
isDeleted: false
}})
}
})
});
const addAction = await EELogService.createActionSecret({
const addAction = await EELogService.createAction({
name: ACTION_ADD_SECRETS,
userId,
workspaceId,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId),
secretIds: newSecrets.map((n) => n._id)
});
addAction && actions.push(addAction);
}
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId
@ -538,8 +553,8 @@ const v1PushSecrets = async ({
// (EE) create (audit) log
if (actions.length > 0) {
await EELogService.createLog({
userId,
workspaceId,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId),
actions,
channel,
ipAddress
@ -560,7 +575,7 @@ const v1PushSecrets = async ({
* @param {String} obj.workspaceId - id of workspace to pull from
* @param {String} obj.environment - environment for secrets
*/
const getSecrets = async ({
const getSecrets = async ({
userId,
workspaceId,
environment
@ -570,7 +585,7 @@ const v1PushSecrets = async ({
environment: string;
}): Promise<ISecret[]> => {
let secrets: any; // TODO: FIX any
try {
// get shared workspace secrets
const sharedSecrets = await Secret.find({
@ -622,7 +637,7 @@ const pullSecrets = async ({
ipAddress: string;
}): Promise<ISecret[]> => {
let secrets: any;
try {
secrets = await getSecrets({
userId,
@ -630,16 +645,16 @@ const pullSecrets = async ({
environment
})
const readAction = await EELogService.createActionSecret({
const readAction = await EELogService.createAction({
name: ACTION_READ_SECRETS,
userId,
workspaceId,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId),
secretIds: secrets.map((n: any) => n._id)
});
readAction && await EELogService.createLog({
userId,
workspaceId,
userId: new Types.ObjectId(userId),
workspaceId: new Types.ObjectId(workspaceId),
actions: [readAction],
channel,
ipAddress
@ -698,10 +713,27 @@ const reformatPullSecrets = ({ secrets }: { secrets: ISecret[] }) => {
return reformatedSecrets;
};
const secretObjectHasRequiredFields = (secretObject: ISecret) => {
if (!secretObject.type ||
!(secretObject.type === SECRET_PERSONAL || secretObject.type === SECRET_SHARED) ||
!secretObject.secretKeyCiphertext ||
!secretObject.secretKeyIV ||
!secretObject.secretKeyTag ||
(typeof secretObject.secretValueCiphertext !== 'string') ||
!secretObject.secretValueIV ||
!secretObject.secretValueTag) {
return false
}
return true
}
export {
validateSecrets,
v1PushSecrets,
v2PushSecrets,
pullSecrets,
reformatPullSecrets
reformatPullSecrets,
secretObjectHasRequiredFields
};
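To make the per-environment write check in validateSecrets above concrete, here is a small standalone sketch of the lodash predicate it relies on; the membership shape and the ABILITY_WRITE value are assumptions based on the fields referenced in the diff.

// Standalone illustration of the denied-permission check; the data below is hypothetical.
import _ from 'lodash';

const ABILITY_WRITE = 'write'; // assumed value of the imported constant

const membership = {
  deniedPermissions: [
    { environmentSlug: 'prod', ability: ABILITY_WRITE }
  ]
};

const secretEnvironment = 'prod';
const isDisallowed = _.some(membership.deniedPermissions, {
  environmentSlug: secretEnvironment,
  ability: ABILITY_WRITE
});
// isDisallowed === true -> the user may not modify secrets in the 'prod' environment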

View File

@ -1,12 +1,11 @@
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
import { Token, IToken, IUser } from '../models';
import { IUser } from '../models';
import { createOrganization } from './organization';
import { addMembershipsOrg } from './membershipOrg';
import { createWorkspace } from './workspace';
import { addMemberships } from './membership';
import { OWNER, ADMIN, ACCEPTED } from '../variables';
import { OWNER, ACCEPTED } from '../variables';
import { sendMail } from '../helpers/nodemailer';
import { TokenService } from '../services';
import { TOKEN_EMAIL_CONFIRMATION } from '../variables';
/**
* Send magic link to verify email to [email]
@ -14,21 +13,13 @@ import { sendMail } from '../helpers/nodemailer';
* @param {Object} obj
* @param {String} obj.email - email
* @returns {Boolean} success - whether or not operation was successful
*
*/
const sendEmailVerification = async ({ email }: { email: string }) => {
try {
const token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
await Token.findOneAndUpdate(
{ email },
{
email,
token,
createdAt: new Date()
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_CONFIRMATION,
email
});
// send mail
await sendMail({
@ -62,12 +53,11 @@ const checkEmailVerification = async ({
code: string;
}) => {
try {
const token = await Token.findOneAndDelete({
await TokenService.validateToken({
type: TOKEN_EMAIL_CONFIRMATION,
email,
token: code
});
if (!token) throw new Error('Failed to find email verification token');
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
@ -103,20 +93,8 @@ const initializeDefaultOrg = async ({
roles: [OWNER],
statuses: [ACCEPTED]
});
// initialize a default workspace inside the new organization
const workspace = await createWorkspace({
name: `Example Project`,
organizationId: organization._id.toString()
});
await addMemberships({
userIds: [user._id.toString()],
workspaceId: workspace._id.toString(),
roles: [ADMIN]
});
} catch (err) {
throw new Error('Failed to initialize default organization and workspace');
throw new Error(`Failed to initialize default organization and workspace [err=${err}]`);
}
};

View File

@ -0,0 +1,217 @@
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import { TokenData } from '../models';
import crypto from 'crypto';
import bcrypt from 'bcrypt';
import {
TOKEN_EMAIL_CONFIRMATION,
TOKEN_EMAIL_MFA,
TOKEN_EMAIL_ORG_INVITATION,
TOKEN_EMAIL_PASSWORD_RESET
} from '../variables';
import {
SALT_ROUNDS
} from '../config';
import { UnauthorizedRequestError } from '../utils/errors';
/**
* Create and store a token in the database for purpose [type]
* @param {Object} obj
* @param {String} obj.type
* @param {String} obj.email
* @param {String} obj.phoneNumber
* @param {Types.ObjectId} obj.organizationId
* @returns {String} token - the created token
*/
const createTokenHelper = async ({
type,
email,
phoneNumber,
organizationId
}: {
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
email?: string;
phoneNumber?: string;
organizationId?: Types.ObjectId
}) => {
let token, expiresAt, triesLeft;
try {
// generate random token based on specified token use-case
// type [type]
switch (type) {
case TOKEN_EMAIL_CONFIRMATION:
// generate random 6-digit code
token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
expiresAt = new Date((new Date()).getTime() + 86400000);
break;
case TOKEN_EMAIL_MFA:
// generate random 6-digit code
token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
triesLeft = 5;
expiresAt = new Date((new Date()).getTime() + 300000);
break;
case TOKEN_EMAIL_ORG_INVITATION:
// generate random hex
token = crypto.randomBytes(16).toString('hex');
expiresAt = new Date((new Date()).getTime() + 259200000);
break;
case TOKEN_EMAIL_PASSWORD_RESET:
// generate random hex
token = crypto.randomBytes(16).toString('hex');
expiresAt = new Date((new Date()).getTime() + 86400000);
break;
default:
token = crypto.randomBytes(16).toString('hex');
expiresAt = new Date();
break;
}
interface TokenDataQuery {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
}
interface TokenDataUpdate {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
tokenHash: string;
triesLeft?: number;
expiresAt: Date;
}
const query: TokenDataQuery = { type };
const update: TokenDataUpdate = {
type,
tokenHash: await bcrypt.hash(token, SALT_ROUNDS),
expiresAt
}
if (email) {
query.email = email;
update.email = email;
}
if (phoneNumber) {
query.phoneNumber = phoneNumber;
update.phoneNumber = phoneNumber;
}
if (organizationId) {
query.organization = organizationId
update.organization = organizationId
}
if (triesLeft) {
update.triesLeft = triesLeft;
}
await TokenData.findOneAndUpdate(
query,
update,
{
new: true,
upsert: true
}
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error(
"Failed to create token"
);
}
return token;
}
/**
* Validate whether token [token] matches the stored token hash for purpose [type]
* @param {Object} obj
* @param {String} obj.type - purpose of the token (e.g. emailConfirmation, emailMfa)
* @param {String} obj.email - email associated with the token
* @param {String} obj.phoneNumber - phone number associated with the token
* @param {Types.ObjectId} obj.organizationId - id of organization associated with the token
* @param {String} obj.token - value of the token
*/
const validateTokenHelper = async ({
type,
email,
phoneNumber,
organizationId,
token
}: {
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
email?: string;
phoneNumber?: string;
organizationId?: Types.ObjectId;
token: string;
}) => {
interface Query {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
}
const query: Query = { type };
if (email) { query.email = email; }
if (phoneNumber) { query.phoneNumber = phoneNumber; }
if (organizationId) { query.organization = organizationId; }
const tokenData = await TokenData.findOne(query).select('+tokenHash');
if (!tokenData) throw new Error('Failed to find token to validate');
if (tokenData.expiresAt < new Date()) {
// case: token expired
await TokenData.findByIdAndDelete(tokenData._id);
throw UnauthorizedRequestError({
message: 'MFA session expired. Please log in again',
context: {
code: 'mfa_expired'
}
});
}
const isValid = await bcrypt.compare(token, tokenData.tokenHash);
if (!isValid) {
// case: token is not valid
if (tokenData?.triesLeft !== undefined) {
// case: token has a try-limit
if (tokenData.triesLeft === 1) {
// case: token is out of tries
await TokenData.findByIdAndDelete(tokenData._id);
} else {
// case: token has more than 1 try left
await TokenData.findByIdAndUpdate(tokenData._id, {
triesLeft: tokenData.triesLeft - 1
}, {
new: true
});
}
throw UnauthorizedRequestError({
message: 'MFA code is invalid',
context: {
code: 'mfa_invalid',
triesLeft: tokenData.triesLeft - 1
}
});
}
throw UnauthorizedRequestError({
message: 'MFA code is invalid',
context: {
code: 'mfa_invalid'
}
});
}
// case: token is valid
await TokenData.findByIdAndDelete(tokenData._id);
}
export {
createTokenHelper,
validateTokenHelper
}
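A hedged example of the intended round trip through the new token helpers, using the email-confirmation flow; the email address is a placeholder and the import path is assumed.

// Hypothetical round trip through the token helpers; the email is a placeholder.
import { createTokenHelper, validateTokenHelper } from '../helpers/token';
import { TOKEN_EMAIL_CONFIRMATION } from '../variables';

// 1. Create and store a 6-digit confirmation code (only its bcrypt hash is persisted)
const token = await createTokenHelper({
  type: TOKEN_EMAIL_CONFIRMATION, // assumed to equal 'emailConfirmation'
  email: 'user@example.com'
});

// 2. Later, validate the code the user typed in; throws UnauthorizedRequestError on
//    mismatch or expiry and deletes the stored TokenData document on success.
await validateTokenHelper({
  type: TOKEN_EMAIL_CONFIRMATION,
  email: 'user@example.com',
  token
});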

View File

@ -1,5 +1,6 @@
import * as Sentry from '@sentry/node';
import { User, IUser } from '../models';
import { IUser, User } from '../models';
import { sendMail } from './nodemailer';
/**
* Initialize a user under email [email]
@ -28,10 +29,14 @@ const setupAccount = async ({ email }: { email: string }) => {
* @param {String} obj.userId - id of user to finish setting up
* @param {String} obj.firstName - first name of user
* @param {String} obj.lastName - last name of user
* @param {Number} obj.encryptionVersion - version of auth encryption scheme used
* @param {String} obj.protectedKey - protected key in encryption version 2
* @param {String} obj.protectedKeyIV - IV of protected key in encryption version 2
* @param {String} obj.protectedKeyTag - tag of protected key in encryption version 2
* @param {String} obj.publicKey - public key of user
* @param {String} obj.encryptedPrivateKey - (encrypted) private key of user
* @param {String} obj.iv - iv for (encrypted) private key of user
* @param {String} obj.tag - tag for (encrypted) private key of user
* @param {String} obj.encryptedPrivateKeyIV - iv for (encrypted) private key of user
* @param {String} obj.encryptedPrivateKeyTag - tag for (encrypted) private key of user
* @param {String} obj.salt - salt for auth SRP
* @param {String} obj.verifier - verifier for auth SRP
* @returns {Object} user - the completed user
@ -40,20 +45,28 @@ const completeAccount = async ({
userId,
firstName,
lastName,
encryptionVersion,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
iv,
tag,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
}: {
userId: string;
firstName: string;
lastName: string;
encryptionVersion: number;
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
publicKey: string;
encryptedPrivateKey: string;
iv: string;
tag: string;
encryptedPrivateKeyIV: string;
encryptedPrivateKeyTag: string;
salt: string;
verifier: string;
}) => {
@ -67,10 +80,14 @@ const completeAccount = async ({
{
firstName,
lastName,
encryptionVersion,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
@ -85,4 +102,48 @@ const completeAccount = async ({
return user;
};
export { setupAccount, completeAccount };
/**
* Check if device with ip [ip] and user-agent [userAgent] has been seen for user [user].
* If the device is unseen, then notify the user of the new device
* @param {Object} obj
* @param {String} obj.ip - login ip address
* @param {String} obj.userAgent - login user-agent
*/
const checkUserDevice = async ({
user,
ip,
userAgent
}: {
user: IUser;
ip: string;
userAgent: string;
}) => {
const isDeviceSeen = user.devices.some((device) => device.ip === ip && device.userAgent === userAgent);
if (!isDeviceSeen) {
// case: unseen login ip detected for user
// -> notify user about the sign-in from new ip
user.devices = user.devices.concat([{
ip: String(ip),
userAgent
}]);
await user.save();
// send MFA code [code] to [email]
await sendMail({
template: 'newDevice.handlebars',
subjectLine: `Successful login from new device`,
recipients: [user.email],
substitutions: {
email: user.email,
timestamp: new Date().toString(),
ip,
userAgent
}
});
}
}
export { setupAccount, completeAccount, checkUserDevice };
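A brief, hedged sketch of where checkUserDevice might be called from a login handler; the request-handling details and import path are assumptions.

// Hypothetical call site inside a login handler; req shape and import path are assumptions.
import { Request } from 'express';
import { IUser } from '../models';
import { checkUserDevice } from '../helpers/user';

const onSuccessfulLogin = async (req: Request, user: IUser) => {
  // Record the device and email the user if this ip/user-agent pair is new
  await checkUserDevice({
    user,
    ip: req.ip,
    userAgent: req.headers['user-agent'] ?? ''
  });
};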

View File

@ -8,6 +8,7 @@ import { DatabaseService } from './services';
import { setUpHealthEndpoint } from './services/health';
import { initSmtp } from './services/smtp';
import { setTransporter } from './helpers/nodemailer';
import { createTestUserForDevelopment } from './utils/addDevelopmentUser';
DatabaseService.initDatabase(MONGO_URL);
@ -23,3 +24,5 @@ if (NODE_ENV !== 'test') {
environment: NODE_ENV
});
}
createTestUserForDevelopment()

View File

@ -1,16 +1,27 @@
import axios from 'axios';
import * as Sentry from '@sentry/node';
import { Octokit } from '@octokit/rest';
import { IIntegrationAuth } from '../models';
import * as Sentry from "@sentry/node";
import { Octokit } from "@octokit/rest";
import { IIntegrationAuth } from "../models";
import request from '../config/request';
import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
INTEGRATION_HEROKU_API_URL,
INTEGRATION_VERCEL_API_URL,
INTEGRATION_NETLIFY_API_URL
} from '../variables';
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_RENDER_API_URL,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_TRAVISCI_API_URL,
} from "../variables";
/**
* Return list of names of apps for integration named [integration]
@ -22,54 +33,82 @@ import {
*/
const getApps = async ({
integrationAuth,
accessToken
accessToken,
}: {
integrationAuth: IIntegrationAuth;
accessToken: string;
}) => {
interface App {
name: string;
siteId?: string;
appId?: string;
owner?: string;
}
let apps: App[]; // TODO: add type and define payloads for apps
let apps: App[] = [];
try {
switch (integrationAuth.integration) {
case INTEGRATION_AZURE_KEY_VAULT:
apps = [];
break;
case INTEGRATION_AWS_PARAMETER_STORE:
apps = [];
break;
case INTEGRATION_AWS_SECRET_MANAGER:
apps = [];
break;
case INTEGRATION_HEROKU:
apps = await getAppsHeroku({
accessToken
accessToken,
});
break;
case INTEGRATION_VERCEL:
apps = await getAppsVercel({
integrationAuth,
accessToken
accessToken,
});
break;
case INTEGRATION_NETLIFY:
apps = await getAppsNetlify({
integrationAuth,
accessToken
accessToken,
});
break;
case INTEGRATION_GITHUB:
apps = await getAppsGithub({
integrationAuth,
accessToken
accessToken,
});
break;
case INTEGRATION_RENDER:
apps = await getAppsRender({
accessToken,
});
break;
case INTEGRATION_FLYIO:
apps = await getAppsFlyio({
accessToken,
});
break;
case INTEGRATION_CIRCLECI:
apps = await getAppsCircleCI({
accessToken,
});
break;
case INTEGRATION_TRAVISCI:
apps = await getAppsTravisCI({
accessToken,
})
break;
}
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to get integration apps');
throw new Error("Failed to get integration apps");
}
return apps;
};
/**
* Return list of names of apps for Heroku integration
* Return list of apps for Heroku integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Heroku API
* @returns {Object[]} apps - names of Heroku apps
@ -79,21 +118,21 @@ const getAppsHeroku = async ({ accessToken }: { accessToken: string }) => {
let apps;
try {
const res = (
await axios.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
await request.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
headers: {
Accept: 'application/vnd.heroku+json; version=3',
Authorization: `Bearer ${accessToken}`
}
Accept: "application/vnd.heroku+json; version=3",
Authorization: `Bearer ${accessToken}`,
},
})
).data;
apps = res.map((a: any) => ({
name: a.name
name: a.name,
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to get Heroku integration apps');
throw new Error("Failed to get Heroku integration apps");
}
return apps;
@ -106,50 +145,9 @@ const getAppsHeroku = async ({ accessToken }: { accessToken: string }) => {
* @returns {Object[]} apps - names of Vercel apps
* @returns {String} apps.name - name of Vercel app
*/
const getAppsVercel = async ({
const getAppsVercel = async ({
integrationAuth,
accessToken
}: {
integrationAuth: IIntegrationAuth;
accessToken: string;
}) => {
let apps;
try {
const res = (
await axios.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
headers: {
Authorization: `Bearer ${accessToken}`
},
...( integrationAuth?.teamId ? {
params: {
teamId: integrationAuth.teamId
}
} : {})
})
).data;
apps = res.projects.map((a: any) => ({
name: a.name
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to get Vercel integration apps');
}
return apps;
};
/**
* Return list of names of sites for Netlify integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Netlify API
* @returns {Object[]} apps - names of Netlify sites
* @returns {String} apps.name - name of Netlify site
*/
const getAppsNetlify = async ({
integrationAuth,
accessToken
accessToken,
}: {
integrationAuth: IIntegrationAuth;
accessToken: string;
@ -157,64 +155,250 @@ const getAppsNetlify = async ({
let apps;
try {
const res = (
await axios.get(`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`, {
await request.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
headers: {
Authorization: `Bearer ${accessToken}`
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
...(integrationAuth?.teamId
? {
params: {
teamId: integrationAuth.teamId,
},
}
: {}),
})
).data;
apps = res.projects.map((a: any) => ({
name: a.name,
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to get Vercel integration apps");
}
return apps;
};
/**
* Return list of sites for Netlify integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Netlify API
* @returns {Object[]} apps - names of Netlify sites
* @returns {String} apps.name - name of Netlify site
*/
const getAppsNetlify = async ({ accessToken }: { accessToken: string }) => {
let apps;
try {
const res = (
await request.get(`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`, {
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
})
).data;
apps = res.map((a: any) => ({
name: a.name,
siteId: a.site_id
appId: a.site_id,
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to get Netlify integration apps');
throw new Error("Failed to get Netlify integration apps");
}
return apps;
};
/**
* Return list of names of repositories for Github integration
* Return list of repositories for Github integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Github API
* @returns {Object[]} apps - names and owners of Github repositories
* @returns {String} apps.name - name of Github repository
*/
const getAppsGithub = async ({
integrationAuth,
accessToken
}: {
integrationAuth: IIntegrationAuth;
accessToken: string;
}) => {
const getAppsGithub = async ({ accessToken }: { accessToken: string }) => {
let apps;
try {
const octokit = new Octokit({
auth: accessToken
auth: accessToken,
});
const repos = (await octokit.request(
'GET /user/repos{?visibility,affiliation,type,sort,direction,per_page,page,since,before}',
{}
)).data;
const repos = (
await octokit.request(
"GET /user/repos{?visibility,affiliation,type,sort,direction,per_page,page,since,before}",
{
per_page: 100,
}
)
).data;
apps = repos
.filter((a:any) => a.permissions.admin === true)
.filter((a: any) => a.permissions.admin === true)
.map((a: any) => ({
name: a.name
})
);
name: a.name,
owner: a.owner.login,
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to get Github repos');
throw new Error("Failed to get Github repos");
}
return apps;
};
/**
* Return list of services for Render integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Render API
* @returns {Object[]} apps - names and ids of Render services
* @returns {String} apps.name - name of Render service
* @returns {String} apps.appId - id of Render service
*/
const getAppsRender = async ({ accessToken }: { accessToken: string }) => {
let apps: any;
try {
const res = (
await request.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: 'application/json',
'Accept-Encoding': 'application/json',
},
})
).data;
apps = res
.map((a: any) => ({
name: a.service.name,
appId: a.service.id
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to get Render services");
}
return apps;
};
/**
* Return list of apps for Fly.io integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Fly.io API
* @returns {Object[]} apps - names of Fly.io apps
* @returns {String} apps.name - name of Fly.io app
*/
const getAppsFlyio = async ({ accessToken }: { accessToken: string }) => {
let apps;
try {
const query = `
query($role: String) {
apps(type: "container", first: 400, role: $role) {
nodes {
id
name
hostname
}
}
}
`;
const res = (await request.post(INTEGRATION_FLYIO_API_URL, {
query,
variables: {
role: null,
},
}, {
headers: {
Authorization: "Bearer " + accessToken,
'Accept': 'application/json',
'Accept-Encoding': 'application/json',
},
})).data.data.apps.nodes;
apps = res.map((a: any) => ({
name: a.name,
}));
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to get Fly.io apps");
}
return apps;
};
/**
* Return list of projects for CircleCI integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for CircleCI API
* @returns {Object[]} apps - names of CircleCI projects
* @returns {String} apps.name - name of CircleCI project
*/
const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
let apps: any;
try {
const res = (
await request.get(
`${INTEGRATION_CIRCLECI_API_URL}/v1.1/projects`,
{
headers: {
"Circle-Token": accessToken,
"Accept-Encoding": "application/json",
},
}
)
).data
apps = res?.map((a: any) => {
return {
name: a?.reponame
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to get CircleCI projects");
}
return apps;
};
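/**
* Return list of repositories for TravisCI integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for TravisCI API
* @returns {Object[]} apps - names and ids of TravisCI repositories
* @returns {String} apps.name - name of TravisCI repository
*/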
const getAppsTravisCI = async ({ accessToken }: { accessToken: string }) => {
let apps: any;
try {
const res = (
await request.get(
`${INTEGRATION_TRAVISCI_API_URL}/repos`,
{
headers: {
"Authorization": `token ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
)
).data;
apps = res?.map((a: any) => {
return {
name: a?.slug?.split("/")[1],
appId: a?.id,
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to get TravisCI projects");
}
return apps;
}
export { getApps };
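A short, hedged example of calling getApps for one of the newly supported providers; the integration auth document and token are placeholders and the import path is assumed.

// Illustrative call only; the integration auth document and token are placeholders.
import { getApps } from '../integrations/apps';
import { INTEGRATION_RENDER } from '../variables';

const integrationAuth = { integration: INTEGRATION_RENDER } as any; // placeholder document
const apps = await getApps({
  integrationAuth,
  accessToken: 'example-render-api-key'
});
// -> e.g. [{ name: 'my-service', appId: 'srv-abc123' }]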

View File

@ -1,10 +1,12 @@
import axios from 'axios';
import request from '../config/request';
import * as Sentry from '@sentry/node';
import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_AZURE_TOKEN_URL,
INTEGRATION_HEROKU_TOKEN_URL,
INTEGRATION_VERCEL_TOKEN_URL,
INTEGRATION_NETLIFY_TOKEN_URL,
@ -12,15 +14,27 @@ import {
} from '../variables';
import {
SITE_URL,
CLIENT_ID_AZURE,
CLIENT_ID_VERCEL,
CLIENT_ID_NETLIFY,
CLIENT_ID_GITHUB,
CLIENT_SECRET_AZURE,
CLIENT_SECRET_HEROKU,
CLIENT_SECRET_VERCEL,
CLIENT_SECRET_NETLIFY,
CLIENT_SECRET_GITHUB
} from '../config';
interface ExchangeCodeAzureResponse {
token_type: string;
scope: string;
expires_in: number;
ext_expires_in: number;
access_token: string;
refresh_token: string;
id_token: string;
}
interface ExchangeCodeHerokuResponse {
token_type: string;
access_token: string;
@ -75,6 +89,11 @@ const exchangeCode = async ({
try {
switch (integration) {
case INTEGRATION_AZURE_KEY_VAULT:
obj = await exchangeCodeAzure({
code
});
break;
case INTEGRATION_HEROKU:
obj = await exchangeCodeHeroku({
code
@ -105,6 +124,46 @@ const exchangeCode = async ({
return obj;
};
/**
* Return [accessToken], [refreshToken], and [accessExpiresAt] for Azure OAuth2 code-token exchange
* @param {Object} obj
* @param {String} obj.code - OAuth2 code to use in code-token exchange
*/
const exchangeCodeAzure = async ({
code
}: {
code: string;
}) => {
const accessExpiresAt = new Date();
let res: ExchangeCodeAzureResponse;
try {
res = (await request.post(
INTEGRATION_AZURE_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
scope: 'https://vault.azure.net/.default openid offline_access',
client_id: CLIENT_ID_AZURE,
client_secret: CLIENT_SECRET_AZURE,
redirect_uri: `${SITE_URL}/integrations/azure-key-vault/oauth2/callback`
} as any)
)).data;
accessExpiresAt.setSeconds(
accessExpiresAt.getSeconds() + res.expires_in
);
} catch (err: any) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Azure');
}
return ({
accessToken: res.access_token,
refreshToken: res.refresh_token,
accessExpiresAt
});
}
/**
* Return [accessToken], [accessExpiresAt], and [refreshToken] for Heroku
* OAuth2 code-token exchange
@ -116,36 +175,36 @@ const exchangeCode = async ({
* @returns {Date} obj2.accessExpiresAt - date of expiration for access token
*/
const exchangeCodeHeroku = async ({
code
code
}: {
code: string;
code: string;
}) => {
let res: ExchangeCodeHerokuResponse;
const accessExpiresAt = new Date();
try {
res = (await axios.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
client_secret: CLIENT_SECRET_HEROKU
} as any)
)).data;
accessExpiresAt.setSeconds(
accessExpiresAt.getSeconds() + res.expires_in
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Heroku');
}
return ({
accessToken: res.access_token,
refreshToken: res.refresh_token,
accessExpiresAt
});
let res: ExchangeCodeHerokuResponse;
const accessExpiresAt = new Date();
try {
res = (await request.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
client_secret: CLIENT_SECRET_HEROKU
} as any)
)).data;
accessExpiresAt.setSeconds(
accessExpiresAt.getSeconds() + res.expires_in
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Heroku');
}
return ({
accessToken: res.access_token,
refreshToken: res.refresh_token,
accessExpiresAt
});
}
/**
@ -162,20 +221,20 @@ const exchangeCodeVercel = async ({ code }: { code: string }) => {
let res: ExchangeCodeVercelResponse;
try {
res = (
await axios.post(
await request.post(
INTEGRATION_VERCEL_TOKEN_URL,
new URLSearchParams({
code: code,
client_id: CLIENT_ID_VERCEL,
client_secret: CLIENT_SECRET_VERCEL,
redirect_uri: `${SITE_URL}/vercel`
redirect_uri: `${SITE_URL}/integrations/vercel/oauth2/callback`
} as any)
)
).data;
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Vercel');
throw new Error(`Failed OAuth2 code-token exchange with Vercel [err=${err}]`);
}
return {
@ -201,26 +260,26 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
let accountId;
try {
res = (
await axios.post(
await request.post(
INTEGRATION_NETLIFY_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
client_id: CLIENT_ID_NETLIFY,
client_secret: CLIENT_SECRET_NETLIFY,
redirect_uri: `${SITE_URL}/netlify`
redirect_uri: `${SITE_URL}/integrations/netlify/oauth2/callback`
} as any)
)
).data;
const res2 = await axios.get('https://api.netlify.com/api/v1/sites', {
const res2 = await request.get('https://api.netlify.com/api/v1/sites', {
headers: {
Authorization: `Bearer ${res.access_token}`
}
});
const res3 = (
await axios.get('https://api.netlify.com/api/v1/accounts', {
await request.get('https://api.netlify.com/api/v1/accounts', {
headers: {
Authorization: `Bearer ${res.access_token}`
}
@ -255,15 +314,16 @@ const exchangeCodeGithub = async ({ code }: { code: string }) => {
let res: ExchangeCodeGithubResponse;
try {
res = (
await axios.get(INTEGRATION_GITHUB_TOKEN_URL, {
await request.get(INTEGRATION_GITHUB_TOKEN_URL, {
params: {
client_id: CLIENT_ID_GITHUB,
client_secret: CLIENT_SECRET_GITHUB,
code: code,
redirect_uri: `${SITE_URL}/github`
redirect_uri: `${SITE_URL}/integrations/github/oauth2/callback`
},
headers: {
Accept: 'application/json'
'Accept': 'application/json',
'Accept-Encoding': 'application/json'
}
})
).data;

View File

@ -1,13 +1,26 @@
import axios from 'axios';
import request from '../config/request';
import * as Sentry from '@sentry/node';
import { INTEGRATION_HEROKU } from '../variables';
import { INTEGRATION_AZURE_KEY_VAULT, INTEGRATION_HEROKU } from '../variables';
import {
CLIENT_SECRET_HEROKU
SITE_URL,
CLIENT_ID_AZURE,
CLIENT_SECRET_AZURE,
CLIENT_SECRET_HEROKU
} from '../config';
import {
INTEGRATION_HEROKU_TOKEN_URL
INTEGRATION_AZURE_TOKEN_URL,
INTEGRATION_HEROKU_TOKEN_URL
} from '../variables';
interface RefreshTokenAzureResponse {
token_type: string;
scope: string;
expires_in: number;
ext_expires_in: number;
access_token: string;
refresh_token: string;
}
/**
* Return new access token by exchanging refresh token [refreshToken] for integration
* named [integration]
@ -25,6 +38,11 @@ const exchangeRefresh = async ({
let accessToken;
try {
switch (integration) {
case INTEGRATION_AZURE_KEY_VAULT:
accessToken = await exchangeRefreshAzure({
refreshToken
});
break;
case INTEGRATION_HEROKU:
accessToken = await exchangeRefreshHeroku({
refreshToken
@ -40,6 +58,38 @@ const exchangeRefresh = async ({
return accessToken;
};
/**
* Return new access token by exchanging refresh token [refreshToken] for the
* Azure integration
* @param {Object} obj
* @param {String} obj.refreshToken - refresh token to use to get new access token for Azure
* @returns {String} accessToken - new access token for Azure
*/
const exchangeRefreshAzure = async ({
refreshToken
}: {
refreshToken: string;
}) => {
try {
const res: RefreshTokenAzureResponse = (await request.post(
INTEGRATION_AZURE_TOKEN_URL,
new URLSearchParams({
client_id: CLIENT_ID_AZURE,
scope: 'openid offline_access',
refresh_token: refreshToken,
grant_type: 'refresh_token',
client_secret: CLIENT_SECRET_AZURE
} as any)
)).data;
return res.access_token;
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to refresh OAuth2 access token for Azure');
}
}
/**
* Return new access token by exchanging refresh token [refreshToken] for the
* Heroku integration
@ -52,23 +102,23 @@ const exchangeRefreshHeroku = async ({
}: {
refreshToken: string;
}) => {
let accessToken;
//TODO: Refactor code to take advantage of using RequestError. It's possible to create new types of errors for more detailed errors
try {
const res = await axios.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'refresh_token',
refresh_token: refreshToken,
client_secret: CLIENT_SECRET_HEROKU
} as any)
);
let accessToken;
try {
const res = await request.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'refresh_token',
refresh_token: refreshToken,
client_secret: CLIENT_SECRET_HEROKU
} as any)
);
accessToken = res.data.access_token;
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed to get new OAuth2 access token for Heroku');
throw new Error('Failed to refresh OAuth2 access token for Heroku');
}
return accessToken;
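
A minimal usage sketch for the refresh flow above, assuming exchangeRefresh is exported from this module and that it lives at ../integrations/refresh (the path is an assumption); INTEGRATION_AZURE_KEY_VAULT comes from ../variables as shown in this diff:

import { exchangeRefresh } from '../integrations/refresh'; // module path assumed
import { INTEGRATION_AZURE_KEY_VAULT } from '../variables';

// exchange a stored (already decrypted) refresh token for a fresh Azure access token
const refreshAzureAccess = async (refreshToken: string) => {
  return await exchangeRefresh({
    integration: INTEGRATION_AZURE_KEY_VAULT,
    refreshToken
  });
};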

View File

@ -1,6 +1,11 @@
import axios from 'axios';
import * as Sentry from '@sentry/node';
import { IIntegrationAuth, IntegrationAuth, Integration } from '../models';
import {
IIntegrationAuth,
IntegrationAuth,
Integration,
Bot,
BotKey
} from '../models';
import {
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
@ -15,6 +20,7 @@ const revokeAccess = async ({
integrationAuth: IIntegrationAuth;
accessToken: string;
}) => {
let deletedIntegrationAuth;
try {
// add any integration-specific revocation logic
switch (integrationAuth.integration) {
@ -28,7 +34,7 @@ const revokeAccess = async ({
break;
}
const deletedIntegrationAuth = await IntegrationAuth.findOneAndDelete({
deletedIntegrationAuth = await IntegrationAuth.findOneAndDelete({
_id: integrationAuth._id
});
@ -42,6 +48,8 @@ const revokeAccess = async ({
Sentry.captureException(err);
throw new Error('Failed to delete integration authorization');
}
return deletedIntegrationAuth;
};
export { revokeAccess };

File diff suppressed because it is too large

View File

@ -1,7 +1,10 @@
import requireAuth from './requireAuth';
import requireMfaAuth from './requireMfaAuth';
import requireBotAuth from './requireBotAuth';
import requireSignupAuth from './requireSignupAuth';
import requireWorkspaceAuth from './requireWorkspaceAuth';
import requireMembershipAuth from './requireMembershipAuth';
import requireMembershipOrgAuth from './requireMembershipOrgAuth';
import requireOrganizationAuth from './requireOrganizationAuth';
import requireIntegrationAuth from './requireIntegrationAuth';
import requireIntegrationAuthorizationAuth from './requireIntegrationAuthorizationAuth';
@ -13,9 +16,12 @@ import validateRequest from './validateRequest';
export {
requireAuth,
requireMfaAuth,
requireBotAuth,
requireSignupAuth,
requireWorkspaceAuth,
requireMembershipAuth,
requireMembershipOrgAuth,
requireOrganizationAuth,
requireIntegrationAuth,
requireIntegrationAuthorizationAuth,

View File

@ -1,11 +1,12 @@
import { ErrorRequestHandler } from "express";
import * as Sentry from '@sentry/node';
import { InternalServerError } from "../utils/errors";
import { InternalServerError, UnauthorizedRequestError, UnprocessableEntityError } from "../utils/errors";
import { getLogger } from "../utils/logger";
import RequestError, { LogLevel } from "../utils/requestError";
import { NODE_ENV } from "../config";
import mongoose from "mongoose";
export const requestErrorHandler: ErrorRequestHandler = (error: RequestError | Error, req, res, next) => {
if (res.headersSent) return next();
@ -33,4 +34,17 @@ export const requestErrorHandler: ErrorRequestHandler = (error: RequestError | E
res.status((<RequestError>error).statusCode).json((<RequestError>error).format(req))
next()
}
export const handleMongoInvalidDataError = (err: any, req: any, res: any, next: any) => {
if (err instanceof mongoose.Error.ValidationError) {
const errors: any = {};
for (const field in err.errors) {
errors[field] = err.errors[field].message;
}
throw UnprocessableEntityError({ message: JSON.stringify(errors) })
} else {
next(err);
}
}
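
A sketch of how the two handlers above could be wired into the Express app, assuming the file path ./middleware/requestErrorHandler (an assumption) and that the Mongoose-specific handler is mounted before the generic one so validation errors are converted into 422 responses first:

import express from 'express';
import { handleMongoInvalidDataError, requestErrorHandler } from './middleware/requestErrorHandler'; // path assumed

const app = express();

// ...routes mounted here...

app.use(handleMongoInvalidDataError); // converts mongoose.Error.ValidationError into UnprocessableEntityError
app.use(requestErrorHandler);         // formats and logs every remaining RequestError / Error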

View File

@ -7,7 +7,6 @@ import {
getAuthSTDPayload,
getAuthAPIKeyPayload
} from '../helpers/auth';
import { BadRequestError } from '../utils/errors';
declare module 'jsonwebtoken' {
export interface UserIDJwtPayload extends jwt.JwtPayload {
@ -31,37 +30,28 @@ const requireAuth = ({
acceptedAuthModes: string[];
}) => {
return async (req: Request, res: Response, next: NextFunction) => {
const [AUTH_TOKEN_TYPE, AUTH_TOKEN_VALUE] = <[string, string]>req.headers['authorization']?.split(' ', 2) ?? [null, null]
if (AUTH_TOKEN_TYPE === null)
return next(BadRequestError({ message: `Missing Authorization Header in the request header.` }))
if (AUTH_TOKEN_TYPE.toLowerCase() !== 'bearer')
return next(BadRequestError({ message: `The provided authentication type '${AUTH_TOKEN_TYPE}' is not supported.` }))
if (AUTH_TOKEN_VALUE === null)
return next(BadRequestError({ message: 'Missing Authorization Body in the request header' }))
// validate auth token against
const authMode = validateAuthMode({
authTokenValue: AUTH_TOKEN_VALUE,
// validate auth token against accepted auth modes [acceptedAuthModes]
// and return token type [authTokenType] and value [authTokenValue]
const { authTokenType, authTokenValue } = validateAuthMode({
headers: req.headers,
acceptedAuthModes
});
if (!acceptedAuthModes.includes(authMode)) throw new Error('Failed to validate auth mode');
// attach auth payloads
switch (authMode) {
switch (authTokenType) {
case 'serviceToken':
req.serviceTokenData = await getAuthSTDPayload({
authTokenValue: AUTH_TOKEN_VALUE
authTokenValue
});
break;
case 'apiKey':
req.user = await getAuthAPIKeyPayload({
authTokenValue: AUTH_TOKEN_VALUE
authTokenValue
});
break;
default:
req.user = await getAuthUserPayload({
authTokenValue: AUTH_TOKEN_VALUE
authTokenValue
});
break;
}
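
A hedged example of a route guarded by the reworked requireAuth, assuming an Express router, the middleware index shown earlier in this diff, and that 'jwt' and 'serviceToken' are valid accepted auth modes; the route path and handler are placeholders:

import express from 'express';
import { requireAuth, validateRequest } from '../../middleware';

const router = express.Router();

router.get(
  '/example', // placeholder route
  requireAuth({
    acceptedAuthModes: ['jwt', 'serviceToken'] // validateAuthMode rejects any other token type
  }),
  validateRequest,
  (req, res) => res.json({ ok: true }) // placeholder handler
);

export default router;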

View File

@ -1,10 +1,12 @@
import * as Sentry from '@sentry/node';
import { Request, Response, NextFunction } from 'express';
import { IntegrationAuth } from '../models';
import { IntegrationAuth, IWorkspace } from '../models';
import { IntegrationService } from '../services';
import { validateMembership } from '../helpers/membership';
import { UnauthorizedRequestError } from '../utils/errors';
type req = 'params' | 'body' | 'query';
/**
* Validate if user on request is a member of workspace with proper roles associated
* with the integration authorization on request params.
@ -14,17 +16,20 @@ import { UnauthorizedRequestError } from '../utils/errors';
*/
const requireIntegrationAuthorizationAuth = ({
acceptedRoles,
attachAccessToken = true
attachAccessToken = true,
location = 'params'
}: {
acceptedRoles: string[];
attachAccessToken?: boolean;
location?: req;
}) => {
return async (req: Request, res: Response, next: NextFunction) => {
const { integrationAuthId } = req.params;
const { integrationAuthId } = req[location];
const integrationAuth = await IntegrationAuth.findOne({
_id: integrationAuthId
}).select(
})
.populate<{ workspace: IWorkspace }>('workspace')
.select(
'+refreshCiphertext +refreshIV +refreshTag +accessCiphertext +accessIV +accessTag +accessExpiresAt'
);
@ -34,15 +39,16 @@ const requireIntegrationAuthorizationAuth = ({
await validateMembership({
userId: req.user._id.toString(),
workspaceId: integrationAuth.workspace.toString(),
workspaceId: integrationAuth.workspace._id.toString(),
acceptedRoles
});
req.integrationAuth = integrationAuth;
if (attachAccessToken) {
req.accessToken = await IntegrationService.getIntegrationAuthAccess({
const access = await IntegrationService.getIntegrationAuthAccess({
integrationAuthId: integrationAuth._id.toString()
});
req.accessToken = access.accessToken;
}
return next();
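
A sketch of the new location option in use, in the style of the routers later in this diff; the route path is illustrative and ADMIN / MEMBER come from ../../variables:

import express from 'express';
import { body } from 'express-validator';
import {
  requireAuth,
  requireIntegrationAuthorizationAuth,
  validateRequest
} from '../../middleware';
import { ADMIN, MEMBER } from '../../variables';

const router = express.Router();

router.post(
  '/sync', // placeholder route
  requireAuth({ acceptedAuthModes: ['jwt'] }),
  requireIntegrationAuthorizationAuth({
    acceptedRoles: [ADMIN, MEMBER],
    location: 'body' // integrationAuthId is read from req.body instead of req.params
  }),
  body('integrationAuthId').exists().isString().trim(),
  validateRequest,
  (req, res) => res.sendStatus(200) // placeholder handler
);

export default router;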

View File

@ -0,0 +1,59 @@
import { Request, Response, NextFunction } from 'express';
import { UnauthorizedRequestError } from '../utils/errors';
import {
Membership,
} from '../models';
import { validateMembership } from '../helpers/membership';
type req = 'params' | 'body' | 'query';
/**
* Validate membership with id [membershipId] and that user with id
* [req.user._id] can modify that membership.
* @param {Object} obj
* @param {String[]} obj.acceptedRoles - accepted workspace roles
* @param {String[]} obj.location - location of [workspaceId] on request (e.g. params, body) for parsing
*/
const requireMembershipAuth = ({
acceptedRoles,
location = 'params'
}: {
acceptedRoles: string[];
location?: req;
}) => {
return async (
req: Request,
res: Response,
next: NextFunction
) => {
try {
const { membershipId } = req[location];
const membership = await Membership.findById(membershipId);
if (!membership) throw new Error('Failed to find target membership');
const userMembership = await Membership.findOne({
workspace: membership.workspace
});
if (!userMembership) throw new Error('Failed to validate own membership')
const targetMembership = await validateMembership({
userId: req.user._id.toString(),
workspaceId: membership.workspace.toString(),
acceptedRoles
});
req.targetMembership = targetMembership;
return next();
} catch (err) {
return next(UnauthorizedRequestError({
message: 'Unable to validate workspace membership'
}));
}
}
}
export default requireMembershipAuth;
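
A sketch of how this middleware could guard a membership route, mirroring the membership router later in this diff; the route and handler are placeholders and ADMIN comes from ../../variables:

import express from 'express';
import { param } from 'express-validator';
import { requireAuth, requireMembershipAuth, validateRequest } from '../../middleware';
import { ADMIN } from '../../variables';

const router = express.Router();

router.delete(
  '/:membershipId', // placeholder route
  requireAuth({ acceptedAuthModes: ['jwt'] }),
  requireMembershipAuth({
    acceptedRoles: [ADMIN],
    location: 'params' // membershipId is parsed from req.params
  }),
  param('membershipId').isMongoId().exists().trim(),
  validateRequest,
  (req, res) => res.sendStatus(200) // placeholder handler
);

export default router;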

View File

@ -0,0 +1,49 @@
import { Request, Response, NextFunction } from 'express';
import { UnauthorizedRequestError } from '../utils/errors';
import {
MembershipOrg
} from '../models';
import { validateMembership } from '../helpers/membershipOrg';
type req = 'params' | 'body' | 'query';
/**
* Validate (organization) membership id [membershipId] and that user with id
* [req.user._id] can modify that membership.
* @param {Object} obj
* @param {String[]} obj.acceptedRoles - accepted organization roles
* @param {String[]} obj.location - location of [membershipId] on request (e.g. params, body) for parsing
*/
const requireMembershipOrgAuth = ({
acceptedRoles,
location = 'params'
}: {
acceptedRoles: string[];
location?: req;
}) => {
return async (req: Request, res: Response, next: NextFunction) => {
try {
const { membershipId } = req[location];
const membershipOrg = await MembershipOrg.findById(membershipId);
if (!membershipOrg) throw new Error('Failed to find target organization membership');
const targetMembership = await validateMembership({
userId: req.user._id.toString(),
organizationId: membershipOrg.organization.toString(),
acceptedRoles
});
req.targetMembership = targetMembership;
return next();
} catch (err) {
return next(UnauthorizedRequestError({
message: 'Unable to validate organization membership'
}));
}
}
}
export default requireMembershipOrgAuth;

View File

@ -0,0 +1,43 @@
import jwt from 'jsonwebtoken';
import { Request, Response, NextFunction } from 'express';
import { User } from '../models';
import { JWT_MFA_SECRET } from '../config';
import { BadRequestError, UnauthorizedRequestError } from '../utils/errors';
declare module 'jsonwebtoken' {
export interface UserIDJwtPayload extends jwt.JwtPayload {
userId: string;
}
}
/**
* Validate if (MFA) JWT temporary token on request is valid (e.g. not expired)
* and if there is an associated user.
*/
const requireMfaAuth = async (
req: Request,
res: Response,
next: NextFunction
) => {
// JWT (temporary) authentication middleware for complete signup
const [ AUTH_TOKEN_TYPE, AUTH_TOKEN_VALUE ] = <[string, string]>req.headers['authorization']?.split(' ', 2) ?? [null, null]
if(AUTH_TOKEN_TYPE === null) return next(BadRequestError({message: `Missing Authorization Header in the request header.`}))
if(AUTH_TOKEN_TYPE.toLowerCase() !== 'bearer') return next(BadRequestError({message: `The provided authentication type '${AUTH_TOKEN_TYPE}' is not supported.`}))
if(AUTH_TOKEN_VALUE === null) return next(BadRequestError({message: 'Missing Authorization Body in the request header'}))
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(AUTH_TOKEN_VALUE, JWT_MFA_SECRET)
);
const user = await User.findOne({
_id: decodedToken.userId
}).select('+publicKey');
if (!user)
return next(UnauthorizedRequestError({message: 'Unable to authenticate for User account completion. Try logging in again'}))
req.user = user;
return next();
};
export default requireMfaAuth;

View File

@ -17,10 +17,10 @@ const requireServiceTokenDataAuth = ({
const serviceTokenData = await ServiceTokenData
.findById(req[location].serviceTokenDataId)
.select('+encryptedKey +iv +tag');
.select('+encryptedKey +iv +tag').populate('user');
if (!serviceTokenData) {
return next(AccountNotFoundError({message: 'Failed to locate service token data'}));
return next(AccountNotFoundError({ message: 'Failed to locate service token data' }));
}
if (req.user) {
@ -31,9 +31,9 @@ const requireServiceTokenDataAuth = ({
acceptedRoles
});
}
req.serviceTokenData = serviceTokenData;
next();
}
}

View File

@ -10,12 +10,13 @@ import MembershipOrg, { IMembershipOrg } from './membershipOrg';
import Organization, { IOrganization } from './organization';
import Secret, { ISecret } from './secret';
import ServiceToken, { IServiceToken } from './serviceToken';
import Token, { IToken } from './token';
import TokenData, { ITokenData } from './tokenData';
import User, { IUser } from './user';
import UserAction, { IUserAction } from './userAction';
import Workspace, { IWorkspace } from './workspace';
import ServiceTokenData, { IServiceTokenData } from './serviceTokenData';
import APIKeyData, { IAPIKeyData } from './apiKeyData';
import LoginSRPDetail, { ILoginSRPDetail } from './loginSRPDetail';
export {
BackupPrivateKey,
@ -42,8 +43,8 @@ export {
ISecret,
ServiceToken,
IServiceToken,
Token,
IToken,
TokenData,
ITokenData,
User,
IUser,
UserAction,
@ -53,5 +54,7 @@ export {
ServiceTokenData,
IServiceTokenData,
APIKeyData,
IAPIKeyData
IAPIKeyData,
LoginSRPDetail,
ILoginSRPDetail
};

View File

@ -1,25 +1,41 @@
import { Schema, model, Types } from 'mongoose';
import { Schema, model, Types } from "mongoose";
import {
ENV_DEV,
ENV_TESTING,
ENV_STAGING,
ENV_PROD,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB
} from '../variables';
INTEGRATION_GITHUB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
} from "../variables";
export interface IIntegration {
_id: Types.ObjectId;
workspace: Types.ObjectId;
environment: 'dev' | 'test' | 'staging' | 'prod';
environment: string;
isActive: boolean;
app: string;
target: string;
context: string;
siteId: string;
integration: 'heroku' | 'vercel' | 'netlify' | 'github';
owner: string;
targetEnvironment: string;
appId: string;
path: string;
region: string;
integration:
| 'azure-key-vault'
| 'aws-parameter-store'
| 'aws-secret-manager'
| 'heroku'
| 'vercel'
| 'netlify'
| 'github'
| 'render'
| 'flyio'
| 'circleci'
| 'travisci';
integrationAuth: Types.ObjectId;
}
@ -27,59 +43,77 @@ const integrationSchema = new Schema<IIntegration>(
{
workspace: {
type: Schema.Types.ObjectId,
ref: 'Workspace',
required: true
ref: "Workspace",
required: true,
},
environment: {
type: String,
enum: [ENV_DEV, ENV_TESTING, ENV_STAGING, ENV_PROD],
required: true
required: true,
},
isActive: {
type: Boolean,
required: true
required: true,
},
app: {
// name of app in provider
type: String,
default: null
default: null,
},
target: {
// vercel-specific target (environment)
appId: {
// (new)
// id of app in provider
type: String,
default: null,
},
targetEnvironment: {
// (new)
// target environment
type: String,
default: null,
},
owner: {
// github-specific repo owner-login
type: String,
default: null,
},
path: {
// aws-parameter-store-specific path
type: String,
default: null
},
context: {
// netlify-specific context (deploy)
type: String,
default: null
},
siteId: {
// netlify-specific site (app) id
region: {
// aws-specific region
type: String,
default: null
},
integration: {
type: String,
enum: [
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB
INTEGRATION_GITHUB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
],
required: true
required: true,
},
integrationAuth: {
type: Schema.Types.ObjectId,
ref: 'IntegrationAuth',
required: true
}
ref: "IntegrationAuth",
required: true,
},
},
{
timestamps: true
timestamps: true,
}
);
const Integration = model<IIntegration>('Integration', integrationSchema);
const Integration = model<IIntegration>("Integration", integrationSchema);
export default Integration;
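
A hedged sketch of creating an integration document with the widened schema above, assuming INTEGRATION_FLYIO from ../variables; the import path and app name are placeholders:

import { Types } from 'mongoose';
import Integration from '../models/integration'; // path assumed
import { INTEGRATION_FLYIO } from '../variables';

const createFlyioIntegration = async (workspaceId: string, integrationAuthId: string) =>
  await new Integration({
    workspace: new Types.ObjectId(workspaceId),
    environment: 'dev', // any workspace environment slug; no longer restricted to a fixed enum
    isActive: false,
    app: 'my-flyio-app', // placeholder app name in the provider
    integration: INTEGRATION_FLYIO,
    integrationAuth: new Types.ObjectId(integrationAuthId)
  }).save();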

View File

@ -1,20 +1,30 @@
import { Schema, model, Types } from 'mongoose';
import { Schema, model, Types } from "mongoose";
import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB
} from '../variables';
INTEGRATION_GITHUB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
} from "../variables";
export interface IIntegrationAuth {
_id: Types.ObjectId;
workspace: Types.ObjectId;
integration: 'heroku' | 'vercel' | 'netlify' | 'github';
integration: 'heroku' | 'vercel' | 'netlify' | 'github' | 'render' | 'flyio' | 'azure-key-vault' | 'circleci' | 'travisci' | 'aws-parameter-store' | 'aws-secret-manager';
teamId: string;
accountId: string;
refreshCiphertext?: string;
refreshIV?: string;
refreshTag?: string;
accessIdCiphertext?: string; // new
accessIdIV?: string; // new
accessIdTag?: string; // new
accessCiphertext?: string;
accessIV?: string;
accessTag?: string;
@ -25,62 +35,82 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
{
workspace: {
type: Schema.Types.ObjectId,
required: true
ref: "Workspace",
required: true,
},
integration: {
type: String,
enum: [
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB
INTEGRATION_GITHUB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
],
required: true
required: true,
},
teamId: {
// vercel-specific integration param
type: String
type: String,
},
accountId: {
// netlify-specific integration param
type: String
type: String,
},
refreshCiphertext: {
type: String,
select: false
select: false,
},
refreshIV: {
type: String,
select: false
select: false,
},
refreshTag: {
type: String,
select: false,
},
accessIdCiphertext: {
type: String,
select: false
},
accessIdIV: {
type: String,
select: false
},
accessIdTag: {
type: String,
select: false
},
accessCiphertext: {
type: String,
select: false
select: false,
},
accessIV: {
type: String,
select: false
select: false,
},
accessTag: {
type: String,
select: false
select: false,
},
accessExpiresAt: {
type: Date,
select: false
}
select: false,
},
},
{
timestamps: true
timestamps: true,
}
);
const IntegrationAuth = model<IIntegrationAuth>(
'IntegrationAuth',
"IntegrationAuth",
integrationAuthSchema
);

View File

@ -0,0 +1,29 @@
import mongoose, { Schema, model, Types } from 'mongoose';
export interface ILoginSRPDetail {
_id: Types.ObjectId;
clientPublicKey: string;
email: string;
serverBInt: mongoose.Schema.Types.Buffer;
expireAt: Date;
}
const loginSRPDetailSchema = new Schema<ILoginSRPDetail>(
{
clientPublicKey: {
type: String,
required: true
},
email: {
type: String,
required: true,
unique: true
},
serverBInt: { type: mongoose.Schema.Types.Buffer },
expireAt: { type: Date }
}
);
const LoginSRPDetail = model('LoginSRPDetail', loginSRPDetailSchema);
export default LoginSRPDetail;
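
The expireAt field above only cleans anything up if a TTL index backs it; whether this project adds one is not visible in this diff, so the line below is purely a hypothetical sketch that could sit after the schema definition:

// hypothetical TTL index: MongoDB removes each document once its expireAt passes
loginSRPDetailSchema.index({ expireAt: 1 }, { expireAfterSeconds: 0 });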

View File

@ -1,15 +1,21 @@
import { Schema, model, Types } from 'mongoose';
import { ADMIN, MEMBER } from '../variables';
export interface IMembershipPermission {
environmentSlug: string,
ability: string
}
export interface IMembership {
_id: Types.ObjectId;
user: Types.ObjectId;
inviteEmail?: string;
workspace: Types.ObjectId;
role: 'admin' | 'member';
deniedPermissions: IMembershipPermission[]
}
const membershipSchema = new Schema(
const membershipSchema = new Schema<IMembership>(
{
user: {
type: Schema.Types.ObjectId,
@ -23,6 +29,18 @@ const membershipSchema = new Schema(
ref: 'Workspace',
required: true
},
deniedPermissions: {
type: [
{
environmentSlug: String,
ability: {
type: String,
enum: ['read', 'write']
},
},
],
default: []
},
role: {
type: String,
enum: [ADMIN, MEMBER],

View File

@ -2,10 +2,6 @@ import { Schema, model, Types } from 'mongoose';
import {
SECRET_SHARED,
SECRET_PERSONAL,
ENV_DEV,
ENV_TESTING,
ENV_STAGING,
ENV_PROD
} from '../variables';
export interface ISecret {
@ -27,9 +23,10 @@ export interface ISecret {
secretCommentIV?: string;
secretCommentTag?: string;
secretCommentHash?: string;
tags?: string[];
}
const secretSchema = new Schema<ISecret>(
export const secretSchema = new Schema<ISecret>(
{
version: {
type: Number,
@ -51,9 +48,13 @@ const secretSchema = new Schema<ISecret>(
type: Schema.Types.ObjectId,
ref: 'User'
},
tags: {
ref: 'Tag',
type: [Schema.Types.ObjectId],
default: []
},
environment: {
type: String,
enum: [ENV_DEV, ENV_TESTING, ENV_STAGING, ENV_PROD],
required: true
},
secretKeyCiphertext: {
@ -108,6 +109,9 @@ const secretSchema = new Schema<ISecret>(
}
);
secretSchema.index({ tags: 1 }, { background: true })
const Secret = model<ISecret>('Secret', secretSchema);
export default Secret;
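
A sketch of the kind of query the new tags field and { tags: 1 } index serve, assuming Secret keeps its existing workspace field (not shown in this hunk) and Tag is the model added later in this diff; paths and helper name are assumptions:

import { Types } from 'mongoose';
import Secret from '../models/secret'; // paths assumed
import Tag from '../models/tag';

// fetch all secrets in a workspace carrying a given tag slug
const getSecretsByTagSlug = async (workspaceId: string, slug: string) => {
  const tag = await Tag.findOne({ workspace: new Types.ObjectId(workspaceId), slug });
  if (!tag) return [];
  return await Secret.find({
    workspace: new Types.ObjectId(workspaceId),
    tags: tag._id // served by the { tags: 1 } index added above
  });
};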

View File

@ -0,0 +1,121 @@
import mongoose, { Schema, model } from 'mongoose';
import Secret, { ISecret, secretSchema } from './secret';
export interface IRequestedChange {
_id: string
userId: mongoose.Types.ObjectId;
status: ApprovalStatus;
modifiedSecretDetails: ISecret,
modifiedSecretParentId: mongoose.Types.ObjectId,
type: string,
approvers: IApprover[]
merged: boolean
}
interface ISecretApprovalRequest {
environment: string;
workspace: mongoose.Types.ObjectId;
requestedChanges: IRequestedChange[];
requestedByUserId: mongoose.Types.ObjectId;
timestamp: Date;
requestType: ChangeType;
requestId: string;
}
export interface IApprover {
userId: mongoose.Types.ObjectId;
status: ApprovalStatus;
}
export enum ApprovalStatus {
PENDING = 'pending',
APPROVED = 'approved',
REJECTED = 'rejected'
}
export enum ChangeType {
UPDATE = 'update',
DELETE = 'delete',
CREATE = 'create'
}
const approverSchema = new mongoose.Schema({
userId: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User',
required: false,
},
status: {
type: String,
enum: Object.values(ApprovalStatus),
default: ApprovalStatus.PENDING
}
}, { timestamps: true });
// extend the Secret schema: copy all of its fields, then drop the _id and version fields
const SecretModificationSchema = new Schema({
...secretSchema.obj,
}, {
_id: false,
});
SecretModificationSchema.remove("version")
const requestedChangeSchema = new mongoose.Schema(
{
_id: { type: mongoose.Schema.Types.ObjectId, auto: true },
modifiedSecretDetails: SecretModificationSchema,
modifiedSecretParentId: { // used to fetch the current version of this secret for comparison
type: mongoose.Schema.Types.ObjectId,
ref: 'Secret'
},
type: {
type: String,
enum: ChangeType,
required: true
},
status: {
type: String,
enum: ApprovalStatus,
default: ApprovalStatus.PENDING // the overall status of the requested change
},
approvers: [approverSchema],
merged: {
type: Boolean,
default: false,
}
},
{ timestamps: true }
);
const secretApprovalRequestSchema = new Schema<ISecretApprovalRequest>(
{
environment: {
type: String, // environment the secret changes were requested for
ref: 'Secret'
},
workspace: {
type: mongoose.Schema.Types.ObjectId, // workspace id of the secret
ref: 'Workspace'
},
requestedChanges: [requestedChangeSchema], // the changes that the requesting user wants to make to the existing secret
requestedByUserId: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User'
},
requestId: {
type: String,
}
},
{
timestamps: true
}
);
secretApprovalRequestSchema.index({ 'requestedChanges.approvers.userId': 1 });
const SecretApprovalRequest = model<ISecretApprovalRequest>('secret_approval_request', secretApprovalRequestSchema);
export default SecretApprovalRequest;
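
A hedged sketch of creating a secret approval request against the schema above; ids are placeholders, the modifiedSecretDetails payload is elided, and the import path is an assumption:

import { Types } from 'mongoose';
import SecretApprovalRequest, { ApprovalStatus, ChangeType } from '../models/secretApprovalRequest'; // path assumed

const createUpdateRequest = async (
  workspaceId: string,
  environment: string,
  requesterId: string,
  approverIds: string[],
  modifiedSecretDetails: any, // ISecret-shaped payload, elided here
  modifiedSecretParentId: string
) =>
  await new SecretApprovalRequest({
    workspace: new Types.ObjectId(workspaceId),
    environment,
    requestedByUserId: new Types.ObjectId(requesterId),
    requestedChanges: [
      {
        modifiedSecretDetails,
        modifiedSecretParentId: new Types.ObjectId(modifiedSecretParentId),
        type: ChangeType.UPDATE,
        status: ApprovalStatus.PENDING,
        approvers: approverIds.map((id) => ({
          userId: new Types.ObjectId(id),
          status: ApprovalStatus.PENDING
        })),
        merged: false
      }
    ]
  }).save();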

View File

@ -1,7 +1,4 @@
import { Schema, model, Types } from 'mongoose';
import { ENV_DEV, ENV_TESTING, ENV_STAGING, ENV_PROD } from '../variables';
// TODO: deprecate
export interface IServiceToken {
_id: Types.ObjectId;
name: string;
@ -33,7 +30,6 @@ const serviceTokenSchema = new Schema<IServiceToken>(
},
environment: {
type: String,
enum: [ENV_DEV, ENV_TESTING, ENV_STAGING, ENV_PROD],
required: true
},
expiresAt: {

backend/src/models/tag.ts Normal file
View File

@ -0,0 +1,49 @@
import { Schema, model, Types } from 'mongoose';
export interface ITag {
_id: Types.ObjectId;
name: string;
slug: string;
user: Types.ObjectId;
workspace: Types.ObjectId;
}
const tagSchema = new Schema<ITag>(
{
name: {
type: String,
required: true,
trim: true,
},
slug: {
type: String,
required: true,
trim: true,
lowercase: true,
validate: [
function (value: any) {
return value.indexOf(' ') === -1;
},
'slug cannot contain spaces'
]
},
user: {
type: Schema.Types.ObjectId,
ref: 'User'
},
workspace: {
type: Schema.Types.ObjectId,
ref: 'Workspace'
},
},
{
timestamps: true
}
);
tagSchema.index({ slug: 1, workspace: 1 }, { unique: true })
tagSchema.index({ workspace: 1 })
const Tag = model<ITag>('Tag', tagSchema);
export default Tag;
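
A sketch showing the effect of the compound unique index above: creating the same slug twice in one workspace is rejected with a duplicate-key error (code 11000); the helper name, values, and import path are placeholders:

import { Types } from 'mongoose';
import Tag from '../models/tag'; // path assumed

const createTag = async (workspaceId: string, userId: string, name: string, slug: string) => {
  try {
    return await new Tag({
      name,
      slug, // lowercased and must not contain spaces, per the schema validator
      user: new Types.ObjectId(userId),
      workspace: new Types.ObjectId(workspaceId)
    }).save();
  } catch (err: any) {
    if (err?.code === 11000) throw new Error('Tag slug already exists in this workspace');
    throw err;
  }
};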

View File

@ -5,6 +5,7 @@ export interface IToken {
email: string;
token: string;
createdAt: Date;
ttl: number;
}
const tokenSchema = new Schema<IToken>({
@ -19,14 +20,13 @@ const tokenSchema = new Schema<IToken>({
createdAt: {
type: Date,
default: Date.now
},
ttl: {
type: Number,
}
});
tokenSchema.index({
createdAt: 1
}, {
expireAfterSeconds: parseInt(EMAIL_TOKEN_LIFETIME)
});
tokenSchema.index({ email: 1 });
const Token = model<IToken>('Token', tokenSchema);

View File

@ -0,0 +1,55 @@
import { Schema, Types, model } from 'mongoose';
export interface ITokenData {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
tokenHash: string;
triesLeft?: number;
expiresAt: Date;
createdAt: Date;
updatedAt: Date;
}
const tokenDataSchema = new Schema<ITokenData>({
type: {
type: String,
enum: [
'emailConfirmation',
'emailMfa',
'organizationInvitation',
'passwordReset'
],
required: true
},
email: {
type: String
},
phoneNumber: {
type: String
},
organization: { // organizationInvitation-specific field
type: Schema.Types.ObjectId,
ref: 'Organization'
},
tokenHash: {
type: String,
select: false,
required: true
},
triesLeft: {
type: Number
},
expiresAt: {
type: Date,
expires: 0,
required: true
}
}, {
timestamps: true
});
const TokenData = model<ITokenData>('TokenData', tokenDataSchema);
export default TokenData;
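
A sketch of issuing an email confirmation token with the schema above; the hashing scheme, tries count, and lifetime are assumptions, and the expires: 0 option on expiresAt means MongoDB removes the document once that date passes:

import crypto from 'crypto';
import TokenData from '../models/tokenData'; // path assumed

const createEmailConfirmationToken = async (email: string) => {
  const rawToken = crypto.randomBytes(16).toString('hex');
  const tokenHash = crypto.createHash('sha256').update(rawToken).digest('hex'); // hashing scheme assumed

  await new TokenData({
    type: 'emailConfirmation',
    email,
    tokenHash,
    triesLeft: 3, // placeholder
    expiresAt: new Date(Date.now() + 1000 * 60 * 60) // placeholder lifetime: 1 hour
  }).save();

  return rawToken; // the raw token is what gets emailed; only the hash is stored
};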

View File

@ -1,10 +1,14 @@
import { Schema, model, Types } from 'mongoose';
import { Schema, model, Types, Document } from 'mongoose';
export interface IUser {
export interface IUser extends Document {
_id: Types.ObjectId;
email: string;
firstName?: string;
lastName?: string;
encryptionVersion: number;
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
publicKey?: string;
encryptedPrivateKey?: string;
iv?: string;
@ -12,6 +16,12 @@ export interface IUser {
salt?: string;
verifier?: string;
refreshVersion?: number;
isMfaEnabled: boolean;
mfaMethods: string[];
devices: {
ip: string;
userAgent: string;
}[];
}
const userSchema = new Schema<IUser>(
@ -26,6 +36,23 @@ const userSchema = new Schema<IUser>(
lastName: {
type: String
},
encryptionVersion: {
type: Number,
select: false,
default: 1 // to resolve backward-compatibility issues
},
protectedKey: { // introduced as part of encryption version 2
type: String,
select: false
},
protectedKeyIV: { // introduced as part of encryption version 2
type: String,
select: false
},
protectedKeyTag: { // introduced as part of encryption version 2
type: String,
select: false
},
publicKey: {
type: String,
select: false
@ -34,11 +61,11 @@ const userSchema = new Schema<IUser>(
type: String,
select: false
},
iv: {
iv: { // iv of [encryptedPrivateKey]
type: String,
select: false
},
tag: {
tag: { // tag of [encryptedPrivateKey]
type: String,
select: false
},
@ -54,8 +81,22 @@ const userSchema = new Schema<IUser>(
type: Number,
default: 0,
select: false
},
isMfaEnabled: {
type: Boolean,
default: false
},
mfaMethods: [{
type: String
}],
devices: {
type: [{
ip: String,
userAgent: String
}],
default: []
}
},
},
{
timestamps: true
}
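
Because the new encryptionVersion and protectedKey* fields are marked select: false above, they have to be requested explicitly; a minimal sketch, assuming the User model exported from this file and an assumed import path:

import User from '../models/user'; // path assumed

// fetch a user together with the v2 protected-key material (hidden by default via select: false)
const getUserWithProtectedKey = async (email: string) =>
  await User.findOne({ email }).select(
    '+encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag'
  );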

View File

@ -1,21 +1,74 @@
import { Schema, model, Types } from 'mongoose';
import mongoose, { Schema, model, Types } from 'mongoose';
export interface DesignatedApprovers {
environment: string,
approvers: mongoose.Types.ObjectId[]
}
export interface IWorkspace {
_id: Types.ObjectId;
name: string;
organization: Types.ObjectId;
approvers: DesignatedApprovers[];
environments: Array<{
name: string;
slug: string;
}>;
autoCapitalization: boolean;
}
const approverSchema = new mongoose.Schema({
userId: {
type: Schema.Types.ObjectId,
ref: 'User',
},
environment: {
type: String
}
}, { _id: false });
const workspaceSchema = new Schema<IWorkspace>({
name: {
type: String,
required: true
},
autoCapitalization: {
type: Boolean,
default: true,
},
approvers: [approverSchema],
organization: {
type: Schema.Types.ObjectId,
ref: 'Organization',
required: true
}
},
environments: {
type: [
{
name: String,
slug: String,
},
],
default: [
{
name: "Development",
slug: "dev"
},
{
name: "Test",
slug: "test"
},
{
name: "Staging",
slug: "staging"
},
{
name: "Production",
slug: "prod"
}
],
},
});
const Workspace = model<IWorkspace>('Workspace', workspaceSchema);

View File

@ -3,22 +3,22 @@ const router = express.Router();
import { body } from 'express-validator';
import { requireAuth, validateRequest } from '../../middleware';
import { authController } from '../../controllers/v1';
import { loginLimiter } from '../../helpers/rateLimiter';
import { authLimiter } from '../../helpers/rateLimiter';
router.post('/token', validateRequest, authController.getNewToken);
router.post(
router.post( // deprecated (moved to api/v2/auth/login1)
'/login1',
loginLimiter,
authLimiter,
body('email').exists().trim().notEmpty(),
body('clientPublicKey').exists().trim().notEmpty(),
validateRequest,
authController.login1
);
router.post(
router.post( // deprecated (moved to api/v2/auth/login2)
'/login2',
loginLimiter,
authLimiter,
body('email').exists().trim().notEmpty(),
body('clientProof').exists().trim().notEmpty(),
validateRequest,
@ -27,11 +27,13 @@ router.post(
router.post(
'/logout',
authLimiter,
requireAuth({
acceptedAuthModes: ['jwt']
}),
authController.logout
);
router.post(
'/checkAuth',
requireAuth({

View File

@ -31,7 +31,7 @@ router.patch(
requireBotAuth({
acceptedRoles: [ADMIN, MEMBER]
}),
body('isActive').isBoolean(),
body('isActive').exists().isBoolean(),
body('botKey'),
validateRequest,
botController.setBotActiveState

View File

@ -15,6 +15,7 @@ import password from './password';
import stripe from './stripe';
import integration from './integration';
import integrationAuth from './integrationAuth';
import secretApprovalRequest from './secretApprovalsRequest'
export {
signup,
@ -33,5 +34,6 @@ export {
password,
stripe,
integration,
integrationAuth
integrationAuth,
secretApprovalRequest
};

View File

@ -3,12 +3,35 @@ const router = express.Router();
import {
requireAuth,
requireIntegrationAuth,
requireIntegrationAuthorizationAuth,
validateRequest
} from '../../middleware';
import { ADMIN, MEMBER } from '../../variables';
import { body, param } from 'express-validator';
import { integrationController } from '../../controllers/v1';
router.post( // new: add new integration for integration auth
'/',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireIntegrationAuthorizationAuth({
acceptedRoles: [ADMIN, MEMBER],
location: 'body'
}),
body('integrationAuthId').exists().isString().trim(),
body('app').trim(),
body('isActive').exists().isBoolean(),
body('appId').trim(),
body('sourceEnvironment').trim(),
body('targetEnvironment').trim(),
body('owner').trim(),
body('path').trim(),
body('region').trim(),
validateRequest,
integrationController.createIntegration
);
router.patch(
'/:integrationId',
requireAuth({
@ -18,12 +41,12 @@ router.patch(
acceptedRoles: [ADMIN, MEMBER]
}),
param('integrationId').exists().trim(),
body('isActive').exists().isBoolean(),
body('app').exists().trim(),
body('environment').exists().trim(),
body('isActive').exists().isBoolean(),
body('target').exists(),
body('context').exists(),
body('siteId').exists(),
body('appId').exists(),
body('targetEnvironment').exists(),
body('owner').exists(),
validateRequest,
integrationController.updateIntegration
);

View File

@ -18,6 +18,19 @@ router.get(
integrationAuthController.getIntegrationOptions
);
router.get(
'/:integrationAuthId',
requireAuth({
acceptedAuthModes: ['jwt']
}),
requireIntegrationAuthorizationAuth({
acceptedRoles: [ADMIN, MEMBER]
}),
param('integrationAuthId'),
validateRequest,
integrationAuthController.getIntegrationAuth
);
router.post(
'/oauth-token',
requireAuth({
@ -34,6 +47,23 @@ router.post(
integrationAuthController.oAuthExchange
);
router.post(
'/access-token',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
location: 'body'
}),
body('workspaceId').exists().trim().notEmpty(),
body('accessId').trim(),
body('accessToken').exists().trim().notEmpty(),
body('integration').exists().trim().notEmpty(),
validateRequest,
integrationAuthController.saveIntegrationAccessToken
);
router.get(
'/:integrationAuthId/apps',
requireAuth({

View File

@ -3,12 +3,15 @@ const router = express.Router();
import { body, param } from 'express-validator';
import { requireAuth, validateRequest } from '../../middleware';
import { membershipController } from '../../controllers/v1';
import { membershipController as EEMembershipControllers } from '../../ee/controllers/v1';
router.get( // used for CLI (deprecate)
// note: ALL DEPRECATED (moved to api/v2/workspace/:workspaceId/memberships/:membershipId)
router.get( // used for old CLI (deprecate)
'/:workspaceId/connect',
requireAuth({
acceptedAuthModes: ['jwt']
}),
acceptedAuthModes: ['jwt']
}),
param('workspaceId').exists().trim(),
validateRequest,
membershipController.validateMembership
@ -17,8 +20,8 @@ router.get( // used for CLI (deprecate)
router.delete(
'/:membershipId',
requireAuth({
acceptedAuthModes: ['jwt']
}),
acceptedAuthModes: ['jwt']
}),
param('membershipId').exists().trim(),
validateRequest,
membershipController.deleteMembership
@ -27,11 +30,22 @@ router.delete(
router.post(
'/:membershipId/change-role',
requireAuth({
acceptedAuthModes: ['jwt']
}),
acceptedAuthModes: ['jwt']
}),
body('role').exists().trim(),
validateRequest,
membershipController.changeMembershipRole
);
router.post(
'/:membershipId/deny-permissions',
requireAuth({
acceptedAuthModes: ['jwt']
}),
param('membershipId').isMongoId().exists().trim(),
body('permissions').isArray().exists(),
validateRequest,
EEMembershipControllers.denyMembershipPermissions
);
export default router;

View File

@ -9,7 +9,7 @@ import {
import { OWNER, ADMIN, MEMBER, ACCEPTED } from '../../variables';
import { organizationController } from '../../controllers/v1';
router.get(
router.get( // deprecated (moved to api/v2/users/me/organizations)
'/',
requireAuth({
acceptedAuthModes: ['jwt']
@ -41,7 +41,7 @@ router.get(
organizationController.getOrganization
);
router.get(
router.get( // deprecated (moved to api/v2/organizations/:organizationId/memberships)
'/:organizationId/users',
requireAuth({
acceptedAuthModes: ['jwt']
@ -56,7 +56,7 @@ router.get(
);
router.get(
'/:organizationId/my-workspaces',
'/:organizationId/my-workspaces', // deprecated (moved to api/v2/organizations/:organizationId/workspaces)
requireAuth({
acceptedAuthModes: ['jwt']
}),
@ -156,4 +156,19 @@ router.get(
organizationController.getOrganizationSubscriptions
);
router.get(
'/:organizationId/workspace-memberships',
requireAuth({
acceptedAuthModes: ['jwt']
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED]
}),
param('organizationId').exists().trim(),
validateRequest,
organizationController.getOrganizationMembersAndTheirWorkspaces
);
export default router;

View File

@ -10,7 +10,7 @@ router.post(
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('clientPublicKey').exists().trim().notEmpty(),
body('clientPublicKey').exists().isString().trim().notEmpty(),
validateRequest,
passwordController.srp1
);
@ -22,11 +22,14 @@ router.post(
acceptedAuthModes: ['jwt']
}),
body('clientProof').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().trim().notEmpty().notEmpty(), // private key encrypted under new pwd
body('iv').exists().trim().notEmpty(), // new iv for private key
body('tag').exists().trim().notEmpty(), // new tag for private key
body('salt').exists().trim().notEmpty(), // part of new pwd
body('verifier').exists().trim().notEmpty(), // part of new pwd
body('protectedKey').exists().isString().trim().notEmpty(),
body('protectedKeyIV').exists().isString().trim().notEmpty(),
body('protectedKeyTag').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // private key encrypted under new pwd
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(), // new iv for private key
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(), // new tag for private key
body('salt').exists().isString().trim().notEmpty(), // part of new pwd
body('verifier').exists().isString().trim().notEmpty(), // part of new pwd
validateRequest,
passwordController.changePassword
);
@ -34,7 +37,7 @@ router.post(
router.post(
'/email/password-reset',
passwordLimiter,
body('email').exists().trim().notEmpty(),
body('email').exists().isString().trim().notEmpty().isEmail(),
validateRequest,
passwordController.emailPasswordReset
);
@ -42,8 +45,8 @@ router.post(
router.post(
'/email/password-reset-verify',
passwordLimiter,
body('email').exists().trim().notEmpty().isEmail(),
body('code').exists().trim().notEmpty(),
body('email').exists().isString().trim().notEmpty().isEmail(),
body('code').exists().isString().trim().notEmpty(),
validateRequest,
passwordController.emailPasswordResetVerify
);
@ -61,12 +64,12 @@ router.post(
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('clientProof').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().trim().notEmpty(), // (backup) private key encrypted under a strong key
body('iv').exists().trim().notEmpty(), // new iv for (backup) private key
body('tag').exists().trim().notEmpty(), // new tag for (backup) private key
body('salt').exists().trim().notEmpty(), // salt generated from strong key
body('verifier').exists().trim().notEmpty(), // salt generated from strong key
body('clientProof').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // (backup) private key encrypted under a strong key
body('iv').exists().isString().trim().notEmpty(), // new iv for (backup) private key
body('tag').exists().isString().trim().notEmpty(), // new tag for (backup) private key
body('salt').exists().isString().trim().notEmpty(), // salt generated from strong key
body('verifier').exists().isString().trim().notEmpty(), // salt generated from strong key
validateRequest,
passwordController.createBackupPrivateKey
);
@ -74,11 +77,14 @@ router.post(
router.post(
'/password-reset',
requireSignupAuth,
body('encryptedPrivateKey').exists().trim().notEmpty(), // private key encrypted under new pwd
body('iv').exists().trim().notEmpty(), // new iv for private key
body('tag').exists().trim().notEmpty(), // new tag for private key
body('salt').exists().trim().notEmpty(), // part of new pwd
body('verifier').exists().trim().notEmpty(), // part of new pwd
body('protectedKey').exists().isString().trim().notEmpty(),
body('protectedKeyIV').exists().isString().trim().notEmpty(),
body('protectedKeyTag').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // private key encrypted under new pwd
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(), // new iv for private key
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(), // new tag for private key
body('salt').exists().isString().trim().notEmpty(), // part of new pwd
body('verifier').exists().isString().trim().notEmpty(), // part of new pwd
validateRequest,
passwordController.resetPassword
);

Some files were not shown because too many files have changed in this diff.