Compare commits


190 Commits

Author SHA1 Message Date
625fa0725e add approvals-needed and sent api 2023-02-26 19:05:36 -05:00
e32cb8c24f add index for requestedChanges.approvers.userId 2023-02-26 19:04:44 -05:00
de724d2804 improve add/remove approvers into workspace 2023-02-26 18:29:35 -05:00
42d94521a4 extend Secrets schema in requestedChangeSchema 2023-02-26 15:15:41 -05:00
203a603769 complete merge API and improve reject/approve api 2023-02-26 15:15:41 -05:00
e1a88b2d1a add method to check required secret fields 2023-02-26 15:15:41 -05:00
53237dd52c merge approval requests 2023-02-26 15:15:41 -05:00
6a906b17ad add approvals request reject, approve and merge api 2023-02-26 15:15:41 -05:00
f38a364d3b add approve request api 2023-02-26 15:15:41 -05:00
4749e243bb get approvals by requester & improve create request 2023-02-26 15:15:41 -05:00
eb055e8b16 create approval 2023-02-26 15:15:41 -05:00
0e17c9a6db Merge pull request #383 from 5h4k4r/use-multi-stage-build
Builds Docker image of backend component with multi-stage Dockerfile
2023-02-26 10:50:56 -05:00
1c4dd78dea use node alpine base image 2023-02-26 10:49:01 -05:00
23418b3a09 Builds Docker image of backend component with multi-stage Dockerfile
Updates package.json `npm start` command
Delete unnecessary `npm run start` commands

Signed-off-by: Shakar <5h4k4r.b4kr@gmail.com>
2023-02-26 14:50:49 +03:00
0f143adbde Merge pull request #377 from caioluis/main
feat(webui localization): add support for pt-PT
2023-02-25 12:01:51 -08:00
1f3f4b7900 Merge pull request #353 from Grraahaam/chore/helm-docs
chore(docs): helm charts + local cluster setup script
2023-02-25 13:25:47 -05:00
2c5f26380e Merge pull request #373 from jon4hz/default-env
fix: properly support default environment
2023-02-25 13:08:00 -05:00
8f974fb087 Add docs for default environment 2023-02-25 13:07:07 -05:00
a0722b4ca5 Merge pull request #372 from jon4hz/format-cfg
fix: pretty print workspace file
2023-02-25 12:42:46 -05:00
41e039578a Merge pull request #374 from jon4hz/shell
fix: always execute cmd in subshell
2023-02-25 12:39:08 -05:00
c89e8e8a96 Merge pull request #375 from jon4hz/workspace
feat: search for workspace config in parent dir
2023-02-25 11:27:34 -05:00
cac83ab927 add debug log to workspace path location 2023-02-25 11:25:49 -05:00
0f0b894363 Revert "feat(webui-localization): fix and update pt-br copies"
I was dizzy and forgot to make a branch

This reverts commit 43f9af1bc6d27b0e6fc477535f5a17d963df6b51.

Signed-off-by: Caio Gomes <ocaioluis@gmail.com>
2023-02-25 08:05:43 +00:00
43f9af1bc6 feat(webui-localization): fix and update pt-br copies 2023-02-25 08:04:02 +00:00
f5ed14c84c fix(localization): add pt-PT to the options and fix a copy 2023-02-25 07:22:34 +00:00
2dd57d7c73 feat(localization): add locales for pt-PT 2023-02-25 06:53:02 +00:00
0b1891b64a Merge pull request #371 from jon4hz/env-filter
Improve env filter
2023-02-24 23:12:20 -05:00
5614b0f58a nit: method name change 2023-02-24 22:21:20 -05:00
3bb178976d Remove auto delete index 2023-02-24 21:57:13 -05:00
1777f98aef feat: search for workspace config in parent dir 2023-02-25 01:25:39 +01:00
45e3706335 fix: always execute cmd in subshell 2023-02-25 00:19:58 +01:00
337ed1fc46 fix: properly support default environment 2023-02-24 23:20:30 +01:00
d1ea76e5a0 fix: pretty print workspace file 2023-02-24 22:28:28 +01:00
4a72d725b1 fix: use function to get secrets by key 2023-02-24 22:03:50 +01:00
1693db3199 fix: preallocate map size 2023-02-24 22:03:13 +01:00
1ff42991b3 fix: improve filtering of reserved env vars 2023-02-24 21:57:48 +01:00
978423ba5b Merge pull request #364 from alexdanilowicz/patch-3
chore(docs): Update README MIT License badge link
2023-02-24 11:41:28 -08:00
4d0dc0d7b7 Update README.md 2023-02-24 11:40:03 -08:00
3817e666a9 Merge pull request #365 from Infisical/travisci
Add Travis CI Docs
2023-02-24 16:44:15 +07:00
b61350f6a4 Add Travis CI docs, minor edits 2023-02-24 16:37:26 +07:00
0fb1a1dc6f Merge remote-tracking branch 'origin' into travisci 2023-02-24 16:08:53 +07:00
9eefc87b7a Merge pull request #350 from Aashish-Upadhyay-101/Aashish-Upadhyay-101/TravisCI-integration
Feat: travis ci integration
2023-02-24 16:08:19 +07:00
53d35757ee Make minor changes to TravisCI sync, faster, reliable 2023-02-24 15:59:12 +07:00
e80e8e00b1 Replace axios with request 2023-02-24 15:31:46 +07:00
0b08e574c7 Merge remote-tracking branch 'refs/remotes/origin/main' into Aashish-Upadhyay-101/TravisCI-integration 2023-02-24 13:36:35 +05:45
499323d0e3 Sync travis-ci 2023-02-24 13:36:17 +05:45
89ad2f163a chore(docs): README
Another small README typo that I noticed. Links to a different repo.
2023-02-23 23:27:28 -08:00
7f04617b7d update brew command to update cli 2023-02-23 19:37:19 -05:00
44904628bc Merge pull request #360 from bngmnn/main
add german as readme language
2023-02-23 15:10:44 -08:00
fafde7b1ad update other languages' readme files with 'de' flag 2023-02-23 23:56:48 +01:00
7e65314670 add german readme 2023-02-23 23:56:17 +01:00
df52c56e83 patch login bug in cli 2023-02-23 14:14:41 -05:00
4276fb54cc Add docs for git branch mapping 2023-02-23 13:07:18 -05:00
bb5a0db79c Merge pull request #359 from Infisical/axios-retry
Add axios-retry for Vercel integration for now
2023-02-24 00:59:29 +07:00
b906048ea1 Add axios-retry for Vercel integration for now 2023-02-24 00:52:43 +07:00
7ce9c816c5 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-23 21:38:21 +07:00
3fef6e4849 Replace Promise.all with for-await Vercel getting envars 2023-02-23 21:38:11 +07:00
e7ce1e36e7 Update README.md 2023-02-23 06:25:48 -08:00
734c915206 Merge pull request #358 from alisson-acioli/main
Readme translation to PT-BR.
2023-02-23 06:23:35 -08:00
783174adc6 Add for-await for better Vercel integration reliability 2023-02-23 20:21:17 +07:00
d769db7668 integrações world 2023-02-23 09:52:58 -03:00
00e532fce4 remove english 2023-02-23 09:52:08 -03:00
7cf8cba54b remove word 2023-02-23 09:51:30 -03:00
70b26811d9 Remove word API 2023-02-23 09:50:13 -03:00
e7aafecbc2 Update language readme options 2023-02-23 09:49:03 -03:00
949fb052cd Readme PT-BR 2023-02-23 09:45:58 -03:00
fcb1f5a51b Merge pull request #357 from Infisical/vercel-integration-patch
Patch Vercel case where secrets can be of type plain and sensitive
2023-02-23 16:53:44 +07:00
e24f70b891 Patch Vercel case where secrets can be of type plain and sensitive 2023-02-23 16:47:21 +07:00
bd233ebe9b allow git branch mapping to env 2023-02-23 00:00:56 -05:00
f92269f2ec Merge pull request #274 from Grraahaam/feat/pr-template
feat(docs): added a base pull request template
2023-02-23 11:16:19 +07:00
2143db5eb5 Update messaging 2023-02-22 17:21:44 -08:00
0c72f50b5e Updated readme's 2023-02-22 16:29:53 -08:00
3c4c616242 update 2fa prompt in cli 2023-02-22 18:52:09 -05:00
153baad49f Merge pull request #349 from ImBIOS/docs/translation-indonesia
translate: add Bahasa Indonesia for README
2023-02-22 15:22:17 -08:00
75a2ab636c Merge pull request #354 from akhilmhdh/feat/table-loader
Table loading state and empty states
2023-02-22 14:57:23 -08:00
05a77e612c Minor style updates 2023-02-22 14:54:59 -08:00
d02bc06dce Merge pull request #355 from alexdanilowicz/patch-2
docs: fix nit typos in README
2023-02-22 14:54:53 -05:00
e1f88f1a7b docs: fix nit typos in README
Fix 'development' typo in sub header of README and do not hyphenate use-cases.
2023-02-22 11:51:41 -08:00
86a2647134 feat(ui): added table skeleton and loading for settings page,
fix(ui): resolved missing loading state in add new member and whitespace in project settings page
2023-02-22 23:29:26 +05:30
621b640af4 feat(ui): added new components empty state and skeleton 2023-02-22 23:27:36 +05:30
40c80f417c chore(script): added warning 2023-02-22 10:30:33 +01:00
7bb2c1c278 fix(script): auto generate secrets at runtime 2023-02-22 10:25:08 +01:00
a5278affe6 chore(docs): improved charts related documentation 2023-02-22 10:18:22 +01:00
2f953192d6 feat(script): kind local development setup 2023-02-22 10:18:22 +01:00
af64582efd Merge pull request #352 from Infisical/socket-labs-smtp
Add support and docs for SocketLabs email SMTP
2023-02-22 15:47:53 +07:00
6ad70f24a2 Add support and docs for SocketLabs email SMTP 2023-02-22 15:44:47 +07:00
8bf8968588 Other frontend configurations for travis-ci 2023-02-22 13:36:22 +05:45
7e9ce0360a Create.tsx for travis-ci 2023-02-22 13:26:10 +05:45
1d35c41dcb Authorize.tsx for travis-ci 2023-02-22 13:20:34 +05:45
824315f773 model setup for travis-ci 2023-02-22 13:14:06 +05:45
8a74799d64 variable setup for travis-ci 2023-02-22 12:52:42 +05:45
f0f6e8a988 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-02-22 12:29:49 +05:45
89bc9a823c translate: add Bahasa Indonesia for README 2023-02-22 12:03:57 +07:00
40250b7ecf Merge pull request #344 from Infisical/ip-alerts
New device (IP and user agent) login detection and alert
2023-02-21 18:11:16 +07:00
2d6d32923d Finish alert for new device login detection 2023-02-21 18:01:26 +07:00
7cb6aee3f7 Add docs for MFA 2023-02-21 16:08:09 +07:00
469d042f4b Add CircleCI docs 2023-02-21 13:07:17 +07:00
c38ccdb915 Merge pull request #343 from Infisical/mfa
MFA
2023-02-21 12:47:56 +07:00
baaa92427f Remove dependency cycle 2023-02-21 12:43:17 +07:00
1ff2c61b3a Remove storage of protected key 2023-02-21 12:31:19 +07:00
0b356e0e83 Update README.md 2023-02-20 21:14:50 -08:00
eb55c053eb Merge pull request #341 from umrak11/feature/improve-backup-pdf-generation
Refactored and improved PDF backup generation
2023-02-20 20:56:15 -08:00
07b307e4b1 Merge pull request #333 from esau-morais/issue-318
Add missing `forgot-password` in pt-BR and save selected env on URL
2023-02-20 20:44:27 -08:00
5bee6a5e24 Merge branch 'issue-318' of https://github.com/esau-morais/infisical into issue-318 2023-02-20 20:41:00 -08:00
bdc99e34cc Fix TS issue 2023-02-20 20:40:32 -08:00
cee10fb507 Merge branch 'main' into issue-318 2023-02-20 20:36:44 -08:00
74e78bb967 Merge remote-tracking branch 'origin' into mfa 2023-02-21 11:31:17 +07:00
ea5811c24c Fixed the bug with the tab indicators 2023-02-20 20:30:03 -08:00
d31b7ae4af Merge pull request #335 from animeshdas2000/fix/undefined-url
Fix: After trying to delete the last remaining project, it keeps loading and returns undefined in the URL
2023-02-20 19:54:41 -08:00
75eac1b972 Add LoginSRPDetail to v2 auth route 2023-02-21 10:52:31 +07:00
c65ce14de3 Merge branch 'main' into fix/undefined-url 2023-02-20 19:49:04 -08:00
f8c4ccd64c Updated readme 2023-02-20 19:17:04 -08:00
43ce222725 Minor style updates 2023-02-20 18:12:29 -08:00
c7ebeecb6b Merge pull request #337 from akhilmhdh/feat/new-org-settings
Revamped the org settings page
2023-02-20 17:25:27 -08:00
243c6ca22e feat: changed delete org membership to v2 2023-02-20 23:22:34 +05:30
66f1c57a2a feat(ui): completed org settings page revamp 2023-02-20 23:22:34 +05:30
c0d1495761 feat(ui): api hooks for new org settings page 2023-02-20 23:22:34 +05:30
e5f6ed3dc7 feat(ui): components changes for new org settings page 2023-02-20 23:22:32 +05:30
ab62d91b09 Refactored and improved PDF backup generation 2023-02-20 10:12:38 +01:00
59beabb445 Fix change password private key removal 2023-02-20 13:27:16 +07:00
d5bc377e3d Added notifications to 2FA and fixed state 2023-02-19 20:11:24 -08:00
2bdb20f42f Merge pull request #339 from Infisical/mfa
MFA
2023-02-19 18:33:11 -05:00
0062df58a2 Fixed dashboard for the corner case of purely personal secrets 2023-02-19 12:02:30 -08:00
b6bbfc08ad increase helm chart 2023-02-19 12:52:28 -05:00
5baccc73c9 Add mongo root password 2023-02-19 12:52:28 -05:00
20e7eae4fe Add more Accept-Encoding to integrations syncs 2023-02-19 13:42:37 +07:00
8432f71d58 add secrets auto reload 2023-02-19 00:44:38 -05:00
604c22d64d Merge remote-tracking branch 'origin' into mfa 2023-02-19 12:41:23 +07:00
c1deb08df8 always pull gamma image 2023-02-19 00:08:07 -05:00
66f201746f update gamma values 2023-02-19 00:06:38 -05:00
1c61ffbd36 patch upload script 2023-02-19 00:00:58 -05:00
e5ba8eb281 fix upload script 2023-02-18 23:50:35 -05:00
f542e07c33 download dependency for helm chart in upload step 2023-02-18 23:40:10 -05:00
1082d7f869 update helm docs for self install 2023-02-18 23:28:52 -05:00
4a3adaa347 Begin in-memory private key storage 2023-02-19 10:54:54 +07:00
1659dab87d Merge pull request #314 from Grraahaam/feat/helm-mongodb-persistence
feat(chart): mongodb persistence
2023-02-18 18:06:39 -05:00
d88599714f update helm chart to reflect dep update 2023-02-18 11:33:39 -05:00
71bf56a2b7 Update k8 operator dependencies 2023-02-18 11:32:46 -05:00
0fba78ad16 Update http net package 2023-02-18 10:16:27 -05:00
92560f5e1f Merge remote-tracking branch 'origin' into mfa 2023-02-18 16:49:51 +07:00
0d484b93eb Merge pull request #338 from Infisical/smoothen-integrations
Smoothen integrations
2023-02-18 16:30:56 +07:00
5f3b8c55b8 Merge branch 'main' into smoothen-integrations 2023-02-18 16:27:58 +07:00
553416689c Fix merge conflicts with main 2023-02-18 16:23:05 +07:00
b0744fd21d Wired frontend to use the batch structure 2023-02-18 00:02:52 -08:00
be38844a5b Add 2FA in CLI 2023-02-17 20:48:19 -05:00
54e2b661bc ran prettier to fix indentation 2023-02-17 14:20:58 +05:30
b81d8eba25 added notification to .env errors 2023-02-16 21:05:38 -08:00
dbcd2b0988 Patch undefined req.secrets 2023-02-17 11:56:06 +07:00
1d11f11eaf Refactor login cmd 2023-02-16 15:49:14 -08:00
f2d7401d1d Add support for version 2 auth 2023-02-16 15:47:59 -08:00
91cb9750b4 Merge branch 'main' into fix/undefined-url 2023-02-17 00:33:55 +05:30
3e0d4cb70a fixed infinite loading in dashboard 2023-02-17 00:31:00 +05:30
dab677b360 Merge branch 'smoothen-integrations' of https://github.com/Infisical/infisical into smoothen-integrations 2023-02-17 01:18:31 +07:00
625c0785b5 Add validation to batch secret endpoint 2023-02-17 01:12:13 +07:00
540a8b4201 Fixed the bug with updating tags 2023-02-16 09:53:33 -08:00
11f86da1f6 Fixed the bug with updating tags 2023-02-16 09:35:57 -08:00
ab5ffa9ee6 Updated billing in the settings tab 2023-02-16 08:13:49 -08:00
65bec23292 Begin rewiring frontend to use batch route for CRUD secret ops 2023-02-16 22:43:53 +07:00
635ae941d7 Merge branch 'main' of https://github.com/Infisical/infisical 2023-02-15 23:56:33 -08:00
a9753fb784 Update the text in start guide 2023-02-15 23:56:16 -08:00
b587d9b35a Update README.md 2023-02-15 23:03:51 -08:00
aa68bc05d9 Updated 'are you sure?' message 2023-02-15 13:46:13 -08:00
66566a401f Updated the infisical guide 2023-02-15 13:23:14 -08:00
5aa75ecd3f feat: save selected env on url 2023-02-15 12:46:54 -03:00
0a77f9a0c8 fix(i18n): add missing forgot-password in pt-BR 2023-02-15 11:06:48 -03:00
b5d4cfed03 chore(docs): chart documentation generation 2023-02-15 09:43:17 +01:00
c57394bdab Merge remote-tracking branch 'origin' into mfa 2023-02-15 11:44:42 +07:00
754ea09400 Add simple tries left for MFA 2023-02-15 10:28:23 +07:00
f28a2ea151 Small nits 2023-02-14 18:40:02 -08:00
b710944630 Add more edge-cases to MFA 2023-02-15 01:29:40 +07:00
280f482fc8 Fix merge conflicts 2023-02-14 17:40:42 +07:00
e1ad8fbee8 Refactoring functions into services, helper functions, hooks, patch bugs 2023-02-14 17:38:58 +07:00
56ca6039ba fix(chart): backend service typos 2023-02-14 10:50:34 +01:00
d3fcb69c50 fix(chart): backend service missing configuration 2023-02-14 01:18:41 +01:00
2db4a29ad7 chore(docs): chart setup + chart release notes.txt 2023-02-14 01:18:41 +01:00
4df82a6ff1 feat(chart): mailhog for local development 2023-02-14 01:18:41 +01:00
cdf73043e1 fix(chart): mongodb custom users + docs 2023-02-14 01:18:41 +01:00
ca07d1c50e fix(chart): helpers template auth.password typo 2023-02-14 01:18:41 +01:00
868011479b feat(chart): mongodb persistence 2023-02-14 01:18:41 +01:00
13b1805d04 Checkpoint 2 2023-02-13 17:11:31 +07:00
c233fd8ed1 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-02-13 15:39:38 +05:45
30b2b85446 Fix merge conflicts 2023-02-13 15:24:19 +07:00
e53fd110f6 Checkpoint weaving frontend and backend MFA 2023-02-13 12:29:36 +07:00
17406e413d Merge remote-tracking branch 'origin/2fa' into mfa 2023-02-11 16:08:13 +07:00
9b219f67b0 Update package.json 2023-02-11 16:07:32 +07:00
669861d7a8 General frontend structure for 2FA - done 2023-02-09 15:49:47 -08:00
bb752863fa Fix merge conflicts 2023-01-30 19:48:12 +07:00
cf5603c8e3 Finish preliminary backwards-compatible transition from user encryption scheme v1 to v2 with argon2 and protected key 2023-01-30 19:38:13 +07:00
77b1011207 feat(docs): added a pull request template 2023-01-30 11:54:52 +01:00
5cadb9e2f9 Finish MFA v1 and refactor all tokens into separate TokenService with modified collection 2023-01-23 22:10:15 +07:00
292 changed files with 11475 additions and 3624 deletions

.github/pull_request_template.md (new file, 22 lines)

@ -0,0 +1,22 @@
# Description 📣
*Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.*
## Type ✨
- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation
# Tests 🛠️
*Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration. You may want to add screenshots when relevant and possible*
```sh
# Here's a code block to paste code snippets into
```
---
- [ ] I have read the [contributing guide](https://infisical.com/docs/contributing/overview), agreed and acknowledged the [code of conduct](https://infisical.com/docs/contributing/code-of-conduct). 📝

.github/values.yaml (86 changed lines)

@ -1,11 +1,5 @@
#####
# INFISICAL K8 DEFAULT VALUES FILE
# PLEASE REPLACE VALUES/EDIT AS REQUIRED
#####
nameOverride: ""
frontend:
enabled: true
name: frontend
podAnnotations: {}
deploymentAnnotations:
@ -13,17 +7,18 @@ frontend:
replicaCount: 2
image:
repository: infisical/frontend
pullPolicy: Always
tag: "latest"
pullPolicy: Always
kubeSecretRef: managed-secret-frontend
service:
# type of the frontend service
type: ClusterIP
# define the nodePort if service type is NodePort
# nodePort:
annotations: {}
type: ClusterIP
nodePort: ""
frontendEnvironmentVariables: null
backend:
enabled: true
name: backend
podAnnotations: {}
deploymentAnnotations:
@ -31,63 +26,46 @@ backend:
replicaCount: 2
image:
repository: infisical/backend
pullPolicy: Always
tag: "latest"
pullPolicy: Always
kubeSecretRef: managed-backend-secret
service:
annotations: {}
type: ClusterIP
nodePort: ""
backendEnvironmentVariables: null
## Mongo DB persistence
mongodb:
name: mongodb
podAnnotations: {}
image:
repository: mongo
pullPolicy: IfNotPresent
tag: "latest"
service:
annotations: {}
enabled: true
persistence:
enabled: false
# By default the backend will be connected to a Mongo instance in the cluster.
# However, it is recommended to add a managed document DB connection string because the DB instance in the cluster does not have persistence yet ( data will be deleted on next deploy).
# Learn about connection string type here https://www.mongodb.com/docs/manual/reference/connection-string/
mongodbConnection: {}
# externalMongoDBConnectionString: <>
## By default the backend will be connected to a Mongo instance within the cluster
## However, it is recommended to add a managed document DB connection string for production-use (DBaaS)
## Learn about connection string type here https://www.mongodb.com/docs/manual/reference/connection-string/
## e.g. "mongodb://<user>:<pass>@<host>:<port>/<database-name>"
mongodbConnection:
externalMongoDBConnectionString: ""
ingress:
enabled: true
annotations:
kubernetes.io/ingress.class: "nginx"
hostName: gamma.infisical.com # replace with your domain
frontend:
# cert-manager.io/issuer: letsencrypt-nginx
hostName: gamma.infisical.com ## <- Replace with your own domain
frontend:
path: /
pathType: Prefix
backend:
path: /api
pathType: Prefix
tls: []
tls:
[]
# - secretName: letsencrypt-nginx
# hosts:
# - infisical.local
## Complete Ingress example
# ingress:
# enabled: true
# annotations:
# kubernetes.io/ingress.class: "nginx"
# cert-manager.io/issuer: letsencrypt-nginx
# hostName: k8.infisical.com
# frontend:
# path: /
# pathType: Prefix
# backend:
# path: /api
# pathType: Prefix
# tls:
# - secretName: letsencrypt-nginx
# hosts:
# - k8.infisical.com
###
### YOU MUST FILL IN ALL SECRETS BELOW
###
backendEnvironmentVariables: {}
frontendEnvironmentVariables: {}
mailhog:
enabled: false
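
For context, a hypothetical install that overrides the new values from this chart might look like the following; the chart path and release name are assumptions, only the `mongodbConnection.externalMongoDBConnectionString` and `ingress.hostName` keys come from the values file above.

```sh
# Hypothetical helm install overriding the managed MongoDB connection string and ingress host.
# Chart path and release name are assumptions; the value keys are taken from the values.yaml above.
helm install infisical ./helm-charts/infisical \
  --set mongodbConnection.externalMongoDBConnectionString="mongodb://<user>:<pass>@<host>:<port>/<database-name>" \
  --set ingress.hostName="infisical.example.com"
```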

README.md (105 changed lines)
File diff suppressed because one or more lines are too long

@ -1,18 +1,27 @@
FROM node:16-bullseye-slim
# Build stage
FROM node:16-alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
# RUN npm ci --only-production --ignore-scripts
# "prepare": "cd .. && npm install"
COPY package*.json ./
RUN npm ci --only-production
COPY . .
RUN npm run build
# Production stage
FROM node:16-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only-production
COPY --from=build /app .
HEALTHCHECK --interval=10s --timeout=3s --start-period=10s \
CMD node healthcheck.js
EXPOSE 4000
CMD ["npm", "run", "start"]
CMD ["npm", "run", "start"]


@ -19,6 +19,7 @@
"await-to-js": "^3.0.0",
"aws-sdk": "^2.1311.0",
"axios": "^1.1.3",
"axios-retry": "^3.4.0",
"bcrypt": "^5.1.0",
"bigint-conversion": "^2.2.2",
"builder-pattern": "^2.2.0",
@ -3473,6 +3474,17 @@
"@babel/core": "^7.0.0-0"
}
},
"node_modules/@babel/runtime": {
"version": "7.21.0",
"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.21.0.tgz",
"integrity": "sha512-xwII0//EObnq89Ji5AKYQaRYiW/nZ3llSv29d49IuxPhKbtJoLP+9QUUZ4nVragQVtaVGeZrpB+ZtG/Pdy/POw==",
"dependencies": {
"regenerator-runtime": "^0.13.11"
},
"engines": {
"node": ">=6.9.0"
}
},
"node_modules/@babel/template": {
"version": "7.18.10",
"resolved": "https://registry.npmjs.org/@babel/template/-/template-7.18.10.tgz",
@ -5317,6 +5329,15 @@
"proxy-from-env": "^1.1.0"
}
},
"node_modules/axios-retry": {
"version": "3.4.0",
"resolved": "https://registry.npmjs.org/axios-retry/-/axios-retry-3.4.0.tgz",
"integrity": "sha512-VdgaP+gHH4iQYCCNUWF2pcqeciVOdGrBBAYUfTY+wPcO5Ltvp/37MLFNCmJKo7Gj3SHvCSdL8ouI1qLYJN3liA==",
"dependencies": {
"@babel/runtime": "^7.15.4",
"is-retry-allowed": "^2.2.0"
}
},
"node_modules/babel-jest": {
"version": "29.3.1",
"resolved": "https://registry.npmjs.org/babel-jest/-/babel-jest-29.3.1.tgz",
@ -7604,6 +7625,17 @@
"node": ">=0.10.0"
}
},
"node_modules/is-retry-allowed": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/is-retry-allowed/-/is-retry-allowed-2.2.0.tgz",
"integrity": "sha512-XVm7LOeLpTW4jV19QSH38vkswxoLud8sQ57YwJVTPWdiaI9I8keEhGFpBlslyVsgdQy4Opg8QOLb8YRgsyZiQg==",
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-stream": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-2.0.1.tgz",
@ -12179,6 +12211,11 @@
"node": ">=8.10.0"
}
},
"node_modules/regenerator-runtime": {
"version": "0.13.11",
"resolved": "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.13.11.tgz",
"integrity": "sha512-kY1AZVr2Ra+t+piVaJ4gxaFaReZVH40AKNo7UCX6W+dEwBo/2oZJzqfuN1qLq1oL45o56cPaTXELwrTh8Fpggg=="
},
"node_modules/regexpp": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/regexpp/-/regexpp-3.2.0.tgz",
@ -16584,6 +16621,14 @@
"@babel/helper-plugin-utils": "^7.19.0"
}
},
"@babel/runtime": {
"version": "7.21.0",
"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.21.0.tgz",
"integrity": "sha512-xwII0//EObnq89Ji5AKYQaRYiW/nZ3llSv29d49IuxPhKbtJoLP+9QUUZ4nVragQVtaVGeZrpB+ZtG/Pdy/POw==",
"requires": {
"regenerator-runtime": "^0.13.11"
}
},
"@babel/template": {
"version": "7.18.10",
"resolved": "https://registry.npmjs.org/@babel/template/-/template-7.18.10.tgz",
@ -18075,6 +18120,15 @@
"proxy-from-env": "^1.1.0"
}
},
"axios-retry": {
"version": "3.4.0",
"resolved": "https://registry.npmjs.org/axios-retry/-/axios-retry-3.4.0.tgz",
"integrity": "sha512-VdgaP+gHH4iQYCCNUWF2pcqeciVOdGrBBAYUfTY+wPcO5Ltvp/37MLFNCmJKo7Gj3SHvCSdL8ouI1qLYJN3liA==",
"requires": {
"@babel/runtime": "^7.15.4",
"is-retry-allowed": "^2.2.0"
}
},
"babel-jest": {
"version": "29.3.1",
"resolved": "https://registry.npmjs.org/babel-jest/-/babel-jest-29.3.1.tgz",
@ -19771,6 +19825,11 @@
"resolved": "https://registry.npmjs.org/is-plain-object/-/is-plain-object-5.0.0.tgz",
"integrity": "sha512-VRSzKkbMm5jMDoKLbltAkFQ5Qr7VDiTFGXxYFXXowVj387GeGNOCsOH6Msy00SGZ3Fp84b1Naa1psqgcCIEP5Q=="
},
"is-retry-allowed": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/is-retry-allowed/-/is-retry-allowed-2.2.0.tgz",
"integrity": "sha512-XVm7LOeLpTW4jV19QSH38vkswxoLud8sQ57YwJVTPWdiaI9I8keEhGFpBlslyVsgdQy4Opg8QOLb8YRgsyZiQg=="
},
"is-stream": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-2.0.1.tgz",
@ -23088,6 +23147,11 @@
"picomatch": "^2.2.1"
}
},
"regenerator-runtime": {
"version": "0.13.11",
"resolved": "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.13.11.tgz",
"integrity": "sha512-kY1AZVr2Ra+t+piVaJ4gxaFaReZVH40AKNo7UCX6W+dEwBo/2oZJzqfuN1qLq1oL45o56cPaTXELwrTh8Fpggg=="
},
"regexpp": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/regexpp/-/regexpp-3.2.0.tgz",


@ -10,6 +10,7 @@
"await-to-js": "^3.0.0",
"aws-sdk": "^2.1311.0",
"axios": "^1.1.3",
"axios-retry": "^3.4.0",
"bcrypt": "^5.1.0",
"bigint-conversion": "^2.2.2",
"builder-pattern": "^2.2.0",
@ -47,7 +48,7 @@
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "npm run build && node build/index.js",
"start": "node build/index.js",
"dev": "nodemon",
"swagger-autogen": "node ./swagger/index.ts",
"build": "rimraf ./build && tsc && cp -R ./src/templates ./build",


@ -1,7 +1,7 @@
// eslint-disable-next-line @typescript-eslint/no-var-requires
const { patchRouterParam } = require('./utils/patchAsyncRoutes');
import express, { Request, Response } from 'express';
import express from 'express';
import helmet from 'helmet';
import cors from 'cors';
import cookieParser from 'cookie-parser';
@ -39,9 +39,12 @@ import {
password as v1PasswordRouter,
stripe as v1StripeRouter,
integration as v1IntegrationRouter,
integrationAuth as v1IntegrationAuthRouter
integrationAuth as v1IntegrationAuthRouter,
secretApprovalRequest as v1SecretApprovalRequest
} from './routes/v1';
import {
signup as v2SignupRouter,
auth as v2AuthRouter,
users as v2UsersRouter,
organizations as v2OrganizationsRouter,
workspace as v2WorkspaceRouter,
@ -57,7 +60,7 @@ import { healthCheck } from './routes/status';
import { getLogger } from './utils/logger';
import { RouteNotFoundError } from './utils/errors';
import { requestErrorHandler } from './middleware/requestErrorHandler';
import { handleMongoInvalidDataError, requestErrorHandler } from './middleware/requestErrorHandler';
// patch async route params to handle Promise Rejections
patchRouterParam();
@ -108,8 +111,11 @@ app.use('/api/v1/password', v1PasswordRouter);
app.use('/api/v1/stripe', v1StripeRouter);
app.use('/api/v1/integration', v1IntegrationRouter);
app.use('/api/v1/integration-auth', v1IntegrationAuthRouter);
app.use('/api/v1/secrets-approval-request', v1SecretApprovalRequest)
// v2 routes
app.use('/api/v2/signup', v2SignupRouter);
app.use('/api/v2/auth', v2AuthRouter);
app.use('/api/v2/users', v2UsersRouter);
app.use('/api/v2/organizations', v2OrganizationsRouter);
app.use('/api/v2/workspace', v2EnvironmentRouter);
@ -132,6 +138,9 @@ app.use((req, res, next) => {
next(RouteNotFoundError({ message: `The requested source '(${req.method})${req.url}' was not found` }))
})
// handle mongo validation errors
app.use(handleMongoInvalidDataError);
//* Error Handling Middleware (must be after all routing logic)
app.use(requestErrorHandler)
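
The `handleMongoInvalidDataError` middleware itself is not part of this compare view; a minimal sketch of what such an Express error handler could look like, assuming it targets Mongoose validation errors, is:

```ts
import { ErrorRequestHandler } from 'express';
import mongoose from 'mongoose';

// Hypothetical sketch: convert Mongoose validation errors into a 400 response and
// pass everything else on to the generic requestErrorHandler registered after it.
export const handleMongoInvalidDataError: ErrorRequestHandler = (err, _req, res, next) => {
  if (err instanceof mongoose.Error.ValidationError) {
    return res.status(400).send({ message: 'Invalid data', errors: err.errors });
  }
  return next(err);
};
```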


@ -5,6 +5,8 @@ const ENCRYPTION_KEY = process.env.ENCRYPTION_KEY!;
const SALT_ROUNDS = parseInt(process.env.SALT_ROUNDS!) || 10;
const JWT_AUTH_LIFETIME = process.env.JWT_AUTH_LIFETIME! || '10d';
const JWT_AUTH_SECRET = process.env.JWT_AUTH_SECRET!;
const JWT_MFA_LIFETIME = process.env.JWT_MFA_LIFETIME! || '5m';
const JWT_MFA_SECRET = process.env.JWT_MFA_SECRET!;
const JWT_REFRESH_LIFETIME = process.env.JWT_REFRESH_LIFETIME! || '90d';
const JWT_REFRESH_SECRET = process.env.JWT_REFRESH_SECRET!;
const JWT_SERVICE_SECRET = process.env.JWT_SERVICE_SECRET!;
@ -56,6 +58,8 @@ export {
SALT_ROUNDS,
JWT_AUTH_LIFETIME,
JWT_AUTH_SECRET,
JWT_MFA_LIFETIME,
JWT_MFA_SECRET,
JWT_REFRESH_LIFETIME,
JWT_REFRESH_SECRET,
JWT_SERVICE_SECRET,
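
A hedged example of the corresponding environment variables, with placeholder values; the default lifetime mirrors the fallback in the config above.

```sh
# .env (illustrative values only)
JWT_MFA_SECRET=<random-hex-secret>
JWT_MFA_LIFETIME=5m
```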


@ -0,0 +1,16 @@
import axios from 'axios';
import axiosRetry from 'axios-retry';
const axiosInstance = axios.create();
// add retry functionality to the axios instance
axiosRetry(axiosInstance, {
retries: 3,
retryDelay: axiosRetry.exponentialDelay, // exponential back-off delay between retries
retryCondition: (error) => {
// only retry if the error is a network error or a 5xx server error
return axiosRetry.isNetworkError(error) || axiosRetry.isRetryableError(error);
},
});
export default axiosInstance;
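
A hypothetical way an integration sync could consume this retrying instance; the import path and target URL are assumptions, not taken from this diff.

```ts
// Hypothetical usage of the retrying axios instance in an integration sync.
// The module path and endpoint are placeholders for illustration only.
import request from '../config/request';

export const syncSecrets = async (accessToken: string) => {
  const res = await request.get('https://example-integration/api/envars', {
    headers: { Authorization: `Bearer ${accessToken}` }
  });
  return res.data;
};
```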


@ -5,7 +5,8 @@ import * as Sentry from '@sentry/node';
import * as bigintConversion from 'bigint-conversion';
const jsrp = require('jsrp');
import { User, LoginSRPDetail } from '../../models';
import { createToken, issueTokens, clearTokens } from '../../helpers/auth';
import { createToken, issueAuthTokens, clearTokens } from '../../helpers/auth';
import { checkUserDevice } from '../../helpers/user';
import {
ACTION_LOGIN,
ACTION_LOGOUT
@ -111,7 +112,14 @@ export const login2 = async (req: Request, res: Response) => {
// compare server and client shared keys
if (server.checkClientProof(clientProof)) {
// issue tokens
const tokens = await issueTokens({ userId: user._id.toString() });
await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});
const tokens = await issueAuthTokens({ userId: user._id.toString() });
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {


@ -14,6 +14,7 @@ import * as stripeController from './stripeController';
import * as userActionController from './userActionController';
import * as userController from './userController';
import * as workspaceController from './workspaceController';
import * as secretApprovalController from './secretApprovalController';
export {
authController,
@ -31,5 +32,6 @@ export {
stripeController,
userActionController,
userController,
workspaceController
workspaceController,
secretApprovalController
};


@ -1,14 +1,13 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
import { SITE_URL, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, EMAIL_TOKEN_LIFETIME } from '../../config';
import { MembershipOrg, Organization, User, Token } from '../../models';
import { SITE_URL, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET } from '../../config';
import { MembershipOrg, Organization, User } from '../../models';
import { deleteMembershipOrg as deleteMemberFromOrg } from '../../helpers/membershipOrg';
import { checkEmailVerification } from '../../helpers/signup';
import { createToken } from '../../helpers/auth';
import { updateSubscriptionOrgQuantity } from '../../helpers/organization';
import { sendMail } from '../../helpers/nodemailer';
import { OWNER, ADMIN, MEMBER, ACCEPTED, INVITED } from '../../variables';
import { TokenService } from '../../services';
import { OWNER, ADMIN, MEMBER, ACCEPTED, INVITED, TOKEN_EMAIL_ORG_INVITATION } from '../../variables';
/**
* Delete organization membership with id [membershipOrgId] from organization
@ -163,18 +162,11 @@ export const inviteUserToOrganization = async (req: Request, res: Response) => {
const organization = await Organization.findOne({ _id: organizationId });
if (organization) {
const token = crypto.randomBytes(16).toString('hex');
await Token.findOneAndUpdate(
{ email: inviteeEmail },
{
email: inviteeEmail,
token,
createdAt: new Date(),
ttl: Math.floor(+new Date() / 1000) + EMAIL_TOKEN_LIFETIME // time in seconds, i.e unix
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_ORG_INVITATION,
email: inviteeEmail,
organizationId: organization._id
});
await sendMail({
template: 'organizationInvitation.handlebars',
@ -226,10 +218,12 @@ export const verifyUserToOrganization = async (req: Request, res: Response) => {
if (!membershipOrg)
throw new Error('Failed to find any invitations for email');
await checkEmailVerification({
await TokenService.validateToken({
type: TOKEN_EMAIL_ORG_INVITATION,
email,
code
organizationId: membershipOrg.organization,
token: code
});
if (user && user?.publicKey) {


@ -1,14 +1,14 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
// eslint-disable-next-line @typescript-eslint/no-var-requires
const jsrp = require('jsrp');
import * as bigintConversion from 'bigint-conversion';
import { User, Token, BackupPrivateKey, LoginSRPDetail } from '../../models';
import { checkEmailVerification } from '../../helpers/signup';
import { User, BackupPrivateKey, LoginSRPDetail } from '../../models';
import { createToken } from '../../helpers/auth';
import { sendMail } from '../../helpers/nodemailer';
import { EMAIL_TOKEN_LIFETIME, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, SITE_URL } from '../../config';
import { TokenService } from '../../services';
import { JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, SITE_URL } from '../../config';
import { TOKEN_EMAIL_PASSWORD_RESET } from '../../variables';
import { BadRequestError } from '../../utils/errors';
/**
@ -31,20 +31,12 @@ export const emailPasswordReset = async (req: Request, res: Response) => {
error: 'Failed to send email verification for password reset'
});
}
const token = crypto.randomBytes(16).toString('hex');
await Token.findOneAndUpdate(
{ email },
{
email,
token,
createdAt: new Date(),
ttl: Math.floor(+new Date() / 1000) + EMAIL_TOKEN_LIFETIME // time in seconds, i.e unix
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_PASSWORD_RESET,
email
});
await sendMail({
template: 'passwordReset.handlebars',
subjectLine: 'Infisical password reset',
@ -55,7 +47,6 @@ export const emailPasswordReset = async (req: Request, res: Response) => {
callback_url: SITE_URL + '/password-reset'
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
@ -88,10 +79,11 @@ export const emailPasswordResetVerify = async (req: Request, res: Response) => {
error: 'Failed email verification for password reset'
});
}
await checkEmailVerification({
await TokenService.validateToken({
type: TOKEN_EMAIL_PASSWORD_RESET,
email,
code
token: code
});
// generate temporary password-reset token
@ -174,8 +166,18 @@ export const srp1 = async (req: Request, res: Response) => {
*/
export const changePassword = async (req: Request, res: Response) => {
try {
const { clientProof, encryptedPrivateKey, iv, tag, salt, verifier } =
req.body;
const {
clientProof,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
} = req.body;
const user = await User.findOne({
email: req.user.email
}).select('+salt +verifier');
@ -205,9 +207,13 @@ export const changePassword = async (req: Request, res: Response) => {
await User.findByIdAndUpdate(
req.user._id.toString(),
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
@ -341,9 +347,12 @@ export const getBackupPrivateKey = async (req: Request, res: Response) => {
export const resetPassword = async (req: Request, res: Response) => {
try {
const {
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier,
} = req.body;
@ -351,9 +360,13 @@ export const resetPassword = async (req: Request, res: Response) => {
await User.findByIdAndUpdate(
req.user._id.toString(),
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
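
The `TokenService` referenced in these controllers is not included in this compare view; a hypothetical interface inferred only from its call sites above (createToken returning the emailed code, validateToken throwing on mismatch or expiry) might be:

```ts
// Hypothetical interface for TokenService, inferred from its call sites in this diff.
interface TokenParams {
  type: string;              // e.g. TOKEN_EMAIL_ORG_INVITATION, TOKEN_EMAIL_PASSWORD_RESET, TOKEN_EMAIL_MFA
  email: string;
  organizationId?: unknown;  // passed for org invitations (an ObjectId in the controller)
}

export interface ITokenService {
  createToken(params: TokenParams): Promise<string>;                      // returns the code that gets emailed
  validateToken(params: TokenParams & { token: string }): Promise<void>;  // throws if the code is wrong or expired
}
```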


@ -0,0 +1,320 @@
import { Request, Response } from 'express';
import SecretApprovalRequest, { ApprovalStatus, ChangeType, IApprover, IRequestedChange } from '../../models/secretApprovalRequest';
import { Builder, IBuilder } from "builder-pattern"
import { secretObjectHasRequiredFields, validateSecrets } from '../../helpers/secret';
import _ from 'lodash';
import { SECRET_PERSONAL, SECRET_SHARED } from '../../variables';
import { BadRequestError, ResourceNotFound, UnauthorizedRequestError } from '../../utils/errors';
import { ISecret, Membership, Secret, Workspace } from '../../models';
import mongoose from 'mongoose';
export const createApprovalRequest = async (req: Request, res: Response) => {
const { workspaceId, environment, requestedChanges } = req.body;
// validate workspace
const workspaceFromDB = await Workspace.findById(workspaceId)
if (!workspaceFromDB) {
throw ResourceNotFound()
}
const environmentBelongsToWorkspace = _.some(workspaceFromDB.environments, { slug: environment })
if (!environmentBelongsToWorkspace) {
throw ResourceNotFound()
}
// check for secret duplicates
const hasSecretIdDuplicates = requestedChanges.length !== _.uniqBy(requestedChanges, 'modifiedSecretParentId').length;
if (hasSecretIdDuplicates) {
throw BadRequestError({ message: "Request cannot contain changes for duplicate secrets" })
}
// ensure the workspace has approvers set
if (!workspaceFromDB.approvers.length) {
throw BadRequestError({ message: "There are no designated approvers for this project, you must set approvers first before making a request" })
}
const approverIds = _.compact(_.map(workspaceFromDB.approvers, "userId"))
const approversFormatted: IApprover[] = approverIds.map(id => {
return { "userId": id, status: ApprovalStatus.PENDING }
})
const listOfSecretIdsToModify = _.compact(_.map(requestedChanges, "modifiedSecretParentId"))
// Ensure that the user requesting changes for the set of secrets can indeed interact with said secrets
if (listOfSecretIdsToModify.length > 0) {
await validateSecrets({
userId: req.user._id.toString(),
secretIds: listOfSecretIdsToModify
});
}
const sanitizedRequestedChangesList: IRequestedChange[] = []
requestedChanges.forEach((requestedChange: IRequestedChange) => {
const secretDetailsIsValid = secretObjectHasRequiredFields(requestedChange.modifiedSecretDetails)
if (!secretDetailsIsValid) {
throw BadRequestError({ message: "One or more required fields are missing from your modified secret" })
}
if (!requestedChange.modifiedSecretParentId && (requestedChange.type != ChangeType.DELETE.toString() && requestedChange.type != ChangeType.CREATE.toString())) {
throw BadRequestError({ message: "modifiedSecretParentId can only be empty when secret change type is DELETE or CREATE" })
}
sanitizedRequestedChangesList.push(Builder<IRequestedChange>()
.modifiedSecretParentId(requestedChange.modifiedSecretParentId)
.modifiedSecretDetails(requestedChange.modifiedSecretDetails)
.approvers(approversFormatted)
.type(requestedChange.type).build())
});
const newApprovalRequest = await SecretApprovalRequest.create({
workspace: workspaceId,
requestedByUserId: req.user._id.toString(),
environment: environment,
requestedChanges: sanitizedRequestedChangesList
})
const populatedNewApprovalRequest = await newApprovalRequest.populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
return res.send({ approvalRequest: populatedNewApprovalRequest });
};
export const getAllApprovalRequestsForUser = async (req: Request, res: Response) => {
const approvalRequests = await SecretApprovalRequest.find({
requestedByUserId: req.user._id.toString()
}).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
.sort({ updatedAt: -1 })
res.send({ approvalRequests: approvalRequests })
}
export const getAllApprovalRequestsThatRequireUserApproval = async (req: Request, res: Response) => {
const approvalRequests = await SecretApprovalRequest.find({
'requestedChanges.approvers.userId': req.user._id.toString()
}).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
.sort({ updatedAt: -1 })
res.send({ approvalRequests: approvalRequests })
}
export const approveApprovalRequest = async (req: Request, res: Response) => {
const { requestedChangeIds } = req.body;
const { reviewId } = req.params
const approvalRequestFromDB = await SecretApprovalRequest.findById(reviewId)
if (!approvalRequestFromDB) {
throw ResourceNotFound()
}
const requestedChangesFromDB: IRequestedChange[] = approvalRequestFromDB.requestedChanges
const filteredChangesByIds = requestedChangesFromDB.filter(change => requestedChangeIds.includes(change._id.toString()))
if (filteredChangesByIds.length != requestedChangeIds.length) {
throw BadRequestError({ message: "All requestedChangeIds should exist in this approval request" })
}
const changesThatRequireUserApproval = _.filter(filteredChangesByIds, change => {
return _.some(change.approvers, approver => {
return approver.userId.toString() == req.user._id.toString();
});
});
if (!changesThatRequireUserApproval.length) {
throw UnauthorizedRequestError({ message: "Your approval is not required for this review" })
}
if (changesThatRequireUserApproval.length != filteredChangesByIds.length) {
throw BadRequestError({ message: "You may only request to approve changes that require your approval" })
}
changesThatRequireUserApproval.forEach((requestedChange) => {
const overallChangeStatus = requestedChange.status
const currentLoggedInUserId = req.user._id.toString()
if (overallChangeStatus == ApprovalStatus.PENDING.toString()) {
requestedChange.approvers.forEach((approver) => {
if (approver.userId.toString() == currentLoggedInUserId && approver.status == ApprovalStatus.PENDING.toString()) {
approver.status = ApprovalStatus.APPROVED
}
})
let updateOverallStatusToApproved = true
requestedChange.approvers.forEach((approver) => {
if (approver.status != ApprovalStatus.APPROVED.toString()) {
updateOverallStatusToApproved = false
}
})
if (updateOverallStatusToApproved) {
requestedChange.status = ApprovalStatus.APPROVED
}
}
})
const updatedApprovalRequest = await SecretApprovalRequest.findByIdAndUpdate(reviewId, {
requestedChanges: requestedChangesFromDB
}, { new: true }).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
res.send({ approvalRequest: updatedApprovalRequest })
}
export const rejectApprovalRequest = async (req: Request, res: Response) => {
const { requestedChangeIds } = req.body;
const { reviewId } = req.params
const approvalRequestFromDB = await SecretApprovalRequest.findById(reviewId)
if (!approvalRequestFromDB) {
throw ResourceNotFound()
}
const requestedChangesFromDB: IRequestedChange[] = approvalRequestFromDB.requestedChanges
const filteredChangesByIds = requestedChangesFromDB.filter(change => requestedChangeIds.includes(change._id.toString()))
if (filteredChangesByIds.length != requestedChangeIds.length) {
throw BadRequestError({ message: "All requestedChangeIds should exist in this approval request" })
}
const changesThatRequireUserApproval = _.filter(filteredChangesByIds, change => {
return _.some(change.approvers, approver => {
return approver.userId.toString() == req.user._id.toString();
});
});
if (!changesThatRequireUserApproval.length) {
throw UnauthorizedRequestError({ message: "Your approval is not required for this review" })
}
if (changesThatRequireUserApproval.length != filteredChangesByIds.length) {
throw BadRequestError({ message: "You may only request to reject changes that require your approval" })
}
changesThatRequireUserApproval.forEach((requestedChange) => {
const overallChangeStatus = requestedChange.status
const currentLoggedInUserId = req.user._id.toString()
if (overallChangeStatus == ApprovalStatus.PENDING.toString()) {
requestedChange.approvers.forEach((approver) => {
if (approver.userId.toString() == currentLoggedInUserId && approver.status == ApprovalStatus.PENDING.toString()) {
approver.status = ApprovalStatus.REJECTED
requestedChange.status = ApprovalStatus.REJECTED
}
})
}
})
const updatedApprovalRequest = await SecretApprovalRequest.findByIdAndUpdate(reviewId, {
requestedChanges: requestedChangesFromDB
}, { new: true }).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
res.send({ approvalRequest: updatedApprovalRequest })
};
export const mergeApprovalRequestSecrets = async (req: Request, res: Response) => {
const { requestedChangeIds } = req.body;
const { reviewId } = req.params
// only the user who requested the set of changes can merge it
const approvalRequestFromDB = await SecretApprovalRequest.findOne({ _id: reviewId, requestedByUserId: req.user._id })
if (!approvalRequestFromDB) {
throw ResourceNotFound()
}
// ensure that this user is a member of this workspace
const membershipDetails = await Membership.find({ user: req.user._id, workspace: approvalRequestFromDB.workspace })
if (!membershipDetails) {
throw UnauthorizedRequestError()
}
// filter not merged, approved, and change ids specified in this request
const filteredChangesToMerge: IRequestedChange[] = approvalRequestFromDB.requestedChanges.filter(change => change.merged == false && change.status == ApprovalStatus.APPROVED && requestedChangeIds.includes(change._id.toString()))
if (filteredChangesToMerge.length != requestedChangeIds.length) {
throw BadRequestError({ message: "One or more changes in this approval is either already merged/not approved or do not exist" })
}
const secretsToCreate: ISecret[] = []
const secretsToUpdate: any[] = []
const secretsIdsToDelete: any[] = []
const secretIdsToModify: any[] = []
filteredChangesToMerge.forEach((requestedChange: any) => {
const overallChangeStatus = requestedChange.status
const currentLoggedInUserId = req.user._id.toString()
if (overallChangeStatus == ApprovalStatus.APPROVED.toString()) {
if (ChangeType.CREATE.toString() == requestedChange.type) {
const modifiedSecret = requestedChange.modifiedSecretDetails.toObject()
secretsToCreate.push({
...modifiedSecret,
user: requestedChange.modifiedSecretDetails.type === SECRET_PERSONAL ? currentLoggedInUserId : undefined,
})
}
if (ChangeType.UPDATE.toString() == requestedChange.type) {
const modifiedSecret = requestedChange.modifiedSecretDetails.toObject()
secretIdsToModify.push(requestedChange.modifiedSecretParentId)
secretsToUpdate.push({
filter: { _id: requestedChange.modifiedSecretParentId },
update: {
$set: {
...modifiedSecret,
user: requestedChange.modifiedSecretDetails.type === SECRET_PERSONAL ? currentLoggedInUserId : undefined,
},
$inc: {
version: 1
}
}
})
}
if (ChangeType.DELETE.toString() == requestedChange.type) {
secretsIdsToDelete.push({
_id: requestedChange.modifiedSecretParentId.toString()
})
}
requestedChange.merged = true
}
})
// ensure all secrets that are to be updated exist
const numSecretsFromDBThatRequireUpdate = await Secret.countDocuments({ _id: { $in: secretIdsToModify } });
const numSecretsFromDBThatRequireDelete = await Secret.countDocuments({ _id: { $in: secretsIdsToDelete } });
if (numSecretsFromDBThatRequireUpdate != secretIdsToModify.length || numSecretsFromDBThatRequireDelete != secretsIdsToDelete.length) {
throw BadRequestError({ message: "You cannot merge changes for secrets that no longer exist" })
}
// Add all CRUD operations into a single list of operations
const allOperationsForBulkWrite: any[] = [];
for (const updateStatement of secretsToUpdate) {
allOperationsForBulkWrite.push({ updateOne: updateStatement });
}
for (const secretId of secretsIdsToDelete) {
allOperationsForBulkWrite.push({ deleteOne: { filter: { _id: secretId } } });
}
for (const createStatement of secretsToCreate) {
allOperationsForBulkWrite.push({ insertOne: { document: createStatement } });
}
// start transaction
const session = await mongoose.startSession();
session.startTransaction();
try {
await Secret.bulkWrite(allOperationsForBulkWrite);
await SecretApprovalRequest.updateOne({ _id: reviewId, 'requestedChanges._id': { $in: requestedChangeIds } },
{ $set: { 'requestedChanges.$.merged': true } })
const updatedApproval = await SecretApprovalRequest.findById(reviewId).populate(["requestedChanges.modifiedSecretParentId", { path: 'requestedChanges.approvers.userId', select: 'firstName lastName _id' }])
res.send(updatedApproval)
} catch (error) {
await session.abortTransaction();
throw error
} finally {
session.endSession();
}
};
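
Putting the new controller together with the `/api/v1/secrets-approval-request` mount added in `index.ts`, a hypothetical create request could look like this; the sub-path, auth header, and the string form of the `ChangeType` value are assumptions, since the router file is not in this diff.

```sh
curl -X POST "https://<your-infisical-host>/api/v1/secrets-approval-request" \
  -H "Authorization: Bearer <jwt>" \
  -H "Content-Type: application/json" \
  -d '{
        "workspaceId": "<workspaceId>",
        "environment": "dev",
        "requestedChanges": [
          {
            "type": "UPDATE",
            "modifiedSecretParentId": "<secretId>",
            "modifiedSecretDetails": { "...": "encrypted secret fields" }
          }
        ]
      }'
```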


@ -1,16 +1,12 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { NODE_ENV, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, INVITE_ONLY_SIGNUP } from '../../config';
import { User, MembershipOrg } from '../../models';
import { completeAccount } from '../../helpers/user';
import { User } from '../../models';
import { JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, INVITE_ONLY_SIGNUP } from '../../config';
import {
sendEmailVerification,
checkEmailVerification,
initializeDefaultOrg
} from '../../helpers/signup';
import { issueTokens, createToken } from '../../helpers/auth';
import { INVITED, ACCEPTED } from '../../variables';
import axios from 'axios';
import { createToken } from '../../helpers/auth';
import { BadRequestError } from '../../utils/errors';
/**
@ -112,201 +108,3 @@ export const verifyEmailSignup = async (req: Request, res: Response) => {
token
});
};
/**
* Complete setting up user by adding their personal and auth information as part of the
* signup flow
* @param req
* @param res
* @returns
*/
export const completeAccountSignup = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier,
organizationName
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || (user && user?.publicKey)) {
// case 1: user doesn't exist.
// case 2: user has already completed account
return res.status(403).send({
error: 'Failed to complete account for complete user'
});
}
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user'); // ensure user is non-null
// initialize default organization and workspace
await initializeDefaultOrg({
organizationName,
user
});
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueTokens({
userId: user._id.toString()
});
token = tokens.token;
refreshToken = tokens.refreshToken;
// sending a welcome email to new users
if (process.env.LOOPS_API_KEY) {
await axios.post("https://app.loops.so/api/v1/events/send", {
"email": email,
"eventName": "Sign Up",
"firstName": firstName,
"lastName": lastName
}, {
headers: {
"Accept": "application/json",
"Authorization": "Bearer " + process.env.LOOPS_API_KEY
},
});
}
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token,
refreshToken
});
};
/**
* Complete setting up user by adding their personal and auth information as part of the
* invite flow
* @param req
* @param res
* @returns
*/
export const completeAccountInvite = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || (user && user?.publicKey)) {
// case 1: user doesn't exist.
// case 2: user has already completed account
return res.status(403).send({
error: 'Failed to complete account for complete user'
});
}
const membershipOrg = await MembershipOrg.findOne({
inviteEmail: email,
status: INVITED
});
if (!membershipOrg) throw new Error('Failed to find invitations for email');
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
publicKey,
encryptedPrivateKey,
iv,
tag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user');
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueTokens({
userId: user._id.toString()
});
token = tokens.token;
refreshToken = tokens.refreshToken;
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token,
refreshToken
});
};


@ -16,7 +16,8 @@ import {
} from "../../helpers/workspace";
import { addMemberships } from "../../helpers/membership";
import { ADMIN } from "../../variables";
import { BadRequestError, ResourceNotFound, UnauthorizedRequestError } from "../../utils/errors";
import _ from "lodash";
/**
* Return public keys of members of workspace with id [workspaceId]
* @param req
@ -303,6 +304,112 @@ export const getWorkspaceIntegrationAuthorizations = async (
});
};
export const addApproverForWorkspaceAndEnvironment = async (
req: Request,
res: Response
) => {
interface Approver {
environment: string;
userId: string;
}
const { workspaceId } = req.params;
const { approvers }: { approvers: Approver[] } = req.body;
const workspaceFromDB = await Workspace.findById(workspaceId)
if (!workspaceFromDB) {
throw ResourceNotFound()
}
const allAvailableWorkspaceEnvironments = _.map(workspaceFromDB.environments, 'slug');
const environmentsFromApprovers = _.map(approvers, "environment")
const filteredApprovers = environmentsFromApprovers.map(environment => allAvailableWorkspaceEnvironments.includes(environment))
// validate environments
if (filteredApprovers.length != environmentsFromApprovers.length) {
const err = `One or more environments set for approver(s) is invalid`
throw BadRequestError({ message: err })
}
const approverIds = _.map(approvers, "userId")
// validate approvers membership
const approversMemberships = await Membership.find({
workspace: workspaceId,
user: { $in: approverIds }
})
if (!approversMemberships) {
throw ResourceNotFound()
}
if (approversMemberships.length != approverIds.length) {
throw UnauthorizedRequestError({ message: "Approvers must be apart of the workspace they are being added to" })
}
const updatedWorkspace = await Workspace.findByIdAndUpdate(workspaceId,
{
$addToSet: {
approvers: {
$each: approvers,
}
}
}, { new: true })
return res.json(updatedWorkspace)
};
export const removeApproverForWorkspaceAndEnvironment = async (
req: Request,
res: Response
) => {
interface Approver {
environment: string;
userId: string;
}
const { workspaceId } = req.params;
const { approvers }: { approvers: Approver[] } = req.body;
const workspaceFromDB = await Workspace.findById(workspaceId)
if (!workspaceFromDB) {
throw ResourceNotFound()
}
const allAvailableWorkspaceEnvironments = _.map(workspaceFromDB.environments, 'slug');
const environmentsFromApprovers = _.map(approvers, "environment")
const filteredApprovers = environmentsFromApprovers.map(environment => allAvailableWorkspaceEnvironments.includes(environment))
// validate environments
if (filteredApprovers.length != environmentsFromApprovers.length) {
const err = `One or more environments set for approver(s) is invalid`
throw BadRequestError({ message: err })
}
const approverIds = _.map(approvers, "userId")
// validate approvers membership
const approversMemberships = await Membership.find({
workspace: workspaceId,
user: { $in: approverIds }
})
if (!approversMemberships) {
throw ResourceNotFound()
}
if (approversMemberships.length != approverIds.length) {
throw UnauthorizedRequestError({ message: "Approvers must be apart of the workspace they are being added to" })
}
const updatedWorkspace = await Workspace.findByIdAndUpdate(workspaceId, { $pullAll: { approvers: approvers } }, { new: true })
return res.json(updatedWorkspace)
};
/**
* Return service tokens for workspace [workspaceId] belonging to user
* @param req


@ -0,0 +1,351 @@
/* eslint-disable @typescript-eslint/no-var-requires */
import { Request, Response } from 'express';
import jwt from 'jsonwebtoken';
import * as Sentry from '@sentry/node';
import * as bigintConversion from 'bigint-conversion';
const jsrp = require('jsrp');
import { User, LoginSRPDetail } from '../../models';
import { issueAuthTokens, createToken } from '../../helpers/auth';
import { checkUserDevice } from '../../helpers/user';
import { sendMail } from '../../helpers/nodemailer';
import { TokenService } from '../../services';
import { EELogService } from '../../ee/services';
import {
NODE_ENV,
JWT_MFA_LIFETIME,
JWT_MFA_SECRET
} from '../../config';
import { BadRequestError, InternalServerError } from '../../utils/errors';
import {
TOKEN_EMAIL_MFA,
ACTION_LOGIN
} from '../../variables';
import { getChannelFromUserAgent } from '../../utils/posthog'; // TODO: move this
declare module 'jsonwebtoken' {
export interface UserIDJwtPayload extends jwt.JwtPayload {
userId: string;
}
}
const clientPublicKeys: any = {};
/**
* Log in user step 1: Return [salt] and [serverPublicKey] as part of step 1 of SRP protocol
* @param req
* @param res
* @returns
*/
export const login1 = async (req: Request, res: Response) => {
try {
const {
email,
clientPublicKey
}: { email: string; clientPublicKey: string } = req.body;
const user = await User.findOne({
email
}).select('+salt +verifier');
if (!user) throw new Error('Failed to find user');
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier
},
async () => {
// generate server-side public key
const serverPublicKey = server.getPublicKey();
await LoginSRPDetail.findOneAndReplace({ email: email }, {
email: email,
clientPublicKey: clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt),
}, { upsert: true, returnNewDocument: false });
return res.status(200).send({
serverPublicKey,
salt: user.salt
});
}
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to start authentication process'
});
}
};
/**
* Log in user step 2: complete step 2 of the SRP protocol and return an auth token and the
* user's (encrypted) private key
* @param req
* @param res
* @returns
*/
export const login2 = async (req: Request, res: Response) => {
try {
if (!req.headers['user-agent']) throw InternalServerError({ message: 'User-Agent header is required' });
const { email, clientProof } = req.body;
const user = await User.findOne({
email
}).select('+salt +verifier +encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +publicKey +encryptedPrivateKey +iv +tag');
if (!user) throw new Error('Failed to find user');
const loginSRPDetail = await LoginSRPDetail.findOneAndDelete({ email: email })
if (!loginSRPDetail) {
throw BadRequestError({ message: "Failed to find login details for SRP" })
}
const server = new jsrp.server();
server.init(
{
salt: user.salt,
verifier: user.verifier,
b: loginSRPDetail.serverBInt
},
async () => {
server.setClientPublicKey(loginSRPDetail.clientPublicKey);
// compare server and client shared keys
if (server.checkClientProof(clientProof)) {
if (user.isMfaEnabled) {
// case: user has MFA enabled
// generate temporary MFA token
const token = createToken({
payload: {
userId: user._id.toString()
},
expiresIn: JWT_MFA_LIFETIME,
secret: JWT_MFA_SECRET
});
const code = await TokenService.createToken({
type: TOKEN_EMAIL_MFA,
email
});
// send MFA code [code] to [email]
await sendMail({
template: 'emailMfa.handlebars',
subjectLine: 'Infisical MFA code',
recipients: [email],
substitutions: {
code
}
});
return res.status(200).send({
mfaEnabled: true,
token
});
}
await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});
// issue tokens
const tokens = await issueAuthTokens({ userId: user._id.toString() });
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
// case: user does not have MFA enabled
// return (access) token in response
interface ResponseData {
mfaEnabled: boolean;
encryptionVersion: any;
protectedKey?: string;
protectedKeyIV?: string;
protectedKeyTag?: string;
token: string;
publicKey?: string;
encryptedPrivateKey?: string;
iv?: string;
tag?: string;
}
const response: ResponseData = {
mfaEnabled: false,
encryptionVersion: user.encryptionVersion,
token: tokens.token,
publicKey: user.publicKey,
encryptedPrivateKey: user.encryptedPrivateKey,
iv: user.iv,
tag: user.tag
}
if (
user?.protectedKey &&
user?.protectedKeyIV &&
user?.protectedKeyTag
) {
response.protectedKey = user.protectedKey;
response.protectedKeyIV = user.protectedKeyIV
response.protectedKeyTag = user.protectedKeyTag;
}
const loginAction = await EELogService.createAction({
name: ACTION_LOGIN,
userId: user._id
});
loginAction && await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getChannelFromUserAgent(req.headers['user-agent']),
ipAddress: req.ip
});
return res.status(200).send(response);
}
return res.status(400).send({
message: 'Failed to authenticate. Try again?'
});
}
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to authenticate. Try again?'
});
}
};
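For context, a minimal sketch of the client side of this SRP exchange, assuming the jsrp client API and hypothetical endpoint paths:

// Hypothetical client-side flow -- endpoint paths are assumptions; jsrp usage follows its documented client API.
const client = new jsrp.client();
client.init({ username: email, password }, async () => {
  // step 1: send client public key A, receive salt and server public key B
  const { data: step1 } = await axios.post('/api/v2/auth/login1', { email, clientPublicKey: client.getPublicKey() });
  client.setSalt(step1.salt);
  client.setServerPublicKey(step1.serverPublicKey);
  // step 2: send proof M1; login2 verifies it via server.checkClientProof above
  const { data: step2 } = await axios.post('/api/v2/auth/login2', { email, clientProof: client.getProof() });
  // step2.mfaEnabled === true means an MFA code was emailed and step2.token is a temporary MFA JWT
});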
/**
* Send MFA token to email [email]
* @param req
* @param res
*/
export const sendMfaToken = async (req: Request, res: Response) => {
try {
const { email } = req.body;
const code = await TokenService.createToken({
type: TOKEN_EMAIL_MFA,
email
});
// send MFA code [code] to [email]
await sendMail({
template: 'emailMfa.handlebars',
subjectLine: 'Infisical MFA code',
recipients: [email],
substitutions: {
code
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to send MFA code'
});
}
return res.status(200).send({
message: 'Successfully sent new MFA code'
});
}
/**
* Verify MFA token [mfaToken] and, if it is valid, issue JWT and refresh tokens
* @param req
* @param res
*/
export const verifyMfaToken = async (req: Request, res: Response) => {
const { email, mfaToken } = req.body;
await TokenService.validateToken({
type: TOKEN_EMAIL_MFA,
email,
token: mfaToken
});
const user = await User.findOne({
email
}).select('+salt +verifier +encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +publicKey +encryptedPrivateKey +iv +tag');
if (!user) throw new Error('Failed to find user');
await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});
// issue tokens
const tokens = await issueAuthTokens({ userId: user._id.toString() });
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
interface VerifyMfaTokenRes {
encryptionVersion: number;
protectedKey?: string;
protectedKeyIV?: string;
protectedKeyTag?: string;
token: string;
publicKey: string;
encryptedPrivateKey: string;
iv: string;
tag: string;
}
const resObj: VerifyMfaTokenRes = {
encryptionVersion: user.encryptionVersion,
token: tokens.token,
publicKey: user.publicKey as string,
encryptedPrivateKey: user.encryptedPrivateKey as string,
iv: user.iv as string,
tag: user.tag as string
}
if (user?.protectedKey && user?.protectedKeyIV && user?.protectedKeyTag) {
resObj.protectedKey = user.protectedKey;
resObj.protectedKeyIV = user.protectedKeyIV;
resObj.protectedKeyTag = user.protectedKeyTag;
}
const loginAction = await EELogService.createAction({
name: ACTION_LOGIN,
userId: user._id
});
loginAction && await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getChannelFromUserAgent(req.headers['user-agent']),
ipAddress: req.ip
});
return res.status(200).send(resObj);
}
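A sketch of the follow-up request once login2 responds with mfaEnabled: true, assuming the temporary MFA token is presented as a bearer token (the route path and header placement are assumptions):

// Hypothetical call after login2 returned { mfaEnabled: true, token }.
await axios.post('/api/v2/auth/mfa/verify',
  { email, mfaToken: codeFromEmail },                          // the 6-digit code delivered via emailMfa.handlebars
  { headers: { Authorization: `Bearer ${temporaryMfaJwt}` } }  // temp JWT from login2, signed with JWT_MFA_SECRET
);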

View File

@ -1,3 +1,5 @@
import * as authController from './authController';
import * as signupController from './signupController';
import * as usersController from './usersController';
import * as organizationsController from './organizationsController';
import * as workspaceController from './workspaceController';
@ -9,6 +11,8 @@ import * as environmentController from './environmentController';
import * as tagController from './tagController';
export {
authController,
signupController,
usersController,
organizationsController,
workspaceController,

View File

@ -1,7 +1,8 @@
import to from 'await-to-js';
import { Types } from 'mongoose';
import { Request, Response } from 'express';
import { ISecret, Membership, Secret, Workspace } from '../../models';
import { ISecret, Secret } from '../../models';
import { IAction } from '../../ee/models';
import {
SECRET_PERSONAL,
SECRET_SHARED,
@ -20,6 +21,252 @@ import { ABILITY_READ, ABILITY_WRITE } from '../../variables/organization';
import { userHasNoAbility, userHasWorkspaceAccess, userHasWriteOnlyAbility } from '../../ee/helpers/checkMembershipPermissions';
import Tag from '../../models/tag';
import _ from 'lodash';
import {
BatchSecretRequest,
BatchSecret
} from '../../types/secret';
/**
* Perform a batch of CUD (create, update, delete) secret operations
* @param req
* @param res
*/
export const batchSecrets = async (req: Request, res: Response) => {
const channel = getChannelFromUserAgent(req.headers['user-agent']);
const {
workspaceId,
environment,
requests
}: {
workspaceId: string;
environment: string;
requests: BatchSecretRequest[];
} = req.body;
const createSecrets: BatchSecret[] = [];
const updateSecrets: BatchSecret[] = [];
const deleteSecrets: Types.ObjectId[] = [];
const actions: IAction[] = [];
requests.forEach((request) => {
switch (request.method) {
case 'POST':
createSecrets.push({
...request.secret,
version: 1,
user: request.secret.type === SECRET_PERSONAL ? req.user : undefined,
environment,
workspace: new Types.ObjectId(workspaceId)
});
break;
case 'PATCH':
updateSecrets.push({
...request.secret,
_id: new Types.ObjectId(request.secret._id)
});
break;
case 'DELETE':
deleteSecrets.push(new Types.ObjectId(request.secret._id));
break;
}
});
// handle create secrets
let createdSecrets: ISecret[] = [];
if (createSecrets.length > 0) {
createdSecrets = await Secret.insertMany(createSecrets);
// (EE) add secret versions for new secrets
await EESecretService.addSecretVersions({
secretVersions: createdSecrets.map((n: any) => {
return ({
...n._doc,
_id: new Types.ObjectId(),
secret: n._id,
isDeleted: false
});
})
});
const addAction = await EELogService.createAction({
name: ACTION_ADD_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: createdSecrets.map((n) => n._id)
}) as IAction;
actions.push(addAction);
if (postHogClient) {
postHogClient.capture({
event: 'secrets added',
distinctId: req.user.email,
properties: {
numberOfSecrets: createdSecrets.length,
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
}
});
}
}
// handle update secrets
let updatedSecrets: ISecret[] = [];
if (updateSecrets.length > 0 && req.secrets) {
// construct object containing all secrets
let listedSecretsObj: {
[key: string]: {
version: number;
type: string;
}
} = {};
listedSecretsObj = req.secrets.reduce((obj: any, secret: ISecret) => ({
...obj,
[secret._id.toString()]: secret
}), {});
const updateOperations = updateSecrets.map((u) => ({
updateOne: {
filter: { _id: new Types.ObjectId(u._id) },
update: {
$inc: {
version: 1
},
...u,
_id: new Types.ObjectId(u._id)
}
}
}));
await Secret.bulkWrite(updateOperations);
const secretVersions = updateSecrets.map((u) => ({
secret: new Types.ObjectId(u._id),
version: listedSecretsObj[u._id.toString()].version,
workspace: new Types.ObjectId(workspaceId),
type: listedSecretsObj[u._id.toString()].type,
environment,
isDeleted: false,
secretKeyCiphertext: u.secretKeyCiphertext,
secretKeyIV: u.secretKeyIV,
secretKeyTag: u.secretKeyTag,
secretValueCiphertext: u.secretValueCiphertext,
secretValueIV: u.secretValueIV,
secretValueTag: u.secretValueTag,
secretCommentCiphertext: u.secretCommentCiphertext,
secretCommentIV: u.secretCommentIV,
secretCommentTag: u.secretCommentTag,
tags: u.tags
}));
await EESecretService.addSecretVersions({
secretVersions
});
updatedSecrets = await Secret.find({
_id: {
$in: updateSecrets.map((u) => new Types.ObjectId(u._id))
}
});
const updateAction = await EELogService.createAction({
name: ACTION_UPDATE_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: updatedSecrets.map((u) => u._id)
}) as IAction;
actions.push(updateAction);
if (postHogClient) {
postHogClient.capture({
event: 'secrets modified',
distinctId: req.user.email,
properties: {
numberOfSecrets: updateSecrets.length,
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
}
});
}
}
// handle delete secrets
if (deleteSecrets.length > 0) {
await Secret.deleteMany({
_id: {
$in: deleteSecrets
}
});
await EESecretService.markDeletedSecretVersions({
secretIds: deleteSecrets
});
const deleteAction = await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
userId: req.user._id,
workspaceId: new Types.ObjectId(workspaceId),
secretIds: deleteSecrets
}) as IAction;
actions.push(deleteAction);
if (postHogClient) {
postHogClient.capture({
event: 'secrets deleted',
distinctId: req.user.email,
properties: {
numberOfSecrets: deleteSecrets.length,
environment,
workspaceId,
channel: channel,
userAgent: req.headers?.['user-agent']
}
});
}
}
if (actions.length > 0) {
// (EE) create (audit) log
await EELogService.createLog({
userId: req.user._id.toString(),
workspaceId: new Types.ObjectId(workspaceId),
actions,
channel,
ipAddress: req.ip
});
}
// trigger event - push secrets
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId
})
});
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId
});
const resObj: { [key: string]: ISecret[] | string[] } = {}
if (createSecrets.length > 0) {
resObj['createdSecrets'] = createdSecrets;
}
if (updateSecrets.length > 0) {
resObj['updatedSecrets'] = updatedSecrets;
}
if (deleteSecrets.length > 0) {
resObj['deletedSecrets'] = deleteSecrets.map((d) => d.toString());
}
return res.status(200).send(resObj);
}
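A sketch of the request body batchSecrets consumes, assuming BatchSecretRequest pairs a method with a secret payload as used above; field values are illustrative:

// Hypothetical batch payload (ciphertext fields shortened for readability).
const body = {
  workspaceId,
  environment: 'dev',
  requests: [
    { method: 'POST', secret: { type: 'shared', /* assumed value of SECRET_SHARED */ secretKeyCiphertext, secretKeyIV, secretKeyTag, secretValueCiphertext, secretValueIV, secretValueTag } },
    { method: 'PATCH', secret: { _id: existingSecretId, /* plus the same ciphertext fields, re-encrypted; version is bumped server-side */ } },
    { method: 'DELETE', secret: { _id: obsoleteSecretId } }
  ]
};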
/**
* Create secret(s) for workspace with id [workspaceId] and environment [environment]
@ -166,11 +413,9 @@ export const createSecrets = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,
@ -187,11 +432,9 @@ export const createSecrets = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash,
secretCommentCiphertext,
secretCommentIV,
secretCommentTag,

View File

@ -0,0 +1,250 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import { User, MembershipOrg } from '../../models';
import { completeAccount } from '../../helpers/user';
import {
initializeDefaultOrg
} from '../../helpers/signup';
import { issueAuthTokens } from '../../helpers/auth';
import { INVITED, ACCEPTED } from '../../variables';
import { NODE_ENV } from '../../config';
import request from '../../config/request';
/**
* Complete setting up user by adding their personal and auth information as part of the
* signup flow
* @param req
* @param res
* @returns
*/
export const completeAccountSignup = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier,
organizationName
}: {
email: string;
firstName: string;
lastName: string;
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
publicKey: string;
encryptedPrivateKey: string;
encryptedPrivateKeyIV: string;
encryptedPrivateKeyTag: string;
salt: string;
verifier: string;
organizationName: string;
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || user?.publicKey) {
// case 1: user doesn't exist
// case 2: user has already completed their account
return res.status(403).send({
error: 'Failed to complete account for a non-existent or already-completed user'
});
}
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user'); // ensure user is non-null
// initialize default organization and workspace
await initializeDefaultOrg({
organizationName,
user
});
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueAuthTokens({
userId: user._id.toString()
});
token = tokens.token;
// sending a welcome email to new users
if (process.env.LOOPS_API_KEY) {
await request.post("https://app.loops.so/api/v1/events/send", {
"email": email,
"eventName": "Sign Up",
"firstName": firstName,
"lastName": lastName
}, {
headers: {
"Accept": "application/json",
"Authorization": "Bearer " + process.env.LOOPS_API_KEY
},
});
}
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token
});
};
/**
* Complete setting up user by adding their personal and auth information as part of the
* invite flow
* @param req
* @param res
* @returns
*/
export const completeAccountInvite = async (req: Request, res: Response) => {
let user, token, refreshToken;
try {
const {
email,
firstName,
lastName,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
} = req.body;
// get user
user = await User.findOne({ email });
if (!user || user?.publicKey) {
// case 1: user doesn't exist
// case 2: user has already completed their account
return res.status(403).send({
error: 'Failed to complete account for a non-existent or already-completed user'
});
}
const membershipOrg = await MembershipOrg.findOne({
inviteEmail: email,
status: INVITED
});
if (!membershipOrg) throw new Error('Failed to find invitations for email');
// complete setting up user's account
user = await completeAccount({
userId: user._id.toString(),
firstName,
lastName,
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
});
if (!user)
throw new Error('Failed to complete account for non-existent user');
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
{
inviteEmail: email,
status: INVITED
},
{
user,
status: ACCEPTED
}
);
// issue tokens
const tokens = await issueAuthTokens({
userId: user._id.toString()
});
token = tokens.token;
// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
httpOnly: true,
path: '/',
sameSite: 'strict',
secure: NODE_ENV === 'production' ? true : false
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
return res.status(400).send({
message: 'Failed to complete account setup'
});
}
return res.status(200).send({
message: 'Successfully set up account',
user,
token
});
};

View File

@ -55,6 +55,44 @@ export const getMe = async (req: Request, res: Response) => {
});
}
/**
* Update the current user's MFA-enabled status [isMfaEnabled].
* Note: Infisical currently supports email-based 2FA only; this will expand to
* include SMS and authenticator app modes of authentication in the future.
* @param req
* @param res
* @returns
*/
export const updateMyMfaEnabled = async (req: Request, res: Response) => {
let user;
try {
const { isMfaEnabled }: { isMfaEnabled: boolean } = req.body;
req.user.isMfaEnabled = isMfaEnabled;
if (isMfaEnabled) {
// TODO: adapt this route/controller
// to work for different forms of MFA
req.user.mfaMethods = ['email'];
} else {
req.user.mfaMethods = [];
}
await req.user.save();
user = req.user;
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
return res.status(400).send({
message: "Failed to update current user's MFA status"
});
}
return res.status(200).send({
user
});
}
/**
* Return organizations that the current user is part of.
* @param req

View File

@ -158,11 +158,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
} = oldSecretVersion;
// update secret
@ -179,11 +177,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
},
{
new: true
@ -204,11 +200,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
secretKeyCiphertext,
secretKeyIV,
secretKeyTag,
secretKeyHash,
secretValueCiphertext,
secretValueIV,
secretValueTag,
secretValueHash
secretValueTag
}).save();
// take secret snapshot

View File

@ -5,22 +5,19 @@ import {
} from '../../variables';
export interface ISecretVersion {
_id: Types.ObjectId;
secret: Types.ObjectId;
version: number;
workspace: Types.ObjectId; // new
type: string; // new
user: Types.ObjectId; // new
user?: Types.ObjectId; // new
environment: string; // new
isDeleted: boolean;
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretKeyHash: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretValueHash: string;
tags?: string[];
}
@ -72,9 +69,6 @@ const secretVersionSchema = new Schema<ISecretVersion>(
type: String, // symmetric
required: true
},
secretKeyHash: {
type: String
},
secretValueCiphertext: {
type: String,
required: true
@ -87,9 +81,6 @@ const secretVersionSchema = new Schema<ISecretVersion>(
type: String, // symmetric
required: true
},
secretValueHash: {
type: String
},
tags: {
ref: 'Tag',
type: [Schema.Types.ObjectId],

View File

@ -211,7 +211,7 @@ const getAuthAPIKeyPayload = async ({
* @return {String} obj.token - issued JWT token
* @return {String} obj.refreshToken - issued refresh token
*/
const issueTokens = async ({ userId }: { userId: string }) => {
const issueAuthTokens = async ({ userId }: { userId: string }) => {
let token: string;
let refreshToken: string;
try {
@ -298,6 +298,6 @@ export {
getAuthSTDPayload,
getAuthAPIKeyPayload,
createToken,
issueTokens,
issueAuthTokens,
clearTokens
};

View File

@ -713,10 +713,27 @@ const reformatPullSecrets = ({ secrets }: { secrets: ISecret[] }) => {
return reformatedSecrets;
};
const secretObjectHasRequiredFields = (secretObject: ISecret) => {
if (!secretObject.type ||
!(secretObject.type === SECRET_PERSONAL || secretObject.type === SECRET_SHARED) ||
!secretObject.secretKeyCiphertext ||
!secretObject.secretKeyIV ||
!secretObject.secretKeyTag ||
(typeof secretObject.secretValueCiphertext !== 'string') ||
!secretObject.secretValueIV ||
!secretObject.secretValueTag) {
return false
}
return true
}
export {
validateSecrets,
v1PushSecrets,
v2PushSecrets,
pullSecrets,
reformatPullSecrets
reformatPullSecrets,
secretObjectHasRequiredFields
};
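A brief usage sketch for the new helper, assuming it guards incoming secrets before a change is accepted:

// Hypothetical usage -- reject a request whose secrets are missing required ciphertext fields.
const everySecretIsComplete = (req.body.secrets as ISecret[]).every(secretObjectHasRequiredFields);
if (!everySecretIsComplete) {
  throw BadRequestError({ message: 'One or more secrets are missing required fields' });
}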

View File

@ -1,13 +1,11 @@
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
import { Token, IToken, IUser } from '../models';
import { IUser } from '../models';
import { createOrganization } from './organization';
import { addMembershipsOrg } from './membershipOrg';
import { createWorkspace } from './workspace';
import { addMemberships } from './membership';
import { OWNER, ADMIN, ACCEPTED } from '../variables';
import { OWNER, ACCEPTED } from '../variables';
import { sendMail } from '../helpers/nodemailer';
import { EMAIL_TOKEN_LIFETIME } from '../config';
import { TokenService } from '../services';
import { TOKEN_EMAIL_CONFIRMATION } from '../variables';
/**
* Send magic link to verify email to [email]
@ -15,22 +13,13 @@ import { EMAIL_TOKEN_LIFETIME } from '../config';
* @param {Object} obj
* @param {String} obj.email - email
* @returns {Boolean} success - whether or not operation was successful
*
*/
const sendEmailVerification = async ({ email }: { email: string }) => {
try {
const token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
await Token.findOneAndUpdate(
{ email },
{
email,
token,
createdAt: new Date(),
ttl: Math.floor(+new Date() / 1000) + EMAIL_TOKEN_LIFETIME // time in seconds, i.e unix
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_CONFIRMATION,
email
});
// send mail
await sendMail({
@ -64,21 +53,11 @@ const checkEmailVerification = async ({
code: string;
}) => {
try {
const token = await Token.findOne({
await TokenService.validateToken({
type: TOKEN_EMAIL_CONFIRMATION,
email,
token: code
});
if (token && Math.floor(Date.now() / 1000) > token.ttl) {
await Token.deleteOne({
email,
token: code
});
throw new Error('Verification token has expired')
}
if (!token) throw new Error('Failed to find email verification token');
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
@ -114,18 +93,6 @@ const initializeDefaultOrg = async ({
roles: [OWNER],
statuses: [ACCEPTED]
});
// initialize a default workspace inside the new organization
const workspace = await createWorkspace({
name: `Example Project`,
organizationId: organization._id.toString()
});
await addMemberships({
userIds: [user._id.toString()],
workspaceId: workspace._id.toString(),
roles: [ADMIN]
});
} catch (err) {
throw new Error(`Failed to initialize default organization and workspace [err=${err}]`);
}

View File

@ -0,0 +1,217 @@
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import { TokenData } from '../models';
import crypto from 'crypto';
import bcrypt from 'bcrypt';
import {
TOKEN_EMAIL_CONFIRMATION,
TOKEN_EMAIL_MFA,
TOKEN_EMAIL_ORG_INVITATION,
TOKEN_EMAIL_PASSWORD_RESET
} from '../variables';
import {
SALT_ROUNDS
} from '../config';
import { UnauthorizedRequestError } from '../utils/errors';
/**
* Create and store a token in the database for purpose [type]
* @param {Object} obj
* @param {String} obj.type
* @param {String} obj.email
* @param {String} obj.phoneNumber
* @param {Types.ObjectId} obj.organizationId
* @returns {String} token - the created token
*/
const createTokenHelper = async ({
type,
email,
phoneNumber,
organizationId
}: {
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
email?: string;
phoneNumber?: string;
organizationId?: Types.ObjectId
}) => {
let token, expiresAt, triesLeft;
try {
// generate random token based on specified token use-case
// type [type]
switch (type) {
case TOKEN_EMAIL_CONFIRMATION:
// generate random 6-digit code
token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
expiresAt = new Date((new Date()).getTime() + 86400000); // 24 hours
break;
case TOKEN_EMAIL_MFA:
// generate random 6-digit code
token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
triesLeft = 5;
expiresAt = new Date((new Date()).getTime() + 300000); // 5 minutes
break;
case TOKEN_EMAIL_ORG_INVITATION:
// generate random hex
token = crypto.randomBytes(16).toString('hex');
expiresAt = new Date((new Date()).getTime() + 259200000); // 72 hours (3 days)
break;
case TOKEN_EMAIL_PASSWORD_RESET:
// generate random hex
token = crypto.randomBytes(16).toString('hex');
expiresAt = new Date((new Date()).getTime() + 86400000); // 24 hours
break;
default:
token = crypto.randomBytes(16).toString('hex');
expiresAt = new Date();
break;
}
interface TokenDataQuery {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
}
interface TokenDataUpdate {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
tokenHash: string;
triesLeft?: number;
expiresAt: Date;
}
const query: TokenDataQuery = { type };
const update: TokenDataUpdate = {
type,
tokenHash: await bcrypt.hash(token, SALT_ROUNDS),
expiresAt
}
if (email) {
query.email = email;
update.email = email;
}
if (phoneNumber) {
query.phoneNumber = phoneNumber;
update.phoneNumber = phoneNumber;
}
if (organizationId) {
query.organization = organizationId
update.organization = organizationId
}
if (triesLeft) {
update.triesLeft = triesLeft;
}
await TokenData.findOneAndUpdate(
query,
update,
{
new: true,
upsert: true
}
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error(
"Failed to create token"
);
}
return token;
}
/**
* Validate that token [token] matches the stored token for purpose [type] and the given
* identifiers (email, phone number, and/or organization); the stored token is deleted on success
* @param {Object} obj
* @param {String} obj.email - email associated with the token
* @param {String} obj.token - value of the token
*/
const validateTokenHelper = async ({
type,
email,
phoneNumber,
organizationId,
token
}: {
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
email?: string;
phoneNumber?: string;
organizationId?: Types.ObjectId;
token: string;
}) => {
interface Query {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
}
const query: Query = { type };
if (email) { query.email = email; }
if (phoneNumber) { query.phoneNumber = phoneNumber; }
if (organizationId) { query.organization = organizationId; }
const tokenData = await TokenData.findOne(query).select('+tokenHash');
if (!tokenData) throw new Error('Failed to find token to validate');
if (tokenData.expiresAt < new Date()) {
// case: token expired
await TokenData.findByIdAndDelete(tokenData._id);
throw UnauthorizedRequestError({
message: 'MFA session expired. Please log in again',
context: {
code: 'mfa_expired'
}
});
}
const isValid = await bcrypt.compare(token, tokenData.tokenHash);
if (!isValid) {
// case: token is not valid
if (tokenData?.triesLeft !== undefined) {
// case: token has a try-limit
if (tokenData.triesLeft === 1) {
// case: token is out of tries
await TokenData.findByIdAndDelete(tokenData._id);
} else {
// case: token has more than 1 try left
await TokenData.findByIdAndUpdate(tokenData._id, {
triesLeft: tokenData.triesLeft - 1
}, {
new: true
});
}
throw UnauthorizedRequestError({
message: 'MFA code is invalid',
context: {
code: 'mfa_invalid',
triesLeft: tokenData.triesLeft - 1
}
});
}
throw UnauthorizedRequestError({
message: 'MFA code is invalid',
context: {
code: 'mfa_invalid'
}
});
}
// case: token is valid
await TokenData.findByIdAndDelete(tokenData._id);
}
export {
createTokenHelper,
validateTokenHelper
}
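Callers reach these helpers through TokenService, as in the auth controllers above; a minimal usage sketch:

// Usage sketch based on the TokenService calls shown in the controllers above; email and submittedCode are illustrative.
const code = await TokenService.createToken({ type: TOKEN_EMAIL_MFA, email });
// ...deliver the 6-digit code by email, then later:
await TokenService.validateToken({ type: TOKEN_EMAIL_MFA, email, token: submittedCode });
// validateTokenHelper deletes the TokenData document on success and throws UnauthorizedRequestError otherwise.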

View File

@ -1,5 +1,6 @@
import * as Sentry from '@sentry/node';
import { User, IUser } from '../models';
import { IUser, User } from '../models';
import { sendMail } from './nodemailer';
/**
* Initialize a user under email [email]
@ -28,10 +29,14 @@ const setupAccount = async ({ email }: { email: string }) => {
* @param {String} obj.userId - id of user to finish setting up
* @param {String} obj.firstName - first name of user
* @param {String} obj.lastName - last name of user
* @param {Number} obj.encryptionVersion - version of auth encryption scheme used
* @param {String} obj.protectedKey - protected key in encryption version 2
* @param {String} obj.protectedKeyIV - IV of protected key in encryption version 2
* @param {String} obj.protectedKeyTag - tag of protected key in encryption version 2
* @param {String} obj.publicKey - public key of user
* @param {String} obj.encryptedPrivateKey - (encrypted) private key of user
* @param {String} obj.iv - iv for (encrypted) private key of user
* @param {String} obj.tag - tag for (encrypted) private key of user
* @param {String} obj.encryptedPrivateKeyIV - iv for (encrypted) private key of user
* @param {String} obj.encryptedPrivateKeyTag - tag for (encrypted) private key of user
* @param {String} obj.salt - salt for auth SRP
* @param {String} obj.verifier - verifier for auth SRP
* @returns {Object} user - the completed user
@ -40,20 +45,28 @@ const completeAccount = async ({
userId,
firstName,
lastName,
encryptionVersion,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
iv,
tag,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
}: {
userId: string;
firstName: string;
lastName: string;
encryptionVersion: number;
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
publicKey: string;
encryptedPrivateKey: string;
iv: string;
tag: string;
encryptedPrivateKeyIV: string;
encryptedPrivateKeyTag: string;
salt: string;
verifier: string;
}) => {
@ -67,10 +80,14 @@ const completeAccount = async ({
{
firstName,
lastName,
encryptionVersion,
protectedKey,
protectedKeyIV,
protectedKeyTag,
publicKey,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
@ -85,4 +102,48 @@ const completeAccount = async ({
return user;
};
export { setupAccount, completeAccount };
/**
* Check if device with ip [ip] and user-agent [userAgent] has been seen for user [user].
* If the device is unseen, then notify the user of the new device
* @param {Object} obj
* @param {IUser} obj.user - user to check the device against
* @param {String} obj.ip - login ip address
* @param {String} obj.userAgent - login user-agent
*/
const checkUserDevice = async ({
user,
ip,
userAgent
}: {
user: IUser;
ip: string;
userAgent: string;
}) => {
const isDeviceSeen = user.devices.some((device) => device.ip === ip && device.userAgent === userAgent);
if (!isDeviceSeen) {
// case: unseen login ip detected for user
// -> notify user about the sign-in from new ip
user.devices = user.devices.concat([{
ip: String(ip),
userAgent
}]);
await user.save();
// notify user of the sign-in from a new device
await sendMail({
template: 'newDevice.handlebars',
subjectLine: `Successful login from new device`,
recipients: [user.email],
substitutions: {
email: user.email,
timestamp: new Date().toString(),
ip,
userAgent
}
});
}
}
export { setupAccount, completeAccount, checkUserDevice };

View File

@ -1,7 +1,7 @@
import axios from "axios";
import * as Sentry from "@sentry/node";
import { Octokit } from "@octokit/rest";
import { IIntegrationAuth } from "../models";
import request from '../config/request';
import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
@ -13,12 +13,14 @@ import {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
INTEGRATION_HEROKU_API_URL,
INTEGRATION_VERCEL_API_URL,
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_RENDER_API_URL,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_TRAVISCI_API_URL,
} from "../variables";
/**
@ -42,7 +44,7 @@ const getApps = async ({
owner?: string;
}
let apps: App[];
let apps: App[] = [];
try {
switch (integrationAuth.integration) {
case INTEGRATION_AZURE_KEY_VAULT:
@ -90,6 +92,11 @@ const getApps = async ({
accessToken,
});
break;
case INTEGRATION_TRAVISCI:
apps = await getAppsTravisCI({
accessToken,
})
break;
}
} catch (err) {
Sentry.setUser(null);
@ -111,7 +118,7 @@ const getAppsHeroku = async ({ accessToken }: { accessToken: string }) => {
let apps;
try {
const res = (
await axios.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
await request.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
headers: {
Accept: "application/vnd.heroku+json; version=3",
Authorization: `Bearer ${accessToken}`,
@ -148,7 +155,7 @@ const getAppsVercel = async ({
let apps;
try {
const res = (
await axios.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
await request.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
@ -186,7 +193,7 @@ const getAppsNetlify = async ({ accessToken }: { accessToken: string }) => {
let apps;
try {
const res = (
await axios.get(`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`, {
await request.get(`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`, {
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
@ -257,7 +264,7 @@ const getAppsRender = async ({ accessToken }: { accessToken: string }) => {
let apps: any;
try {
const res = (
await axios.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
await request.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: 'application/json',
@ -303,23 +310,18 @@ const getAppsFlyio = async ({ accessToken }: { accessToken: string }) => {
}
`;
const res = (
await axios({
url: INTEGRATION_FLYIO_API_URL,
method: "post",
headers: {
Authorization: "Bearer " + accessToken,
const res = (await request.post(INTEGRATION_FLYIO_API_URL, {
query,
variables: {
role: null,
},
}, {
headers: {
Authorization: "Bearer " + accessToken,
'Accept': 'application/json',
'Accept-Encoding': 'application/json',
},
data: {
query,
variables: {
role: null,
},
},
})
).data.data.apps.nodes;
},
})).data.data.apps.nodes;
apps = res.map((a: any) => ({
name: a.name,
@ -344,7 +346,7 @@ const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
let apps: any;
try {
const res = (
await axios.get(
await request.get(
`${INTEGRATION_CIRCLECI_API_URL}/v1.1/projects`,
{
headers: {
@ -369,4 +371,34 @@ const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
return apps;
};
const getAppsTravisCI = async ({ accessToken }: { accessToken: string }) => {
let apps: any;
try {
const res = (
await request.get(
`${INTEGRATION_TRAVISCI_API_URL}/repos`,
{
headers: {
"Authorization": `token ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
)
).data;
apps = res?.map((a: any) => {
return {
name: a?.slug?.split("/")[1],
appId: a?.id,
}
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to get TravisCI projects");
}
return apps;
}
export { getApps };

View File

@ -1,4 +1,4 @@
import axios from 'axios';
import request from '../config/request';
import * as Sentry from '@sentry/node';
import {
INTEGRATION_AZURE_KEY_VAULT,
@ -136,9 +136,9 @@ const exchangeCodeAzure = async ({
const accessExpiresAt = new Date();
let res: ExchangeCodeAzureResponse;
try {
res = (await axios.post(
res = (await request.post(
INTEGRATION_AZURE_TOKEN_URL,
new URLSearchParams({
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
scope: 'https://vault.azure.net/.default openid offline_access',
@ -147,16 +147,16 @@ const exchangeCodeAzure = async ({
redirect_uri: `${SITE_URL}/integrations/azure-key-vault/oauth2/callback`
} as any)
)).data;
accessExpiresAt.setSeconds(
accessExpiresAt.getSeconds() + res.expires_in
accessExpiresAt.getSeconds() + res.expires_in
);
} catch (err: any) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Azure');
}
return ({
accessToken: res.access_token,
refreshToken: res.refresh_token,
@ -175,36 +175,36 @@ const exchangeCodeAzure = async ({
* @returns {Date} obj2.accessExpiresAt - date of expiration for access token
*/
const exchangeCodeHeroku = async ({
code
code
}: {
code: string;
code: string;
}) => {
let res: ExchangeCodeHerokuResponse;
const accessExpiresAt = new Date();
try {
res = (await axios.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
client_secret: CLIENT_SECRET_HEROKU
} as any)
)).data;
accessExpiresAt.setSeconds(
accessExpiresAt.getSeconds() + res.expires_in
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Heroku');
}
return ({
accessToken: res.access_token,
refreshToken: res.refresh_token,
accessExpiresAt
});
let res: ExchangeCodeHerokuResponse;
const accessExpiresAt = new Date();
try {
res = (await request.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
code: code,
client_secret: CLIENT_SECRET_HEROKU
} as any)
)).data;
accessExpiresAt.setSeconds(
accessExpiresAt.getSeconds() + res.expires_in
);
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Heroku');
}
return ({
accessToken: res.access_token,
refreshToken: res.refresh_token,
accessExpiresAt
});
}
/**
@ -221,7 +221,7 @@ const exchangeCodeVercel = async ({ code }: { code: string }) => {
let res: ExchangeCodeVercelResponse;
try {
res = (
await axios.post(
await request.post(
INTEGRATION_VERCEL_TOKEN_URL,
new URLSearchParams({
code: code,
@ -234,7 +234,7 @@ const exchangeCodeVercel = async ({ code }: { code: string }) => {
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error('Failed OAuth2 code-token exchange with Vercel');
throw new Error(`Failed OAuth2 code-token exchange with Vercel [err=${err}]`);
}
return {
@ -260,7 +260,7 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
let accountId;
try {
res = (
await axios.post(
await request.post(
INTEGRATION_NETLIFY_TOKEN_URL,
new URLSearchParams({
grant_type: 'authorization_code',
@ -272,14 +272,14 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
)
).data;
const res2 = await axios.get('https://api.netlify.com/api/v1/sites', {
const res2 = await request.get('https://api.netlify.com/api/v1/sites', {
headers: {
Authorization: `Bearer ${res.access_token}`
}
});
const res3 = (
await axios.get('https://api.netlify.com/api/v1/accounts', {
await request.get('https://api.netlify.com/api/v1/accounts', {
headers: {
Authorization: `Bearer ${res.access_token}`
}
@ -314,7 +314,7 @@ const exchangeCodeGithub = async ({ code }: { code: string }) => {
let res: ExchangeCodeGithubResponse;
try {
res = (
await axios.get(INTEGRATION_GITHUB_TOKEN_URL, {
await request.get(INTEGRATION_GITHUB_TOKEN_URL, {
params: {
client_id: CLIENT_ID_GITHUB,
client_secret: CLIENT_SECRET_GITHUB,

View File

@ -1,4 +1,4 @@
import axios from 'axios';
import request from '../config/request';
import * as Sentry from '@sentry/node';
import { INTEGRATION_AZURE_KEY_VAULT, INTEGRATION_HEROKU } from '../variables';
import {
@ -71,7 +71,7 @@ const exchangeRefreshAzure = async ({
refreshToken: string;
}) => {
try {
const res: RefreshTokenAzureResponse = (await axios.post(
const res: RefreshTokenAzureResponse = (await request.post(
INTEGRATION_AZURE_TOKEN_URL,
new URLSearchParams({
client_id: CLIENT_ID_AZURE,
@ -105,7 +105,7 @@ const exchangeRefreshHeroku = async ({
let accessToken;
try {
const res = await axios.post(
const res = await request.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: 'refresh_token',

View File

@ -1,4 +1,3 @@
import axios from "axios";
import * as Sentry from "@sentry/node";
import _ from 'lodash';
import AWS from 'aws-sdk';
@ -23,14 +22,16 @@ import {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
INTEGRATION_HEROKU_API_URL,
INTEGRATION_VERCEL_API_URL,
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_RENDER_API_URL,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_TRAVISCI_API_URL,
} from "../variables";
import { access, appendFile } from "fs";
import request from '../config/request';
/**
* Sync/push [secrets] to [app] in integration named [integration]
@ -129,6 +130,14 @@ const syncSecrets = async ({
secrets,
accessToken,
});
break;
case INTEGRATION_TRAVISCI:
await syncSecretsTravisCI({
integration,
secrets,
accessToken,
});
break;
}
} catch (err) {
Sentry.setUser(null);
@ -179,9 +188,10 @@ const syncSecretsAzureKeyVault = async ({
let result: GetAzureKeyVaultSecret[] = [];
while (url) {
const res = await axios.get(url, {
const res = await request.get(url, {
headers: {
Authorization: `Bearer ${accessToken}`
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
});
@ -200,9 +210,10 @@ const syncSecretsAzureKeyVault = async ({
lastSlashIndex = getAzureKeyVaultSecret.id.lastIndexOf('/');
}
const azureKeyVaultSecret = await axios.get(`${getAzureKeyVaultSecret.id}?api-version=7.3`, {
const azureKeyVaultSecret = await request.get(`${getAzureKeyVaultSecret.id}?api-version=7.3`, {
headers: {
'Authorization': `Bearer ${accessToken}`
'Authorization': `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
});
@ -252,14 +263,15 @@ const syncSecretsAzureKeyVault = async ({
// Sync/push set secrets
if (setSecrets.length > 0) {
setSecrets.forEach(async ({ key, value }) => {
await axios.put(
await request.put(
`${integration.app}/secrets/${key}?api-version=7.3`,
{
value
},
{
headers: {
Authorization: `Bearer ${accessToken}`
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
}
);
@ -268,9 +280,10 @@ const syncSecretsAzureKeyVault = async ({
if (deleteSecrets.length > 0) {
deleteSecrets.forEach(async (secret) => {
await axios.delete(`${integration.app}/secrets/${secret.key}?api-version=7.3`, {
await request.delete(`${integration.app}/secrets/${secret.key}?api-version=7.3`, {
headers: {
'Authorization': `Bearer ${accessToken}`
'Authorization': `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
});
});
@ -482,12 +495,13 @@ const syncSecretsHeroku = async ({
}) => {
try {
const herokuSecrets = (
await axios.get(
await request.get(
`${INTEGRATION_HEROKU_API_URL}/apps/${integration.app}/config-vars`,
{
headers: {
Accept: "application/vnd.heroku+json; version=3",
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
)
@ -499,13 +513,14 @@ const syncSecretsHeroku = async ({
}
});
await axios.patch(
await request.patch(
`${INTEGRATION_HEROKU_API_URL}/apps/${integration.app}/config-vars`,
secrets,
{
headers: {
Accept: "application/vnd.heroku+json; version=3",
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
@ -540,7 +555,7 @@ const syncSecretsVercel = async ({
value: string;
target: string[];
}
try {
// Get all (decrypted) secrets back from Vercel in
// decrypted format
@ -552,39 +567,85 @@ const syncSecretsVercel = async ({
}
: {}),
};
// const res = (
// await Promise.all(
// (
// await request.get(
// `${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
// {
// params,
// headers: {
// Authorization: `Bearer ${accessToken}`,
// 'Accept-Encoding': 'application/json'
// }
// }
// ))
// .data
// .envs
// .filter((secret: VercelSecret) => secret.target.includes(integration.targetEnvironment))
// .map(async (secret: VercelSecret) => {
// if (secret.type === 'encrypted') {
// // case: secret is encrypted -> need to decrypt
// const decryptedSecret = (await request.get(
// `${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
// {
// params,
// headers: {
// Authorization: `Bearer ${accessToken}`,
// 'Accept-Encoding': 'application/json'
// }
// }
// )).data;
const res = (
await Promise.all(
(
await axios.get(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
// return decryptedSecret;
// }
// return secret;
// }))).reduce((obj: any, secret: any) => ({
// ...obj,
// [secret.key]: secret
// }), {});
const vercelSecrets: VercelSecret[] = (await request.get(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
}
))
.data
.envs
.filter((secret: VercelSecret) => secret.target.includes(integration.targetEnvironment));
const res: { [key: string]: VercelSecret } = {};
for await (const vercelSecret of vercelSecrets) {
if (vercelSecret.type === 'encrypted') {
// case: secret is encrypted -> need to decrypt
const decryptedSecret = (await request.get(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${vercelSecret.id}`,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
}
}
))
.data
.envs
.filter((secret: VercelSecret) => secret.target.includes(integration.targetEnvironment))
.map(async (secret: VercelSecret) => (await axios.get(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`
}
}
)).data)
)).reduce((obj: any, secret: any) => ({
...obj,
[secret.key]: secret
}), {});
}
)).data;
const updateSecrets: VercelSecret[] = [];
const deleteSecrets: VercelSecret[] = [];
const newSecrets: VercelSecret[] = [];
res[vercelSecret.key] = decryptedSecret;
} else {
res[vercelSecret.key] = vercelSecret;
}
}
const updateSecrets: VercelSecret[] = [];
const deleteSecrets: VercelSecret[] = [];
const newSecrets: VercelSecret[] = [];
// Identify secrets to create
Object.keys(secrets).map((key) => {
@ -608,8 +669,10 @@ const syncSecretsVercel = async ({
id: res[key].id,
key: key,
value: secrets[key],
type: "encrypted",
target: [integration.targetEnvironment],
type: res[key].type,
target: res[key].target.includes(integration.targetEnvironment)
? [...res[key].target]
: [...res[key].target, integration.targetEnvironment]
});
}
} else {
@ -618,7 +681,7 @@ const syncSecretsVercel = async ({
id: res[key].id,
key: key,
value: res[key].value,
type: "encrypted",
type: "encrypted", // value doesn't matter
target: [integration.targetEnvironment],
});
}
@ -626,48 +689,47 @@ const syncSecretsVercel = async ({
// Sync/push new secrets
if (newSecrets.length > 0) {
await axios.post(
await request.post(
`${INTEGRATION_VERCEL_API_URL}/v10/projects/${integration.app}/env`,
newSecrets,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
}
// Sync/push updated secrets
if (updateSecrets.length > 0) {
updateSecrets.forEach(async (secret: VercelSecret) => {
for await (const secret of updateSecrets) {
if (secret.type !== 'sensitive') {
const { id, ...updatedSecret } = secret;
await axios.patch(
await request.patch(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
updatedSecret,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
});
}
}
// Delete secrets
if (deleteSecrets.length > 0) {
deleteSecrets.forEach(async (secret: VercelSecret) => {
await axios.delete(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`,
},
}
);
});
for await (const secret of deleteSecrets) {
await request.delete(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
{
params,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
}
} catch (err) {
Sentry.setUser(null);
@ -717,12 +779,13 @@ const syncSecretsNetlify = async ({
});
const res = (
await axios.get(
await request.get(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env`,
{
params: getParams,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
)
@ -830,13 +893,14 @@ const syncSecretsNetlify = async ({
});
if (newSecrets.length > 0) {
await axios.post(
await request.post(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env`,
newSecrets,
{
params: syncParams,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
@ -844,7 +908,7 @@ const syncSecretsNetlify = async ({
if (updateSecrets.length > 0) {
updateSecrets.forEach(async (secret: NetlifySecret) => {
await axios.patch(
await request.patch(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${secret.key}`,
{
context: secret.values[0].context,
@ -854,6 +918,7 @@ const syncSecretsNetlify = async ({
params: syncParams,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
@ -862,12 +927,13 @@ const syncSecretsNetlify = async ({
if (deleteSecrets.length > 0) {
deleteSecrets.forEach(async (key: string) => {
await axios.delete(
await request.delete(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${key}`,
{
params: syncParams,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
@ -876,12 +942,13 @@ const syncSecretsNetlify = async ({
if (deleteSecretValues.length > 0) {
deleteSecretValues.forEach(async (secret: NetlifySecret) => {
await axios.delete(
await request.delete(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${secret.key}/value/${secret.values[0].id}`,
{
params: syncParams,
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
@ -1026,7 +1093,7 @@ const syncSecretsRender = async ({
accessToken: string;
}) => {
try {
await axios.put(
await request.put(
`${INTEGRATION_RENDER_API_URL}/v1/services/${integration.appId}/env-vars`,
Object.keys(secrets).map((key) => ({
key,
@ -1035,6 +1102,7 @@ const syncSecretsRender = async ({
{
headers: {
Authorization: `Bearer ${accessToken}`,
'Accept-Encoding': 'application/json'
},
}
);
@ -1083,23 +1151,21 @@ const syncSecretsFlyio = async ({
}
`;
await axios({
url: INTEGRATION_FLYIO_API_URL,
method: "post",
await request.post(INTEGRATION_FLYIO_API_URL, {
query: SetSecrets,
variables: {
input: {
appId: integration.app,
secrets: Object.entries(secrets).map(([key, value]) => ({
key,
value,
})),
},
},
}, {
headers: {
Authorization: "Bearer " + accessToken,
},
data: {
query: SetSecrets,
variables: {
input: {
appId: integration.app,
secrets: Object.entries(secrets).map(([key, value]) => ({
key,
value,
})),
},
},
'Accept-Encoding': 'application/json',
},
});
@ -1120,23 +1186,18 @@ const syncSecretsFlyio = async ({
}
}`;
const getSecretsRes = (
await axios({
method: "post",
url: INTEGRATION_FLYIO_API_URL,
headers: {
'Authorization': 'Bearer ' + accessToken,
'Content-Type': 'application/json',
'Accept-Encoding': 'application/json'
},
data: {
query: GetSecrets,
variables: {
appName: integration.app,
},
},
})
).data.data.app.secrets;
const getSecretsRes = (await request.post(INTEGRATION_FLYIO_API_URL, {
query: GetSecrets,
variables: {
appName: integration.app,
},
}, {
headers: {
Authorization: "Bearer " + accessToken,
'Content-Type': 'application/json',
'Accept-Encoding': 'application/json',
},
})).data.data.app.secrets;
const deleteSecretsKeys = getSecretsRes
.filter((secret: FlyioSecret) => !(secret.name in secrets))
@ -1161,23 +1222,22 @@ const syncSecretsFlyio = async ({
}
}`;
await axios({
method: "post",
url: INTEGRATION_FLYIO_API_URL,
await request.post(INTEGRATION_FLYIO_API_URL, {
query: DeleteSecrets,
variables: {
input: {
appId: integration.app,
keys: deleteSecretsKeys,
},
},
}, {
headers: {
Authorization: "Bearer " + accessToken,
"Content-Type": "application/json",
},
data: {
query: DeleteSecrets,
variables: {
input: {
appId: integration.app,
keys: deleteSecretsKeys,
},
},
'Accept-Encoding': 'application/json',
},
});
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
@ -1203,7 +1263,7 @@ const syncSecretsCircleCI = async ({
}) => {
try {
const circleciOrganizationDetail = (
await axios.get(`${INTEGRATION_CIRCLECI_API_URL}/v2/me/collaborations`, {
await request.get(`${INTEGRATION_CIRCLECI_API_URL}/v2/me/collaborations`, {
headers: {
"Circle-Token": accessToken,
"Accept-Encoding": "application/json",
@ -1216,7 +1276,7 @@ const syncSecretsCircleCI = async ({
// sync secrets to CircleCI
Object.keys(secrets).forEach(
async (key) =>
await axios.post(
await request.post(
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar`,
{
name: key,
@ -1233,7 +1293,7 @@ const syncSecretsCircleCI = async ({
// get secrets from CircleCI
const getSecretsRes = (
await axios.get(
await request.get(
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar`,
{
headers: {
@ -1247,7 +1307,7 @@ const syncSecretsCircleCI = async ({
// delete secrets from CircleCI
getSecretsRes.forEach(async (sec: any) => {
if (!(sec.name in secrets)) {
await axios.delete(
await request.delete(
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar/${sec.name}`,
{
headers: {
@ -1265,4 +1325,105 @@ const syncSecretsCircleCI = async ({
}
};
/**
* Sync/push [secrets] to TravisCI project
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for TravisCI integration
*/
const syncSecretsTravisCI = async ({
integration,
secrets,
accessToken,
}: {
integration: IIntegration;
secrets: any;
accessToken: string;
}) => {
try {
// get secrets from travis-ci
const getSecretsRes = (
await request.get(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars?repository_id=${integration.appId}`,
{
headers: {
"Authorization": `token ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
)
)
.data
?.env_vars
.reduce((obj: any, secret: any) => ({
...obj,
[secret.name]: secret
}), {});
// add secrets
for await (const key of Object.keys(secrets)) {
if (!(key in getSecretsRes)) {
// case: secret does not exist in travis ci
// -> add secret
await request.post(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars?repository_id=${integration.appId}`,
{
env_var: {
name: key,
value: secrets[key]
}
},
{
headers: {
"Authorization": `token ${accessToken}`,
"Content-Type": "application/json",
"Accept-Encoding": "application/json",
},
}
);
} else {
// case: secret exists in travis ci
// -> update/set secret
await request.patch(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars/${getSecretsRes[key].id}?repository_id=${getSecretsRes[key].repository_id}`,
{
env_var: {
name: key,
value: secrets[key],
}
},
{
headers: {
"Authorization": `token ${accessToken}`,
"Content-Type": "application/json",
"Accept-Encoding": "application/json",
},
}
);
}
}
for await (const key of Object.keys(getSecretsRes)) {
if (!(key in secrets)){
// delete secret
await request.delete(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars/${getSecretsRes[key].id}?repository_id=${getSecretsRes[key].repository_id}`,
{
headers: {
"Authorization": `token ${accessToken}`,
"Content-Type": "application/json",
"Accept-Encoding": "application/json",
},
}
);
}
}
} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
throw new Error("Failed to sync secrets to TravisCI");
}
}
export { syncSecrets };
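Note: a minimal TypeScript sketch of how a provider-specific sync function such as syncSecretsTravisCI is typically dispatched from the exported syncSecrets entry point. The switch shape below is an assumption for illustration; only the provider constants and function names come from this diff.

// Hypothetical dispatcher sketch (not necessarily the exact implementation):
const syncSecretsSketch = async ({
  integration,
  secrets,
  accessToken,
}: {
  integration: IIntegration;
  secrets: any;
  accessToken: string;
}) => {
  switch (integration.integration) {
    case INTEGRATION_CIRCLECI:
      return await syncSecretsCircleCI({ integration, secrets, accessToken });
    case INTEGRATION_TRAVISCI:
      return await syncSecretsTravisCI({ integration, secrets, accessToken });
    default:
      throw new Error("Integration not supported for secret sync");
  }
};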

View File

@ -1,4 +1,5 @@
import requireAuth from './requireAuth';
import requireMfaAuth from './requireMfaAuth';
import requireBotAuth from './requireBotAuth';
import requireSignupAuth from './requireSignupAuth';
import requireWorkspaceAuth from './requireWorkspaceAuth';
@ -15,6 +16,7 @@ import validateRequest from './validateRequest';
export {
requireAuth,
requireMfaAuth,
requireBotAuth,
requireSignupAuth,
requireWorkspaceAuth,

View File

@ -1,11 +1,12 @@
import { ErrorRequestHandler } from "express";
import * as Sentry from '@sentry/node';
import { InternalServerError } from "../utils/errors";
import { InternalServerError, UnauthorizedRequestError, UnprocessableEntityError } from "../utils/errors";
import { getLogger } from "../utils/logger";
import RequestError, { LogLevel } from "../utils/requestError";
import { NODE_ENV } from "../config";
import mongoose from "mongoose";
export const requestErrorHandler: ErrorRequestHandler = (error: RequestError | Error, req, res, next) => {
if (res.headersSent) return next();
@ -33,4 +34,17 @@ export const requestErrorHandler: ErrorRequestHandler = (error: RequestError | E
res.status((<RequestError>error).statusCode).json((<RequestError>error).format(req))
next()
}
export const handleMongoInvalidDataError = (err: any, req: any, res: any, next: any) => {
if (err instanceof mongoose.Error.ValidationError) {
const errors: any = {};
for (const field in err.errors) {
errors[field] = err.errors[field].message;
}
throw UnprocessableEntityError({ message: JSON.stringify(errors) })
} else {
next(err);
}
}
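Note: a usage sketch of wiring the new Mongoose validation handler ahead of the generic request error handler, so schema validation failures surface as 422 responses. The app setup and import path are assumptions for illustration.

import express from 'express';
// assumed import path; adjust to the actual middleware module
import { requestErrorHandler, handleMongoInvalidDataError } from './middleware/requestErrorHandler';

const app = express();
// ...routes registered here...
app.use(handleMongoInvalidDataError); // maps mongoose.Error.ValidationError to an UnprocessableEntityError
app.use(requestErrorHandler); // formats and logs the remaining RequestErrors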

View File

@ -0,0 +1,43 @@
import jwt from 'jsonwebtoken';
import { Request, Response, NextFunction } from 'express';
import { User } from '../models';
import { JWT_MFA_SECRET } from '../config';
import { BadRequestError, UnauthorizedRequestError } from '../utils/errors';
declare module 'jsonwebtoken' {
export interface UserIDJwtPayload extends jwt.JwtPayload {
userId: string;
}
}
/**
* Validate that the temporary (MFA) JWT token on the request is valid (e.g. not expired)
* and that there is an associated user.
*/
const requireMfaAuth = async (
req: Request,
res: Response,
next: NextFunction
) => {
// JWT (temporary) authentication middleware for complete signup
const [ AUTH_TOKEN_TYPE, AUTH_TOKEN_VALUE ] = <[string, string]>req.headers['authorization']?.split(' ', 2) ?? [null, null]
if(AUTH_TOKEN_TYPE === null) return next(BadRequestError({message: `Missing Authorization Header in the request header.`}))
if(AUTH_TOKEN_TYPE.toLowerCase() !== 'bearer') return next(BadRequestError({message: `The provided authentication type '${AUTH_TOKEN_TYPE}' is not supported.`}))
if(AUTH_TOKEN_VALUE === null) return next(BadRequestError({message: 'Missing Authorization Body in the request header'}))
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(AUTH_TOKEN_VALUE, JWT_MFA_SECRET)
);
const user = await User.findOne({
_id: decodedToken.userId
}).select('+publicKey');
if (!user)
return next(UnauthorizedRequestError({message: 'Unable to authenticate for MFA verification. Try logging in again'}))
req.user = user;
return next();
};
export default requireMfaAuth;

View File

@ -10,7 +10,7 @@ import MembershipOrg, { IMembershipOrg } from './membershipOrg';
import Organization, { IOrganization } from './organization';
import Secret, { ISecret } from './secret';
import ServiceToken, { IServiceToken } from './serviceToken';
import Token, { IToken } from './token';
import TokenData, { ITokenData } from './tokenData';
import User, { IUser } from './user';
import UserAction, { IUserAction } from './userAction';
import Workspace, { IWorkspace } from './workspace';
@ -43,8 +43,8 @@ export {
ISecret,
ServiceToken,
IServiceToken,
Token,
IToken,
TokenData,
ITokenData,
User,
IUser,
UserAction,

View File

@ -10,6 +10,7 @@ import {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
} from "../variables";
export interface IIntegration {
@ -33,7 +34,8 @@ export interface IIntegration {
| 'github'
| 'render'
| 'flyio'
| 'circleci';
| 'circleci'
| 'travisci';
integrationAuth: Types.ObjectId;
}
@ -97,6 +99,7 @@ const integrationSchema = new Schema<IIntegration>(
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
],
required: true,
},

View File

@ -10,12 +10,13 @@ import {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
} from "../variables";
export interface IIntegrationAuth {
_id: Types.ObjectId;
workspace: Types.ObjectId;
integration: 'heroku' | 'vercel' | 'netlify' | 'github' | 'render' | 'flyio' | 'azure-key-vault' | 'circleci' | 'aws-parameter-store' | 'aws-secret-manager';
integration: 'heroku' | 'vercel' | 'netlify' | 'github' | 'render' | 'flyio' | 'azure-key-vault' | 'circleci' | 'travisci' | 'aws-parameter-store' | 'aws-secret-manager';
teamId: string;
accountId: string;
refreshCiphertext?: string;
@ -50,6 +51,7 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
],
required: true,
},

View File

@ -26,7 +26,7 @@ export interface ISecret {
tags?: string[];
}
const secretSchema = new Schema<ISecret>(
export const secretSchema = new Schema<ISecret>(
{
version: {
type: Number,

View File

@ -1,18 +1,28 @@
import mongoose, { Schema, model } from 'mongoose';
import Secret, { ISecret } from './secret';
import Secret, { ISecret, secretSchema } from './secret';
export interface IRequestedChange {
_id: string
userId: mongoose.Types.ObjectId;
status: ApprovalStatus;
modifiedSecretDetails: ISecret,
modifiedSecretParentId: mongoose.Types.ObjectId,
type: string,
approvers: IApprover[]
merged: boolean
}
interface ISecretApprovalRequest {
secret: mongoose.Types.ObjectId;
requestedChanges: ISecret;
requestedBy: mongoose.Types.ObjectId;
approvers: IApprover[];
status: ApprovalStatus;
environment: string;
workspace: mongoose.Types.ObjectId;
requestedChanges: IRequestedChange[];
requestedByUserId: mongoose.Types.ObjectId;
timestamp: Date;
requestType: RequestType;
requestType: ChangeType;
requestId: string;
}
interface IApprover {
export interface IApprover {
userId: mongoose.Types.ObjectId;
status: ApprovalStatus;
}
@ -23,54 +33,80 @@ export enum ApprovalStatus {
REJECTED = 'rejected'
}
export enum RequestType {
export enum ChangeType {
UPDATE = 'update',
DELETE = 'delete',
CREATE = 'create'
}
const approverSchema = new mongoose.Schema({
user: {
userId: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User',
required: true
required: false,
},
status: {
type: String,
enum: [ApprovalStatus],
default: ApprovalStatus.PENDING
}
}, { timestamps: true });
// extend the Secret schema, taking all of its fields but removing the _id and version fields
const SecretModificationSchema = new Schema({
...secretSchema.obj,
}, {
_id: false,
});
const secretApprovalRequestSchema = new Schema<ISecretApprovalRequest>(
SecretModificationSchema.remove("version")
const requestedChangeSchema = new mongoose.Schema(
{
secret: {
_id: { type: mongoose.Schema.Types.ObjectId, auto: true },
modifiedSecretDetails: SecretModificationSchema,
modifiedSecretParentId: { // used to fetch the current version of this secret for comparison
type: mongoose.Schema.Types.ObjectId,
ref: 'Secret'
},
requestedChanges: Secret,
requestedBy: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User'
type: {
type: String,
enum: ChangeType,
required: true
},
approvers: [approverSchema],
status: {
type: String,
enum: ApprovalStatus,
default: ApprovalStatus.PENDING
default: ApprovalStatus.PENDING // the overall status of the requested change
},
timestamp: {
type: Date,
default: Date.now
approvers: [approverSchema],
merged: {
type: Boolean,
default: false,
}
},
{ timestamps: true }
);
const secretApprovalRequestSchema = new Schema<ISecretApprovalRequest>(
{
environment: {
type: String, // the environment the secret changes were requested for
ref: 'Secret'
},
requestType: {
type: String,
enum: RequestType,
required: true
workspace: {
type: mongoose.Schema.Types.ObjectId, // workspace id of the secret
ref: 'Workspace'
},
requestedChanges: [requestedChangeSchema], // the changes the requesting user wants to make to the existing secret
requestedByUserId: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User'
},
requestId: {
type: String,
required: false
}
},
{
@ -78,6 +114,8 @@ const secretApprovalRequestSchema = new Schema<ISecretApprovalRequest>(
}
);
const SecretApprovalRequest = model<ISecretApprovalRequest>('SecretApprovalRequest', secretApprovalRequestSchema);
secretApprovalRequestSchema.index({ 'requestedChanges.approvers.userId': 1 });
const SecretApprovalRequest = model<ISecretApprovalRequest>('secret_approval_request', secretApprovalRequestSchema);
export default SecretApprovalRequest;
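Note: the index on 'requestedChanges.approvers.userId' added above supports looking up requests awaiting a given user's approval. A hedged query sketch (the handler shape and the merged filter are assumptions):

// Hypothetical lookup of open approval requests where this user is an approver;
// relies on the 'requestedChanges.approvers.userId' index declared above.
const getApprovalsNeededSketch = async (userId: mongoose.Types.ObjectId) => {
  return await SecretApprovalRequest.find({
    'requestedChanges.approvers.userId': userId,
    'requestedChanges.merged': false,
  });
};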

View File

@ -5,7 +5,7 @@ export interface IToken {
email: string;
token: string;
createdAt: Date;
ttl: Number;
ttl: number;
}
const tokenSchema = new Schema<IToken>({

View File

@ -0,0 +1,55 @@
import { Schema, Types, model } from 'mongoose';
export interface ITokenData {
type: string;
email?: string;
phoneNumber?: string;
organization?: Types.ObjectId;
tokenHash: string;
triesLeft?: number;
expiresAt: Date;
createdAt: Date;
updatedAt: Date;
}
const tokenDataSchema = new Schema<ITokenData>({
type: {
type: String,
enum: [
'emailConfirmation',
'emailMfa',
'organizationInvitation',
'passwordReset'
],
required: true
},
email: {
type: String
},
phoneNumber: {
type: String
},
organization: { // organizationInvitation-specific field
type: Schema.Types.ObjectId,
ref: 'Organization'
},
tokenHash: {
type: String,
select: false,
required: true
},
triesLeft: {
type: Number
},
expiresAt: {
type: Date,
expires: 0,
required: true
}
}, {
timestamps: true
});
const TokenData = model<ITokenData>('TokenData', tokenDataSchema);
export default TokenData;
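Note: because expiresAt is declared with expires: 0, MongoDB's TTL monitor deletes each token document shortly after its expiresAt passes. A hedged creation sketch (the hashing scheme and the two-minute window are illustrative assumptions; the window mirrors the MFA email copy):

import crypto from 'crypto';

// Illustrative only: store a hash of the emailed code, never the raw code.
const tokenHash = crypto.createHash('sha256').update('123456').digest('hex'); // assumed hashing scheme
await TokenData.create({
  type: 'emailMfa',
  email: 'user@example.com',
  tokenHash,
  triesLeft: 5, // assumed retry budget
  expiresAt: new Date(Date.now() + 2 * 60 * 1000), // TTL index removes the doc after expiry
});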

View File

@ -1,10 +1,14 @@
import { Schema, model, Types } from 'mongoose';
import { Schema, model, Types, Document } from 'mongoose';
export interface IUser {
export interface IUser extends Document {
_id: Types.ObjectId;
email: string;
firstName?: string;
lastName?: string;
encryptionVersion: number;
protectedKey: string;
protectedKeyIV: string;
protectedKeyTag: string;
publicKey?: string;
encryptedPrivateKey?: string;
iv?: string;
@ -12,7 +16,12 @@ export interface IUser {
salt?: string;
verifier?: string;
refreshVersion?: number;
seenIps: [string];
isMfaEnabled: boolean;
mfaMethods: boolean;
devices: {
ip: string;
userAgent: string;
}[];
}
const userSchema = new Schema<IUser>(
@ -27,6 +36,23 @@ const userSchema = new Schema<IUser>(
lastName: {
type: String
},
encryptionVersion: {
type: Number,
select: false,
default: 1 // to resolve backward-compatibility issues
},
protectedKey: { // introduced as part of encryption version 2
type: String,
select: false
},
protectedKeyIV: { // introduced as part of encryption version 2
type: String,
select: false
},
protectedKeyTag: { // introduced as part of encryption version 2
type: String,
select: false
},
publicKey: {
type: String,
select: false
@ -35,11 +61,11 @@ const userSchema = new Schema<IUser>(
type: String,
select: false
},
iv: {
iv: { // iv of [encryptedPrivateKey]
type: String,
select: false
},
tag: {
tag: { // tag of [encryptedPrivateKey]
type: String,
select: false
},
@ -56,8 +82,21 @@ const userSchema = new Schema<IUser>(
default: 0,
select: false
},
seenIps: [String]
},
isMfaEnabled: {
type: Boolean,
default: false
},
mfaMethods: [{
type: String
}],
devices: {
type: [{
ip: String,
userAgent: String
}],
default: []
}
},
{
timestamps: true
}
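Note: encryptionVersion, protectedKey, protectedKeyIV, and protectedKeyTag are declared with select: false, so callers that need them must opt in explicitly, for example:

// Explicitly select fields hidden by default (illustrative query).
const user = await User.findOne({ email: 'user@example.com' }).select(
  '+encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +encryptedPrivateKey +iv +tag'
);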

View File

@ -1,9 +1,16 @@
import { Schema, model, Types } from 'mongoose';
import mongoose, { Schema, model, Types } from 'mongoose';
export interface DesignatedApprovers {
environment: string,
approvers: [mongoose.Schema.Types.ObjectId]
}
export interface IWorkspace {
_id: Types.ObjectId;
name: string;
organization: Types.ObjectId;
approvers: [DesignatedApprovers];
environments: Array<{
name: string;
slug: string;
@ -11,6 +18,16 @@ export interface IWorkspace {
autoCapitalization: boolean;
}
const approverSchema = new mongoose.Schema({
userId: {
type: Schema.Types.ObjectId,
ref: 'User',
},
environment: {
type: String
}
}, { _id: false });
const workspaceSchema = new Schema<IWorkspace>({
name: {
type: String,
@ -20,6 +37,7 @@ const workspaceSchema = new Schema<IWorkspace>({
type: Boolean,
default: true,
},
approvers: [approverSchema],
organization: {
type: Schema.Types.ObjectId,
ref: 'Organization',

View File

@ -7,7 +7,7 @@ import { authLimiter } from '../../helpers/rateLimiter';
router.post('/token', validateRequest, authController.getNewToken);
router.post(
router.post( // deprecated (moved to api/v2/auth/login1)
'/login1',
authLimiter,
body('email').exists().trim().notEmpty(),
@ -16,7 +16,7 @@ router.post(
authController.login1
);
router.post(
router.post( // deprecated (moved to api/v2/auth/login2)
'/login2',
authLimiter,
body('email').exists().trim().notEmpty(),

View File

@ -31,7 +31,7 @@ router.patch(
requireBotAuth({
acceptedRoles: [ADMIN, MEMBER]
}),
body('isActive').isBoolean(),
body('isActive').exists().isBoolean(),
body('botKey'),
validateRequest,
botController.setBotActiveState

View File

@ -15,6 +15,7 @@ import password from './password';
import stripe from './stripe';
import integration from './integration';
import integrationAuth from './integrationAuth';
import secretApprovalRequest from './secretApprovalsRequest'
export {
signup,
@ -33,5 +34,6 @@ export {
password,
stripe,
integration,
integrationAuth
integrationAuth,
secretApprovalRequest
};

View File

@ -10,7 +10,7 @@ router.post(
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('clientPublicKey').exists().trim().notEmpty(),
body('clientPublicKey').exists().isString().trim().notEmpty(),
validateRequest,
passwordController.srp1
);
@ -22,11 +22,14 @@ router.post(
acceptedAuthModes: ['jwt']
}),
body('clientProof').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().trim().notEmpty().notEmpty(), // private key encrypted under new pwd
body('iv').exists().trim().notEmpty(), // new iv for private key
body('tag').exists().trim().notEmpty(), // new tag for private key
body('salt').exists().trim().notEmpty(), // part of new pwd
body('verifier').exists().trim().notEmpty(), // part of new pwd
body('protectedKey').exists().isString().trim().notEmpty(),
body('protectedKeyIV').exists().isString().trim().notEmpty(),
body('protectedKeyTag').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // private key encrypted under new pwd
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(), // new iv for private key
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(), // new tag for private key
body('salt').exists().isString().trim().notEmpty(), // part of new pwd
body('verifier').exists().isString().trim().notEmpty(), // part of new pwd
validateRequest,
passwordController.changePassword
);
@ -34,7 +37,7 @@ router.post(
router.post(
'/email/password-reset',
passwordLimiter,
body('email').exists().trim().notEmpty(),
body('email').exists().isString().trim().notEmpty().isEmail(),
validateRequest,
passwordController.emailPasswordReset
);
@ -42,8 +45,8 @@ router.post(
router.post(
'/email/password-reset-verify',
passwordLimiter,
body('email').exists().trim().notEmpty().isEmail(),
body('code').exists().trim().notEmpty(),
body('email').exists().isString().trim().notEmpty().isEmail(),
body('code').exists().isString().trim().notEmpty(),
validateRequest,
passwordController.emailPasswordResetVerify
);
@ -61,12 +64,12 @@ router.post(
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('clientProof').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().trim().notEmpty(), // (backup) private key encrypted under a strong key
body('iv').exists().trim().notEmpty(), // new iv for (backup) private key
body('tag').exists().trim().notEmpty(), // new tag for (backup) private key
body('salt').exists().trim().notEmpty(), // salt generated from strong key
body('verifier').exists().trim().notEmpty(), // salt generated from strong key
body('clientProof').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // (backup) private key encrypted under a strong key
body('iv').exists().isString().trim().notEmpty(), // new iv for (backup) private key
body('tag').exists().isString().trim().notEmpty(), // new tag for (backup) private key
body('salt').exists().isString().trim().notEmpty(), // salt generated from strong key
body('verifier').exists().isString().trim().notEmpty(), // salt generated from strong key
validateRequest,
passwordController.createBackupPrivateKey
);
@ -74,11 +77,14 @@ router.post(
router.post(
'/password-reset',
requireSignupAuth,
body('encryptedPrivateKey').exists().trim().notEmpty(), // private key encrypted under new pwd
body('iv').exists().trim().notEmpty(), // new iv for private key
body('tag').exists().trim().notEmpty(), // new tag for private key
body('salt').exists().trim().notEmpty(), // part of new pwd
body('verifier').exists().trim().notEmpty(), // part of new pwd
body('protectedKey').exists().isString().trim().notEmpty(),
body('protectedKeyIV').exists().isString().trim().notEmpty(),
body('protectedKeyTag').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // private key encrypted under new pwd
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(), // new iv for private key
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(), // new tag for private key
body('salt').exists().isString().trim().notEmpty(), // part of new pwd
body('verifier').exists().isString().trim().notEmpty(), // part of new pwd
validateRequest,
passwordController.resetPassword
);

View File

@ -0,0 +1,65 @@
import express from 'express';
const router = express.Router();
import { requireAuth, validateRequest } from '../../middleware';
import { secretApprovalController } from '../../controllers/v1';
import { body, param } from 'express-validator';
router.post(
'/',
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('workspaceId').exists(),
body('environment').exists(),
body('requestedChanges').isArray(),
validateRequest,
secretApprovalController.createApprovalRequest
);
router.get(
'/sent',
requireAuth({
acceptedAuthModes: ['jwt']
}),
secretApprovalController.getAllApprovalRequestsForUser
);
router.get(
'/approvals-needed',
requireAuth({
acceptedAuthModes: ['jwt']
}),
secretApprovalController.getAllApprovalRequestsThatRequireUserApproval
);
router.post(
'/:reviewId/approve',
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('requestedChangeIds').isArray(),
validateRequest,
secretApprovalController.approveApprovalRequest
);
router.post(
'/:reviewId/reject',
requireAuth({
acceptedAuthModes: ['jwt']
}),
body('requestedChangeIds').isArray(),
validateRequest,
secretApprovalController.rejectApprovalRequest
);
router.post(
'/:reviewId/merge',
body('requestedChangeIds').isArray(),
validateRequest,
requireAuth({
acceptedAuthModes: ['jwt']
}),
secretApprovalController.mergeApprovalRequestSecrets
);
export default router;
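Note: a hedged client-side sketch of exercising these approval routes; the mount path, token handling, and ids below are placeholders, not confirmed by this diff.

const baseUrl = 'http://localhost:8080/api/v1/secret-approval-request'; // assumed mount path
const jwt = '<jwt>'; // placeholder auth token
const reviewId = '<review id>';
const changeId = '<requested change id>';

// Create an approval request for changes in a workspace environment.
await fetch(baseUrl, {
  method: 'POST',
  headers: { Authorization: `Bearer ${jwt}`, 'Content-Type': 'application/json' },
  body: JSON.stringify({ workspaceId: '<workspace id>', environment: 'dev', requestedChanges: [] }),
});

// Approve specific requested changes on an existing review.
await fetch(`${baseUrl}/${reviewId}/approve`, {
  method: 'POST',
  headers: { Authorization: `Bearer ${jwt}`, 'Content-Type': 'application/json' },
  body: JSON.stringify({ requestedChangeIds: [changeId] }),
});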

View File

@ -1,7 +1,7 @@
import express from 'express';
const router = express.Router();
import { body } from 'express-validator';
import { requireSignupAuth, validateRequest } from '../../middleware';
import { validateRequest } from '../../middleware';
import { signupController } from '../../controllers/v1';
import { authLimiter } from '../../helpers/rateLimiter';
@ -22,39 +22,4 @@ router.post(
signupController.verifyEmailSignup
);
router.post(
'/complete-account/signup',
authLimiter,
requireSignupAuth,
body('email').exists().trim().notEmpty().isEmail(),
body('firstName').exists().trim().notEmpty(),
body('lastName').exists().trim().notEmpty(),
body('publicKey').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().trim().notEmpty(),
body('iv').exists().trim().notEmpty(),
body('tag').exists().trim().notEmpty(),
body('salt').exists().trim().notEmpty(),
body('verifier').exists().trim().notEmpty(),
body('organizationName').exists().trim().notEmpty(),
validateRequest,
signupController.completeAccountSignup
);
router.post(
'/complete-account/invite',
authLimiter,
requireSignupAuth,
body('email').exists().trim().notEmpty().isEmail(),
body('firstName').exists().trim().notEmpty(),
body('lastName').exists().trim().notEmpty(),
body('publicKey').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().trim().notEmpty(),
body('iv').exists().trim().notEmpty(),
body('tag').exists().trim().notEmpty(),
body('salt').exists().trim().notEmpty(),
body('verifier').exists().trim().notEmpty(),
validateRequest,
signupController.completeAccountInvite
);
export default router;

View File

@ -36,10 +36,10 @@ router.get(
);
router.get(
'/',
'/',
requireAuth({
acceptedAuthModes: ['jwt']
}),
}),
workspaceController.getWorkspaces
);
@ -134,6 +134,34 @@ router.get(
workspaceController.getWorkspaceIntegrationAuthorizations
);
router.post(
'/:workspaceId/approvers',
requireAuth({
acceptedAuthModes: ['jwt']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN]
}),
param('workspaceId').exists().trim(),
body("approvers").isArray(),
validateRequest,
workspaceController.addApproverForWorkspaceAndEnvironment
);
router.delete(
'/:workspaceId/approvers',
requireAuth({
acceptedAuthModes: ['jwt']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN]
}),
param('workspaceId').exists().trim(),
body("approvers").isArray(),
validateRequest,
workspaceController.removeApproverForWorkspaceAndEnvironment
);
router.get(
'/:workspaceId/service-tokens', // deprecate
requireAuth({

View File

@ -0,0 +1,44 @@
import express from 'express';
const router = express.Router();
import { body } from 'express-validator';
import { requireMfaAuth, validateRequest } from '../../middleware';
import { authController } from '../../controllers/v2';
import { authLimiter } from '../../helpers/rateLimiter';
router.post(
'/login1',
authLimiter,
body('email').isString().trim().notEmpty(),
body('clientPublicKey').isString().trim().notEmpty(),
validateRequest,
authController.login1
);
router.post(
'/login2',
authLimiter,
body('email').isString().trim().notEmpty(),
body('clientProof').isString().trim().notEmpty(),
validateRequest,
authController.login2
);
router.post(
'/mfa/send',
authLimiter,
body('email').isString().trim().notEmpty(),
validateRequest,
authController.sendMfaToken
);
router.post(
'/mfa/verify',
authLimiter,
requireMfaAuth,
body('email').isString().trim().notEmpty(),
body('mfaToken').isString().trim().notEmpty(),
validateRequest,
authController.verifyMfaToken
);
export default router;
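Note: the intended flow across these v2 auth routes appears to be login1/login2 (SRP), then, when login2 reports MFA as enabled, mfa/send followed by mfa/verify using the temporary token issued by login2. A hedged client sketch (paths and variable values are placeholders):

const email = 'user@example.com';
const temporaryMfaToken = '<token from login2>'; // short-lived JWT checked by requireMfaAuth
const code = '<code from email>';

// Request an MFA code be emailed to the user.
await fetch('/api/v2/auth/mfa/send', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ email }),
});

// Exchange the emailed code (plus the temporary token) for full login credentials.
await fetch('/api/v2/auth/mfa/verify', {
  method: 'POST',
  headers: { Authorization: `Bearer ${temporaryMfaToken}`, 'Content-Type': 'application/json' },
  body: JSON.stringify({ email, mfaToken: code }),
});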

View File

@ -1,3 +1,5 @@
import auth from './auth';
import signup from './signup';
import users from './users';
import organizations from './organizations';
import workspace from './workspace';
@ -9,6 +11,8 @@ import environment from "./environment"
import tags from "./tags"
export {
auth,
signup,
users,
organizations,
workspace,

View File

@ -6,14 +6,52 @@ import {
requireSecretsAuth,
validateRequest
} from '../../middleware';
import { query, check, body } from 'express-validator';
import { query, body } from 'express-validator';
import { secretsController } from '../../controllers/v2';
import { validateSecrets } from '../../helpers/secret';
import {
ADMIN,
MEMBER,
SECRET_PERSONAL,
SECRET_SHARED
} from '../../variables';
import {
BatchSecretRequest
} from '../../types/secret';
router.post(
'/batch',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
location: 'body'
}),
body('workspaceId').exists().isString().trim(),
body('environment').exists().isString().trim(),
body('requests')
.exists()
.custom(async (requests: BatchSecretRequest[], { req }) => {
if (Array.isArray(requests)) {
const secretIds = requests
.map((request) => request.secret._id)
.filter((secretId) => secretId !== undefined)
if (secretIds.length > 0) {
const relevantSecrets = await validateSecrets({
userId: req.user._id.toString(),
secretIds
});
req.secrets = relevantSecrets;
}
}
return true;
}),
validateRequest,
secretsController.batchSecrets
);
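Note: an illustrative shape for the /batch payload validated by the route above; field values are placeholders, and the secret fields follow the BatchSecret and BatchSecretRequest types added in this diff.

const exampleBatchBody = {
  workspaceId: '<workspace id>',
  environment: 'dev',
  requests: [
    // create a new shared secret (ciphertext fields elided for brevity)
    { id: '1', method: 'POST', secret: { type: 'shared' /* secretKeyCiphertext, secretValueCiphertext, ... */ } },
    // delete an existing secret by id; existing ids are checked by validateSecrets above
    { id: '2', method: 'DELETE', secret: { _id: '<secret id>' } },
  ],
};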
router.post(
'/',

View File

@ -0,0 +1,49 @@
import express from 'express';
const router = express.Router();
import { body } from 'express-validator';
import { requireSignupAuth, validateRequest } from '../../middleware';
import { signupController } from '../../controllers/v2';
import { authLimiter } from '../../helpers/rateLimiter';
router.post(
'/complete-account/signup',
authLimiter,
requireSignupAuth,
body('email').exists().isString().trim().notEmpty().isEmail(),
body('firstName').exists().isString().trim().notEmpty(),
body('lastName').exists().isString().trim().notEmpty(),
body('protectedKey').exists().isString().trim().notEmpty(),
body('protectedKeyIV').exists().isString().trim().notEmpty(),
body('protectedKeyTag').exists().isString().trim().notEmpty(),
body('publicKey').exists().isString().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(),
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(),
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(),
body('salt').exists().isString().trim().notEmpty(),
body('verifier').exists().isString().trim().notEmpty(),
body('organizationName').exists().isString().trim().notEmpty(),
validateRequest,
signupController.completeAccountSignup
);
router.post(
'/complete-account/invite',
authLimiter,
requireSignupAuth,
body('email').exists().isString().trim().notEmpty().isEmail(),
body('firstName').exists().isString().trim().notEmpty(),
body('lastName').exists().isString().trim().notEmpty(),
body('protectedKey').exists().isString().trim().notEmpty(),
body('protectedKeyIV').exists().isString().trim().notEmpty(),
body('protectedKeyTag').exists().isString().trim().notEmpty(),
body('publicKey').exists().trim().notEmpty(),
body('encryptedPrivateKey').exists().isString().trim().notEmpty(),
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(),
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(),
body('salt').exists().isString().trim().notEmpty(),
body('verifier').exists().isString().trim().notEmpty(),
validateRequest,
signupController.completeAccountInvite
);
export default router;

View File

@ -1,8 +1,10 @@
import express from 'express';
const router = express.Router();
import {
requireAuth
requireAuth,
validateRequest
} from '../../middleware';
import { body } from 'express-validator';
import { usersController } from '../../controllers/v2';
router.get(
@ -13,6 +15,16 @@ router.get(
usersController.getMe
);
router.patch(
'/me/mfa',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
body('isMfaEnabled').exists().isBoolean(),
validateRequest,
usersController.updateMyMfaEnabled
);
router.get(
'/me/organizations',
requireAuth({

View File

@ -118,7 +118,6 @@ router.delete( // TODO - rewire dashboard to this route
workspaceController.deleteWorkspaceMembership
);
router.patch(
'/:workspaceId/auto-capitalization',
requireAuth({

View File

@ -1,5 +1,3 @@
import { Bot, IBot } from '../models';
import * as Sentry from '@sentry/node';
import { handleEventHelper } from '../helpers/event';
interface Event {

View File

@ -0,0 +1,69 @@
import { Types } from 'mongoose';
import { createTokenHelper, validateTokenHelper } from '../helpers/token';
/**
* Class to handle token actions
* TODO: elaborate more on this class
*/
class TokenService {
/**
* Create a token [token] for type [type] with associated details
* @param {Object} obj
* @param {String} obj.type - type or context of token (e.g. emailConfirmation)
* @param {String} obj.email - email associated with the token
* @param {String} obj.phoneNumber - phone number associated with the token
* @param {Types.ObjectId} obj.organizationId - id of organization associated with the token
* @returns {String} token - the created token
*/
static async createToken({
type,
email,
phoneNumber,
organizationId
}: {
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
email?: string;
phoneNumber?: string;
organizationId?: Types.ObjectId;
}) {
return await createTokenHelper({
type,
email,
phoneNumber,
organizationId
});
}
/**
* Validate whether or not token [token] and its associated details match a token in the DB
* @param {Object} obj
* @param {String} obj.type - type or context of token (e.g. emailConfirmation)
* @param {String} obj.email - email associated with the token
* @param {String} obj.phoneNumber - phone number associated with the token
* @param {Types.ObjectId} obj.organizationId - id of organization associated with the token
* @param {String} obj.token - the token to validate
*/
static async validateToken({
type,
email,
phoneNumber,
organizationId,
token
}: {
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
email?: string;
phoneNumber?: string;
organizationId?: Types.ObjectId;
token: string;
}) {
return await validateTokenHelper({
type,
email,
phoneNumber,
organizationId,
token
});
}
}
export default TokenService;
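Note: a short usage sketch of the service (values are illustrative):

// Create and later validate an email MFA token via the static helpers above.
const code = await TokenService.createToken({
  type: 'emailMfa',
  email: 'user@example.com',
});

await TokenService.validateToken({
  type: 'emailMfa',
  email: 'user@example.com',
  token: code,
});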

View File

@ -3,11 +3,13 @@ import postHogClient from './PostHogClient';
import BotService from './BotService';
import EventService from './EventService';
import IntegrationService from './IntegrationService';
import TokenService from './TokenService';
export {
DatabaseService,
postHogClient,
BotService,
EventService,
IntegrationService
IntegrationService,
TokenService
}

View File

@ -1,6 +1,10 @@
import nodemailer from 'nodemailer';
import { SMTP_HOST, SMTP_PORT, SMTP_USERNAME, SMTP_PASSWORD, SMTP_SECURE } from '../config';
import { SMTP_HOST_SENDGRID, SMTP_HOST_MAILGUN } from '../variables';
import {
SMTP_HOST_SENDGRID,
SMTP_HOST_MAILGUN,
SMTP_HOST_SOCKETLABS
} from '../variables';
import SMTPConnection from 'nodemailer/lib/smtp-connection';
import * as Sentry from '@sentry/node';
@ -27,6 +31,12 @@ if (SMTP_SECURE) {
ciphers: 'TLSv1.2'
}
break;
case SMTP_HOST_SOCKETLABS:
mailOpts.requireTLS = true;
mailOpts.tls = {
ciphers: 'TLSv1.2'
}
break;
default:
if (SMTP_HOST.includes('amazonaws.com')) {
mailOpts.tls = {

View File

@ -0,0 +1,19 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="x-ua-compatible" content="ie=edge">
<title>MFA Code</title>
</head>
<body>
<h2>Infisical</h2>
<h2>Sign in attempt requires further verification</h2>
<p>Your MFA code is below — enter it where you started signing in to Infisical.</p>
<h2>{{code}}</h2>
<p>The MFA code will be valid for 2 minutes.</p>
<p>Not you? Contact Infisical or your administrator immediately.</p>
</body>
</html>

View File

@ -1,14 +1,17 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="x-ua-compatible" content="ie=edge">
<title>Email Verification</title>
<title>Code</title>
</head>
<body>
<h2>Confirm your email address</h2>
<p>Your confirmation code is below — enter it in the browser window where you've started signing up for Infisical.</p>
<h1>{{code}}</h1>
<p>Questions about setting up Infisical? Email us at support@infisical.com</p>
</body>
</html>

View File

@ -0,0 +1,19 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="x-ua-compatible" content="ie=edge">
<title>Successful login for {{email}} from new device</title>
</head>
<body>
<h2>Infisical</h2>
<p>We're verifying a recent login for {{email}}:</p>
<p><strong>Timestamp</strong>: {{timestamp}}</p>
<p><strong>IP address</strong>: {{ip}}</p>
<p><strong>User agent</strong>: {{userAgent}}</p>
<p>If you believe that this login is suspicious, please contact Infisical or reset your password immediately.</p>
</body>
</html>

View File

@ -1,5 +1,7 @@
import { Types } from 'mongoose';
import { Assign, Omit } from 'utility-types';
import { ISecret } from '../../models';
import { mongo } from 'mongoose';
// Everything is required, except the omitted types
export type CreateSecretRequestBody = Omit<ISecret, "user" | "version" | "environment" | "workspace">;
@ -12,3 +14,39 @@ export type SanitizedSecretModify = Partial<Omit<ISecret, "user" | "version" | "
// Everything is required, except the omitted types
export type SanitizedSecretForCreate = Omit<ISecret, "version" | "_id">;
export interface BatchSecretRequest {
id: string;
method: 'POST' | 'PATCH' | 'DELETE';
secret: Secret;
}
export interface BatchSecret {
_id: string;
type: 'shared' | 'personal',
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretCommentCiphertext: string;
secretCommentIV: string;
secretCommentTag: string;
tags: string[];
}
export interface BatchSecret {
_id: string;
type: 'shared' | 'personal',
secretKeyCiphertext: string;
secretKeyIV: string;
secretKeyTag: string;
secretValueCiphertext: string;
secretValueIV: string;
secretValueTag: string;
secretCommentCiphertext: string;
secretCommentIV: string;
secretCommentTag: string;
tags: string[];
}

View File

@ -19,6 +19,15 @@ export const MethodNotAllowedError = (error?: Partial<RequestErrorContext>) => n
stack: error?.stack
});
export const UnprocessableEntityError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
statusCode: error?.statusCode ?? 422,
type: error?.type ?? 'unprocessable_entity',
message: error?.message ?? 'The server understands the content of the request, but it was unable to process it because it contains invalid data',
context: error?.context,
stack: error?.stack
});
export const UnauthorizedRequestError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
statusCode: error?.statusCode ?? 401,
@ -27,7 +36,7 @@ export const UnauthorizedRequestError = (error?: Partial<RequestErrorContext>) =
context: error?.context,
stack: error?.stack
});
export const ForbiddenRequestError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
statusCode: error?.statusCode ?? 403,
@ -46,6 +55,15 @@ export const BadRequestError = (error?: Partial<RequestErrorContext>) => new Req
stack: error?.stack
});
export const ResourceNotFound = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
statusCode: error?.statusCode ?? 404,
type: error?.type ?? 'resource_not_found',
message: error?.message ?? 'The requested resource was not found',
context: error?.context,
stack: error?.stack
});
export const InternalServerError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.ERROR,
statusCode: error?.statusCode ?? 500,

View File

@ -16,6 +16,7 @@ import {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
INTEGRATION_SET,
INTEGRATION_OAUTH2,
INTEGRATION_AZURE_TOKEN_URL,
@ -29,6 +30,7 @@ import {
INTEGRATION_RENDER_API_URL,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_TRAVISCI_API_URL,
INTEGRATION_OPTIONS,
} from "./integration";
import { OWNER, ADMIN, MEMBER, INVITED, ACCEPTED } from "./organization";
@ -40,10 +42,23 @@ import {
ACTION_ADD_SECRETS,
ACTION_UPDATE_SECRETS,
ACTION_DELETE_SECRETS,
ACTION_READ_SECRETS,
} from "./action";
import { SMTP_HOST_SENDGRID, SMTP_HOST_MAILGUN } from "./smtp";
import { PLAN_STARTER, PLAN_PRO } from "./stripe";
ACTION_READ_SECRETS
} from './action';
import {
SMTP_HOST_SENDGRID,
SMTP_HOST_MAILGUN,
SMTP_HOST_SOCKETLABS
} from './smtp';
import { PLAN_STARTER, PLAN_PRO } from './stripe';
import {
MFA_METHOD_EMAIL
} from './user';
import {
TOKEN_EMAIL_CONFIRMATION,
TOKEN_EMAIL_MFA,
TOKEN_EMAIL_ORG_INVITATION,
TOKEN_EMAIL_PASSWORD_RESET
} from './token';
export {
OWNER,
@ -68,6 +83,7 @@ export {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
INTEGRATION_SET,
INTEGRATION_OAUTH2,
INTEGRATION_AZURE_TOKEN_URL,
@ -81,6 +97,7 @@ export {
INTEGRATION_RENDER_API_URL,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_TRAVISCI_API_URL,
EVENT_PUSH_SECRETS,
EVENT_PULL_SECRETS,
ACTION_LOGIN,
@ -92,6 +109,12 @@ export {
INTEGRATION_OPTIONS,
SMTP_HOST_SENDGRID,
SMTP_HOST_MAILGUN,
SMTP_HOST_SOCKETLABS,
PLAN_STARTER,
PLAN_PRO,
MFA_METHOD_EMAIL,
TOKEN_EMAIL_CONFIRMATION,
TOKEN_EMAIL_MFA,
TOKEN_EMAIL_ORG_INVITATION,
TOKEN_EMAIL_PASSWORD_RESET
};

View File

@ -20,6 +20,7 @@ const INTEGRATION_GITHUB = "github";
const INTEGRATION_RENDER = "render";
const INTEGRATION_FLYIO = "flyio";
const INTEGRATION_CIRCLECI = "circleci";
const INTEGRATION_TRAVISCI = "travisci";
const INTEGRATION_SET = new Set([
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
@ -29,6 +30,7 @@ const INTEGRATION_SET = new Set([
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
]);
// integration types
@ -50,6 +52,7 @@ const INTEGRATION_NETLIFY_API_URL = "https://api.netlify.com";
const INTEGRATION_RENDER_API_URL = "https://api.render.com";
const INTEGRATION_FLYIO_API_URL = "https://api.fly.io/graphql";
const INTEGRATION_CIRCLECI_API_URL = "https://circleci.com/api";
const INTEGRATION_TRAVISCI_API_URL = "https://api.travis-ci.com";
const INTEGRATION_OPTIONS = [
{
@ -134,6 +137,15 @@ const INTEGRATION_OPTIONS = [
clientId: '',
docsLink: ''
},
{
name: 'Travis CI',
slug: 'travisci',
image: 'Travis CI.png',
isAvailable: true,
type: 'pat',
clientId: '',
docsLink: ''
},
{
name: 'Azure Key Vault',
slug: 'azure-key-vault',
@ -152,15 +164,6 @@ const INTEGRATION_OPTIONS = [
type: '',
clientId: '',
docsLink: ''
},
{
name: 'Travis CI',
slug: 'travisci',
image: 'Travis CI.png',
isAvailable: false,
type: '',
clientId: '',
docsLink: ''
}
]
@ -175,6 +178,7 @@ export {
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_TRAVISCI,
INTEGRATION_SET,
INTEGRATION_OAUTH2,
INTEGRATION_AZURE_TOKEN_URL,
@ -188,5 +192,6 @@ export {
INTEGRATION_RENDER_API_URL,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_CIRCLECI_API_URL,
INTEGRATION_TRAVISCI_API_URL,
INTEGRATION_OPTIONS,
};

View File

@ -1,7 +1,9 @@
const SMTP_HOST_SENDGRID = 'smtp.sendgrid.net';
const SMTP_HOST_MAILGUN = 'smtp.mailgun.org';
const SMTP_HOST_SOCKETLABS = 'smtp.socketlabs.com';
export {
SMTP_HOST_SENDGRID,
SMTP_HOST_MAILGUN
SMTP_HOST_MAILGUN,
SMTP_HOST_SOCKETLABS
}

View File

@ -0,0 +1,11 @@
const TOKEN_EMAIL_CONFIRMATION = 'emailConfirmation';
const TOKEN_EMAIL_MFA = 'emailMfa';
const TOKEN_EMAIL_ORG_INVITATION = 'organizationInvitation';
const TOKEN_EMAIL_PASSWORD_RESET = 'passwordReset';
export {
TOKEN_EMAIL_CONFIRMATION,
TOKEN_EMAIL_MFA,
TOKEN_EMAIL_ORG_INVITATION,
TOKEN_EMAIL_PASSWORD_RESET
}

View File

@ -0,0 +1,5 @@
const MFA_METHOD_EMAIL = 'email';
export {
MFA_METHOD_EMAIL
}

View File

@ -7,8 +7,8 @@ require (
github.com/muesli/mango-cobra v1.2.0
github.com/muesli/roff v0.1.0
github.com/spf13/cobra v1.6.1
golang.org/x/crypto v0.3.0
golang.org/x/term v0.3.0
golang.org/x/crypto v0.6.0
golang.org/x/term v0.5.0
)
require (
@ -31,8 +31,8 @@ require (
github.com/oklog/ulid v1.3.1 // indirect
github.com/rivo/uniseg v0.2.0 // indirect
go.mongodb.org/mongo-driver v1.10.0 // indirect
golang.org/x/net v0.2.0 // indirect
golang.org/x/sys v0.3.0 // indirect
golang.org/x/net v0.6.0 // indirect
golang.org/x/sys v0.5.0 // indirect
)
require (

View File

@ -106,10 +106,14 @@ go.mongodb.org/mongo-driver v1.10.0/go.mod h1:wsihk0Kdgv8Kqu1Anit4sfK+22vSFbUrAV
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
golang.org/x/crypto v0.3.0 h1:a06MkbcxBrEFc0w0QIZWXrH/9cCX6KJyWbBOIwAn+7A=
golang.org/x/crypto v0.3.0/go.mod h1:hebNnKkNXi2UzZN1eVRvBB7co0a+JxK6XbPiWVs/3J4=
golang.org/x/crypto v0.6.0 h1:qfktjS5LUO+fFKeJXZ+ikTRijMmljikvG68fpMMruSc=
golang.org/x/crypto v0.6.0/go.mod h1:OFC/31mSvZgRz0V1QTNCzfAI1aIRzbiufJtkMIlEp58=
golang.org/x/net v0.0.0-20211029224645-99673261e6eb/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.2.0 h1:sZfSu1wtKLGlWI4ZZayP0ck9Y73K1ynO6gqzTdBVdPU=
golang.org/x/net v0.2.0/go.mod h1:KqCZLdyyvdV855qA2rE3GC2aiw5xGR5TEjj8smXukLY=
golang.org/x/net v0.6.0 h1:L4ZwwTvKW9gr0ZMS1yrHD9GZhIuVjOBBnaKH+SPQK0Q=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/sync v0.0.0-20210220032951-036812b2e83c/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sys v0.0.0-20181122145206-62eef0e2fa9b/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@ -123,9 +127,13 @@ golang.org/x/sys v0.0.0-20220310020820-b874c991c1a5/go.mod h1:oPkhp1MJrh7nUepCBc
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.3.0 h1:w8ZOecv6NaNa/zC8944JTU3vz4u6Lagfk4RPQxv92NQ=
golang.org/x/sys v0.3.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0 h1:MUK/U/4lj1t1oPg0HfuXDN/Z1wv31ZJ/YcPiGccS4DU=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.3.0 h1:qoo4akIqOcDME5bhc/NgxUdovd6BSS2uMsVjB56q1xI=
golang.org/x/term v0.3.0/go.mod h1:q750SLmJuPmVoN1blW3UFBPREJfb1KmY3vwxfr+nFDA=
golang.org/x/term v0.5.0 h1:n2a8QNdAb0sZNpU9R1ALUXBbY+w51fCQDN+7EdxNBsY=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=

View File

@ -128,6 +128,68 @@ func CallGetSecretsV2(httpClient *resty.Client, request GetEncryptedSecretsV2Req
return secretsResponse, nil
}
func CallLogin1V2(httpClient *resty.Client, request GetLoginOneV2Request) (GetLoginOneV2Response, error) {
var loginOneV2Response GetLoginOneV2Response
response, err := httpClient.
R().
SetResult(&loginOneV2Response).
SetHeader("User-Agent", USER_AGENT).
SetBody(request).
Post(fmt.Sprintf("%v/v2/auth/login1", config.INFISICAL_URL))
if err != nil {
return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V2: Unable to complete api request [err=%s]", err)
}
if response.IsError() {
return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V2: Unsuccessful response: [response=%s]", response)
}
return loginOneV2Response, nil
}
func CallVerifyMfaToken(httpClient *resty.Client, request VerifyMfaTokenRequest) (*VerifyMfaTokenResponse, *VerifyMfaTokenErrorResponse, error) {
var verifyMfaTokenResponse VerifyMfaTokenResponse
var responseError VerifyMfaTokenErrorResponse
response, err := httpClient.
R().
SetResult(&verifyMfaTokenResponse).
SetHeader("User-Agent", USER_AGENT).
SetError(&responseError).
SetBody(request).
Post(fmt.Sprintf("%v/v2/auth/mfa/verify", config.INFISICAL_URL))
if err != nil {
return nil, nil, fmt.Errorf("CallVerifyMfaToken: Unable to complete api request [err=%s]", err)
}
if response.IsError() {
return nil, &responseError, nil
}
return &verifyMfaTokenResponse, nil, nil
}
func CallLogin2V2(httpClient *resty.Client, request GetLoginTwoV2Request) (GetLoginTwoV2Response, error) {
var loginTwoV2Response GetLoginTwoV2Response
response, err := httpClient.
R().
SetResult(&loginTwoV2Response).
SetHeader("User-Agent", USER_AGENT).
SetBody(request).
Post(fmt.Sprintf("%v/v2/auth/login2", config.INFISICAL_URL))
if err != nil {
return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V2: Unable to complete api request [err=%s]", err)
}
if response.IsError() {
return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V2: Unsuccessful response: [response=%s]", response)
}
return loginTwoV2Response, nil
}
func CallGetAllWorkSpacesUserBelongsTo(httpClient *resty.Client) (GetWorkSpacesResponse, error) {
var workSpacesResponse GetWorkSpacesResponse
response, err := httpClient.

View File

@ -263,3 +263,63 @@ type GetAccessibleEnvironmentsResponse struct {
IsWriteDenied bool `json:"isWriteDenied"`
} `json:"accessibleEnvironments"`
}
type GetLoginOneV2Request struct {
Email string `json:"email"`
ClientPublicKey string `json:"clientPublicKey"`
}
type GetLoginOneV2Response struct {
ServerPublicKey string `json:"serverPublicKey"`
Salt string `json:"salt"`
}
type GetLoginTwoV2Request struct {
Email string `json:"email"`
ClientProof string `json:"clientProof"`
}
type GetLoginTwoV2Response struct {
MfaEnabled bool `json:"mfaEnabled"`
EncryptionVersion int `json:"encryptionVersion"`
Token string `json:"token"`
PublicKey string `json:"publicKey"`
EncryptedPrivateKey string `json:"encryptedPrivateKey"`
Iv string `json:"iv"`
Tag string `json:"tag"`
ProtectedKey string `json:"protectedKey"`
ProtectedKeyIV string `json:"protectedKeyIV"`
ProtectedKeyTag string `json:"protectedKeyTag"`
}
type VerifyMfaTokenRequest struct {
Email string `json:"email"`
MFAToken string `json:"mfaToken"`
}
type VerifyMfaTokenResponse struct {
EncryptionVersion int `json:"encryptionVersion"`
Token string `json:"token"`
PublicKey string `json:"publicKey"`
EncryptedPrivateKey string `json:"encryptedPrivateKey"`
Iv string `json:"iv"`
Tag string `json:"tag"`
ProtectedKey string `json:"protectedKey"`
ProtectedKeyIV string `json:"protectedKeyIV"`
ProtectedKeyTag string `json:"protectedKeyTag"`
}
type VerifyMfaTokenErrorResponse struct {
Type string `json:"type"`
Message string `json:"message"`
Context struct {
Code string `json:"code"`
TriesLeft int `json:"triesLeft"`
} `json:"context"`
Level int `json:"level"`
LevelName string `json:"level_name"`
StatusCode int `json:"status_code"`
DatetimeIso time.Time `json:"datetime_iso"`
Application string `json:"application"`
Extra []interface{} `json:"extra"`
}

View File

@ -0,0 +1,49 @@
package cmd
import (
"testing"
"github.com/Infisical/infisical-merge/packages/models"
)
func TestFilterReservedEnvVars(t *testing.T) {
// some test env vars.
// HOME and PATH are reserved key words and should be filtered out
// XDG_SESSION_ID and LC_CTYPE are reserved key word prefixes and should be filtered out
// The filter function only checks the keys of the env map, so we don't need to set any values
env := map[string]models.SingleEnvironmentVariable{
"test": {},
"test2": {},
"HOME": {},
"PATH": {},
"XDG_SESSION_ID": {},
"LC_CTYPE": {},
}
// check to see if there are any reserved key words in secrets to inject
filterReservedEnvVars(env)
if len(env) != 2 {
t.Errorf("Expected 2 secrets to be returned, got %d", len(env))
}
if _, ok := env["test"]; !ok {
t.Errorf("Expected test to be returned")
}
if _, ok := env["test2"]; !ok {
t.Errorf("Expected test2 to be returned")
}
if _, ok := env["HOME"]; ok {
t.Errorf("Expected HOME to be filtered out")
}
if _, ok := env["PATH"]; ok {
t.Errorf("Expected PATH to be filtered out")
}
if _, ok := env["XDG_SESSION_ID"]; ok {
t.Errorf("Expected XDG_SESSION_ID to be filtered out")
}
if _, ok := env["LC_CTYPE"]; ok {
t.Errorf("Expected LC_CTYPE to be filtered out")
}
}

View File

@ -36,9 +36,12 @@ var exportCmd = &cobra.Command{
// util.RequireLocalWorkspaceFile()
},
Run: func(cmd *cobra.Command, args []string) {
envName, err := cmd.Flags().GetString("env")
if err != nil {
util.HandleError(err)
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
shouldExpandSecrets, err := cmd.Flags().GetBool("expand")
@ -66,7 +69,7 @@ var exportCmd = &cobra.Command{
util.HandleError(err, "Unable to parse flag")
}
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: envName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
if err != nil {
util.HandleError(err, "Unable to fetch secrets")
}

View File

@ -91,7 +91,7 @@ func writeWorkspaceFile(selectedWorkspace models.Workspace) error {
WorkspaceId: selectedWorkspace.ID,
}
marshalledWorkspaceFile, err := json.Marshal(workspaceFileToSave)
marshalledWorkspaceFile, err := json.MarshalIndent(workspaceFileToSave, "", " ")
if err != nil {
return err
}

View File

@ -13,7 +13,6 @@ import (
"regexp"
"github.com/Infisical/infisical-merge/packages/api"
"github.com/Infisical/infisical-merge/packages/config"
"github.com/Infisical/infisical-merge/packages/crypto"
"github.com/Infisical/infisical-merge/packages/models"
"github.com/Infisical/infisical-merge/packages/srp"
@ -23,8 +22,17 @@ import (
"github.com/manifoldco/promptui"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
"golang.org/x/crypto/argon2"
)
type params struct {
memory uint32
iterations uint32
parallelism uint8
saltLength uint32
keyLength uint32
}
// loginCmd represents the login command
var loginCmd = &cobra.Command{
Use: "login",
@ -55,36 +63,158 @@ var loginCmd = &cobra.Command{
util.HandleError(err, "Unable to parse email and password for authentication")
}
userCredentials, err := getFreshUserCredentials(email, password)
loginOneResponse, loginTwoResponse, err := getFreshUserCredentials(email, password)
if err != nil {
log.Infoln("Unable to authenticate with the provided credentials, please try again")
log.Debugln(err)
return
}
encryptedPrivateKey, _ := base64.StdEncoding.DecodeString(userCredentials.EncryptedPrivateKey)
tag, err := base64.StdEncoding.DecodeString(userCredentials.Tag)
if err != nil {
util.HandleError(err)
if loginTwoResponse.MfaEnabled {
i := 1
for i < 6 {
mfaVerifyCode := askForMFACode()
httpClient := resty.New()
httpClient.SetAuthToken(loginTwoResponse.Token)
verifyMFAresponse, mfaErrorResponse, requestError := api.CallVerifyMfaToken(httpClient, api.VerifyMfaTokenRequest{
Email: email,
MFAToken: mfaVerifyCode,
})
if requestError != nil {
util.HandleError(err)
break
} else if mfaErrorResponse != nil {
if mfaErrorResponse.Context.Code == "mfa_invalid" {
msg := fmt.Sprintf("Incorrect verification code. You have %v attempts left", 5-i)
fmt.Println(msg)
if i == 5 {
util.PrintErrorMessageAndExit("No tries left, please try again in a bit")
break
}
}
if mfaErrorResponse.Context.Code == "mfa_expired" {
util.PrintErrorMessageAndExit("Your 2FA verification code has expired, please try logging in again")
break
}
i++
} else {
loginTwoResponse.EncryptedPrivateKey = verifyMFAresponse.EncryptedPrivateKey
loginTwoResponse.EncryptionVersion = verifyMFAresponse.EncryptionVersion
loginTwoResponse.Iv = verifyMFAresponse.Iv
loginTwoResponse.ProtectedKey = verifyMFAresponse.ProtectedKey
loginTwoResponse.ProtectedKeyIV = verifyMFAresponse.ProtectedKeyIV
loginTwoResponse.ProtectedKeyTag = verifyMFAresponse.ProtectedKeyTag
loginTwoResponse.PublicKey = verifyMFAresponse.PublicKey
loginTwoResponse.Tag = verifyMFAresponse.Tag
loginTwoResponse.Token = verifyMFAresponse.Token
loginTwoResponse.EncryptionVersion = verifyMFAresponse.EncryptionVersion
break
}
}
}
IV, err := base64.StdEncoding.DecodeString(userCredentials.IV)
if err != nil {
util.HandleError(err)
var decryptedPrivateKey []byte
if loginTwoResponse.EncryptionVersion == 1 {
log.Debug("Login version 1")
encryptedPrivateKey, _ := base64.StdEncoding.DecodeString(loginTwoResponse.EncryptedPrivateKey)
tag, err := base64.StdEncoding.DecodeString(loginTwoResponse.Tag)
if err != nil {
util.HandleError(err)
}
IV, err := base64.StdEncoding.DecodeString(loginTwoResponse.Iv)
if err != nil {
util.HandleError(err)
}
paddedPassword := fmt.Sprintf("%032s", password)
key := []byte(paddedPassword)
computedDecryptedPrivateKey, err := crypto.DecryptSymmetric(key, encryptedPrivateKey, tag, IV)
if err != nil || len(computedDecryptedPrivateKey) == 0 {
util.HandleError(err)
}
decryptedPrivateKey = computedDecryptedPrivateKey
} else if loginTwoResponse.EncryptionVersion == 2 {
log.Debug("Login version 2")
protectedKey, err := base64.StdEncoding.DecodeString(loginTwoResponse.ProtectedKey)
if err != nil {
util.HandleError(err)
}
protectedKeyTag, err := base64.StdEncoding.DecodeString(loginTwoResponse.ProtectedKeyTag)
if err != nil {
util.HandleError(err)
}
protectedKeyIV, err := base64.StdEncoding.DecodeString(loginTwoResponse.ProtectedKeyIV)
if err != nil {
util.HandleError(err)
}
nonProtectedTag, err := base64.StdEncoding.DecodeString(loginTwoResponse.Tag)
if err != nil {
util.HandleError(err)
}
nonProtectedIv, err := base64.StdEncoding.DecodeString(loginTwoResponse.Iv)
if err != nil {
util.HandleError(err)
}
parameters := &params{
memory: 64 * 1024,
iterations: 3,
parallelism: 1,
keyLength: 32,
}
derivedKey, err := generateFromPassword(password, []byte(loginOneResponse.Salt), parameters)
if err != nil {
util.HandleError(fmt.Errorf("unable to generate argon hash from password [err=%s]", err))
}
decryptedProtectedKey, err := crypto.DecryptSymmetric(derivedKey, protectedKey, protectedKeyTag, protectedKeyIV)
if err != nil {
util.HandleError(fmt.Errorf("unable to get decrypted protected key [err=%s]", err))
}
encryptedPrivateKey, err := base64.StdEncoding.DecodeString(loginTwoResponse.EncryptedPrivateKey)
if err != nil {
util.HandleError(err)
}
decryptedProtectedKeyInHex, err := hex.DecodeString(string(decryptedProtectedKey))
if err != nil {
util.HandleError(err)
}
computedDecryptedPrivateKey, err := crypto.DecryptSymmetric(decryptedProtectedKeyInHex, encryptedPrivateKey, nonProtectedTag, nonProtectedIv)
if err != nil {
util.HandleError(err)
}
decryptedPrivateKey = computedDecryptedPrivateKey
} else {
util.PrintErrorMessageAndExit("Insufficient details to decrypt private key")
}
paddedPassword := fmt.Sprintf("%032s", password)
key := []byte(paddedPassword)
decryptedPrivateKey, err := crypto.DecryptSymmetric(key, encryptedPrivateKey, tag, IV)
if err != nil || len(decryptedPrivateKey) == 0 {
util.HandleError(err)
if string(decryptedPrivateKey) == "" || email == "" || loginTwoResponse.Token == "" {
log.Debugf("[decryptedPrivateKey=%s] [email=%s] [loginTwoResponse.Token=%s]", string(decryptedPrivateKey), email, loginTwoResponse.Token)
util.PrintErrorMessageAndExit("We were unable to fetch required details to complete your login. Run with -d to see more info")
}
userCredentialsToBeStored := &models.UserCredentials{
Email: email,
PrivateKey: string(decryptedPrivateKey),
JTWToken: userCredentials.JTWToken,
JTWToken: loginTwoResponse.Token,
}
err = util.StoreUserCredsInKeyRing(userCredentialsToBeStored)
@ -155,7 +285,7 @@ func askForLoginCredentials() (email string, password string, err error) {
return userEmail, userPassword, nil
}
func getFreshUserCredentials(email string, password string) (*api.LoginTwoResponse, error) {
func getFreshUserCredentials(email string, password string) (*api.GetLoginOneV2Response, *api.GetLoginTwoV2Response, error) {
log.Debugln("getFreshUserCredentials:", "email", email, "password", password)
httpClient := resty.New()
httpClient.SetRetryCount(5)
@ -166,36 +296,24 @@ func getFreshUserCredentials(email string, password string) (*api.LoginTwoRespon
srpA := hex.EncodeToString(srpClient.ComputeA())
// ** Login one
loginOneRequest := api.LoginOneRequest{
loginOneResponseResult, err := api.CallLogin1V2(httpClient, api.GetLoginOneV2Request{
Email: email,
ClientPublicKey: srpA,
}
var loginOneResponseResult api.LoginOneResponse
loginOneResponse, err := httpClient.
R().
SetBody(loginOneRequest).
SetResult(&loginOneResponseResult).
Post(fmt.Sprintf("%v/v1/auth/login1", config.INFISICAL_URL))
})
if err != nil {
return nil, err
}
if loginOneResponse.StatusCode() > 299 {
return nil, fmt.Errorf("ops, unsuccessful response code. [response=%v]", loginOneResponse)
util.HandleError(err)
}
// **** Login 2
serverPublicKey_bytearray, err := hex.DecodeString(loginOneResponseResult.ServerPublicKey)
if err != nil {
return nil, err
return nil, nil, err
}
userSalt, err := hex.DecodeString(loginOneResponseResult.ServerSalt)
userSalt, err := hex.DecodeString(loginOneResponseResult.Salt)
if err != nil {
return nil, err
return nil, nil, err
}
srpClient.SetSalt(userSalt, []byte(email), []byte(password))
@ -203,27 +321,16 @@ func getFreshUserCredentials(email string, password string) (*api.LoginTwoRespon
srpM1 := srpClient.ComputeM1()
LoginTwoRequest := api.LoginTwoRequest{
loginTwoResponseResult, err := api.CallLogin2V2(httpClient, api.GetLoginTwoV2Request{
Email: email,
ClientProof: hex.EncodeToString(srpM1),
}
var loginTwoResponseResult api.LoginTwoResponse
loginTwoResponse, err := httpClient.
R().
SetBody(LoginTwoRequest).
SetResult(&loginTwoResponseResult).
Post(fmt.Sprintf("%v/v1/auth/login2", config.INFISICAL_URL))
})
if err != nil {
return nil, err
util.HandleError(err)
}
if loginTwoResponse.StatusCode() > 299 {
return nil, fmt.Errorf("ops, unsuccessful response code. [response=%v]", loginTwoResponse)
}
return &loginTwoResponseResult, nil
return &loginOneResponseResult, &loginTwoResponseResult, nil
}
func shouldOverrideLoginPrompt(currentLoggedInUserEmail string) (bool, error) {
@ -237,3 +344,21 @@ func shouldOverrideLoginPrompt(currentLoggedInUserEmail string) (bool, error) {
}
return result == "Yes", err
}
func generateFromPassword(password string, salt []byte, p *params) (hash []byte, err error) {
hash = argon2.IDKey([]byte(password), salt, p.iterations, p.memory, p.parallelism, p.keyLength)
return hash, nil
}
func askForMFACode() string {
mfaCodePromptUI := promptui.Prompt{
Label: "Enter the 2FA verification code sent to your email",
}
mfaVerifyCode, err := mfaCodePromptUI.Run()
if err != nil {
util.HandleError(err)
}
return mfaVerifyCode
}

View File

@ -54,9 +54,12 @@ var runCmd = &cobra.Command{
return nil
},
Run: func(cmd *cobra.Command, args []string) {
envName, err := cmd.Flags().GetString("env")
if err != nil {
util.HandleError(err, "Unable to parse flag")
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
infisicalToken, err := cmd.Flags().GetString("token")
@ -79,7 +82,7 @@ var runCmd = &cobra.Command{
util.HandleError(err, "Unable to parse flag")
}
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: envName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
if err != nil {
util.HandleError(err, "Could not fetch secrets", "If you are using a service token to fetch secrets, please ensure it is valid")
@ -107,13 +110,7 @@ var runCmd = &cobra.Command{
}
// check to see if there are any reserved key words in secrets to inject
reservedEnvironmentVariables := []string{"HOME", "PATH", "PS1", "PS2"}
for _, reservedEnvName := range reservedEnvironmentVariables {
if _, ok := secretsByKey[reservedEnvName]; ok {
delete(secretsByKey, reservedEnvName)
util.PrintWarning(fmt.Sprintf("Infisical secret named [%v] has been removed because it is a reserved secret name", reservedEnvName))
}
}
filterReservedEnvVars(secretsByKey)
// now add infisical secrets
for k, v := range secretsByKey {
@ -146,6 +143,37 @@ var runCmd = &cobra.Command{
},
}
var (
reservedEnvVars = []string{
"HOME", "PATH", "PS1", "PS2",
"PWD", "EDITOR", "XAUTHORITY", "USER",
"TERM", "TERMINFO", "SHELL", "MAIL",
}
reservedEnvVarPrefixes = []string{
"XDG_",
"LC_",
}
)
func filterReservedEnvVars(env map[string]models.SingleEnvironmentVariable) {
for _, reservedEnvName := range reservedEnvVars {
if _, ok := env[reservedEnvName]; ok {
delete(env, reservedEnvName)
util.PrintWarning(fmt.Sprintf("Infisical secret named [%v] has been removed because it is a reserved secret name", reservedEnvName))
}
}
for _, reservedEnvPrefix := range reservedEnvVarPrefixes {
for envName := range env {
if strings.HasPrefix(envName, reservedEnvPrefix) {
delete(env, envName)
util.PrintWarning(fmt.Sprintf("Infisical secret named [%v] has been removed because it contains a reserved prefix", envName))
}
}
}
}
func init() {
rootCmd.AddCommand(runCmd)
runCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")
@ -158,11 +186,14 @@ func init() {
// Will execute a single command and pass in the given secrets into the process
func executeSingleCommandWithEnvs(args []string, secretsCount int, env []string) error {
command := args[0]
argsForCommand := args[1:]
shell := subShellCmd()
color.Green("Injecting %v Infisical secrets into your application process", secretsCount)
cmd := exec.Command(command, argsForCommand...)
args = append(args[:1], args[0:]...) // shift args to the right
args[0] = shell[1]
cmd := exec.Command(shell[0], args...)
cmd.Stdin = os.Stdin
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
@ -172,15 +203,7 @@ func executeSingleCommandWithEnvs(args []string, secretsCount int, env []string)
}
func executeMultipleCommandWithEnvs(fullCommand string, secretsCount int, env []string) error {
shell := [2]string{"sh", "-c"}
if runtime.GOOS == "windows" {
shell = [2]string{"cmd", "/C"}
} else {
currentShell := os.Getenv("SHELL")
if currentShell != "" {
shell[0] = currentShell
}
}
shell := subShellCmd()
cmd := exec.Command(shell[0], shell[1], fullCommand)
cmd.Stdin = os.Stdin
@ -194,6 +217,23 @@ func executeMultipleCommandWithEnvs(fullCommand string, secretsCount int, env []
return execCmd(cmd)
}
func subShellCmd() [2]string {
// default to sh -c
shell := [...]string{"sh", "-c"}
currentShell := os.Getenv("SHELL")
if currentShell != "" {
shell[0] = currentShell
} else if runtime.GOOS == "windows" {
// if the SHELL env var is not set and we're on Windows, use cmd.exe
// The SHELL var should always be checked first, in case the user executes
// infisical from something like Git Bash.
return [...]string{"cmd", "/C"}
}
return shell
}
// Credit: inspired by AWS Vault
func execCmd(cmd *exec.Cmd) error {
sigChannel := make(chan os.Signal, 1)

View File

@ -19,7 +19,6 @@ import (
"github.com/Infisical/infisical-merge/packages/util"
"github.com/Infisical/infisical-merge/packages/visualize"
"github.com/go-resty/resty/v2"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
)
@ -31,9 +30,12 @@ var secretsCmd = &cobra.Command{
PreRun: toggleDebug,
Args: cobra.NoArgs,
Run: func(cmd *cobra.Command, args []string) {
environmentName, err := cmd.Flags().GetString("env")
if err != nil {
util.HandleError(err)
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
infisicalToken, err := cmd.Flags().GetString("token")
@ -94,9 +96,12 @@ var secretsSetCmd = &cobra.Command{
Run: func(cmd *cobra.Command, args []string) {
util.RequireLocalWorkspaceFile()
environmentName, err := cmd.Flags().GetString("env")
if err != nil {
util.HandleError(err, "Unable to parse flag")
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
workspaceFile, err := util.GetWorkSpaceFromFile()
@ -270,11 +275,12 @@ var secretsDeleteCmd = &cobra.Command{
PreRun: toggleDebug,
Args: cobra.MinimumNArgs(1),
Run: func(cmd *cobra.Command, args []string) {
environmentName, err := cmd.Flags().GetString("env")
if err != nil {
log.Errorln("Unable to parse the environment name flag")
log.Debugln(err)
return
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
@ -330,9 +336,12 @@ var secretsDeleteCmd = &cobra.Command{
}
func getSecretsByNames(cmd *cobra.Command, args []string) {
environmentName, err := cmd.Flags().GetString("env")
if err != nil {
util.HandleError(err, "Unable to parse flag")
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
infisicalToken, err := cmd.Flags().GetString("token")
@ -352,10 +361,7 @@ func getSecretsByNames(cmd *cobra.Command, args []string) {
requestedSecrets := []models.SingleEnvironmentVariable{}
secretsMap := make(map[string]models.SingleEnvironmentVariable)
for _, secret := range secrets {
secretsMap[secret.Key] = secret
}
secretsMap := getSecretsByKeys(secrets)
for _, secretKeyFromArg := range args {
if value, ok := secretsMap[strings.ToUpper(secretKeyFromArg)]; ok {
@ -373,9 +379,12 @@ func getSecretsByNames(cmd *cobra.Command, args []string) {
}
func generateExampleEnv(cmd *cobra.Command, args []string) {
environmentName, err := cmd.Flags().GetString("env")
if err != nil {
util.HandleError(err, "Unable to parse flag")
environmentName, _ := cmd.Flags().GetString("env")
if !cmd.Flags().Changed("env") {
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
if environmentFromWorkspace != "" {
environmentName = environmentFromWorkspace
}
}
infisicalToken, err := cmd.Flags().GetString("token")
@ -575,7 +584,7 @@ func addHash(input string) string {
}
func getSecretsByKeys(secrets []models.SingleEnvironmentVariable) map[string]models.SingleEnvironmentVariable {
secretMapByName := make(map[string]models.SingleEnvironmentVariable)
secretMapByName := make(map[string]models.SingleEnvironmentVariable, len(secrets))
for _, secret := range secrets {
secretMapByName[secret.Key] = secret

View File

@ -39,8 +39,9 @@ type Workspace struct {
}
type WorkspaceConfigFile struct {
WorkspaceId string `json:"workspaceId"`
DefaultEnvironment string `json:"defaultEnvironment"`
WorkspaceId string `json:"workspaceId"`
DefaultEnvironment string `json:"defaultEnvironment"`
GitBranchToEnvironmentMapping map[string]string `json:"gitBranchToEnvironmentMapping"`
}
type SymmetricEncryptionResult struct {
@ -50,7 +51,8 @@ type SymmetricEncryptionResult struct {
}
type GetAllSecretsParameters struct {
Environment string
InfisicalToken string
TagSlugs string
Environment string
EnvironmentPassedViaFlag bool
InfisicalToken string
TagSlugs string
}

View File

@ -5,6 +5,7 @@ import (
"errors"
"fmt"
"os"
"path/filepath"
"github.com/Infisical/infisical-merge/packages/models"
log "github.com/sirupsen/logrus"
@ -73,7 +74,12 @@ func WorkspaceConfigFileExistsInCurrentPath() bool {
}
func GetWorkSpaceFromFile() (models.WorkspaceConfigFile, error) {
configFileAsBytes, err := os.ReadFile(INFISICAL_WORKSPACE_CONFIG_FILE_NAME)
cfgFile, err := FindWorkspaceConfigFile()
if err != nil {
return models.WorkspaceConfigFile{}, err
}
configFileAsBytes, err := os.ReadFile(cfgFile)
if err != nil {
return models.WorkspaceConfigFile{}, err
}
@ -87,6 +93,37 @@ func GetWorkSpaceFromFile() (models.WorkspaceConfigFile, error) {
return workspaceConfigFile, nil
}
// FindWorkspaceConfigFile searches for a .infisical.json file in the current directory and all parent directories.
func FindWorkspaceConfigFile() (string, error) {
dir, err := os.Getwd()
if err != nil {
return "", err
}
for {
path := filepath.Join(dir, INFISICAL_WORKSPACE_CONFIG_FILE_NAME)
_, err := os.Stat(path)
if err == nil {
// file found
log.Debugf("FindWorkspaceConfigFile: workspace file found at [path=%s]", path)
return path, nil
}
// check if we have reached the root directory
if dir == filepath.Dir(dir) {
break
}
// move up one directory
dir = filepath.Dir(dir)
}
// file not found
return "", fmt.Errorf("file not found: %s", INFISICAL_WORKSPACE_CONFIG_FILE_NAME)
}
func GetFullConfigFilePath() (fullPathToFile string, fullPathToDirectory string, err error) {
homeDir, err := GetHomeDir()
if err != nil {

View File

@ -1,10 +1,14 @@
package util
import (
"bytes"
"crypto/sha256"
"encoding/base64"
"fmt"
"os"
"os/exec"
"path"
"strings"
)
type DecodedSymmetricEncryptionDetails = struct {
@ -85,8 +89,8 @@ func RequireServiceToken() {
}
func RequireLocalWorkspaceFile() {
workspaceFileExists := WorkspaceConfigFileExistsInCurrentPath()
if !workspaceFileExists {
workspaceFilePath, _ := FindWorkspaceConfigFile()
if workspaceFilePath == "" {
PrintErrorMessageAndExit("It looks you have not yet connected this project to Infisical", "To do so, run [infisical init] then run your command again")
}
@ -110,3 +114,26 @@ func GetHashFromStringList(list []string) string {
sum := sha256.Sum256(hash.Sum(nil))
return fmt.Sprintf("%x", sum)
}
// execCmd is a struct that holds the command and arguments to be executed.
// By using this struct, we can easily mock the command and arguments.
type execCmd struct {
cmd string
args []string
}
var getCurrentBranchCmd = execCmd{
cmd: "git",
args: []string{"symbolic-ref", "--short", "HEAD"},
}
func getCurrentBranch() (string, error) {
cmd := exec.Command(getCurrentBranchCmd.cmd, getCurrentBranchCmd.args...)
var out bytes.Buffer
cmd.Stdout = &out
err := cmd.Run()
if err != nil {
return "", err
}
return path.Base(strings.TrimSpace(out.String())), nil
}

View File

@ -131,10 +131,6 @@ func GetAllEnvironmentVariables(params models.GetAllSecretsParameters) ([]models
return nil, err
}
if workspaceFile.DefaultEnvironment != "" {
params.Environment = workspaceFile.DefaultEnvironment
}
// Verify environment
err = ValidateEnvironmentName(params.Environment, workspaceFile.WorkspaceId, loggedInUserDetails.UserCredentials)
if err != nil {
@ -485,3 +481,35 @@ func DeleteBackupSecrets() error {
return os.RemoveAll(fullPathToSecretsBackupFolder)
}
func GetEnvFromWorkspaceFile() string {
workspaceFile, err := GetWorkSpaceFromFile()
if err != nil {
log.Debugf("getEnvFromWorkspaceFile: [err=%s]", err)
return ""
}
if env := GetEnvelopmentBasedOnGitBranch(workspaceFile); env != "" {
return env
}
return workspaceFile.DefaultEnvironment
}
func GetEnvelopmentBasedOnGitBranch(workspaceFile models.WorkspaceConfigFile) string {
branch, err := getCurrentBranch()
if err != nil {
log.Debugf("getEnvelopmentBasedOnGitBranch: [err=%s]", err)
}
envBasedOnGitBranch, ok := workspaceFile.GitBranchToEnvironmentMapping[branch]
log.Debugf("GetEnvelopmentBasedOnGitBranch: [envBasedOnGitBranch=%s] [ok=%t]", envBasedOnGitBranch, ok)
if err == nil && ok {
return envBasedOnGitBranch
} else {
log.Debugf("getEnvelopmentBasedOnGitBranch: [err=%s]", err)
return ""
}
}

View File

@ -1,6 +1,9 @@
package util
import (
"io"
"os"
"path"
"testing"
"github.com/Infisical/infisical-merge/packages/models"
@ -158,3 +161,98 @@ func Test_SubstituteSecrets_When_No_SubstituteNeeded(t *testing.T) {
}
}
}
func Test_Read_Env_From_File(t *testing.T) {
type testCase struct {
TestFile string
ExpectedEnv string
}
var cases = []testCase{
{
TestFile: "testdata/infisical-default-env.json",
ExpectedEnv: "myDefaultEnv",
},
{
TestFile: "testdata/infisical-branch-env.json",
ExpectedEnv: "myMainEnv",
},
{
TestFile: "testdata/infisical-no-matching-branch-env.json",
ExpectedEnv: "myDefaultEnv",
},
}
// create a tmp directory for testing
testDir, err := os.MkdirTemp(os.TempDir(), "infisical-test")
if err != nil {
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to create temp directory: %s", err)
}
// save the current working directory
originalDir, err := os.Getwd()
if err != nil {
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to get current working directory: %s", err)
}
// backup the original git command
originalGitCmd := getCurrentBranchCmd
// make sure to clean up after the test
t.Cleanup(func() {
os.Chdir(originalDir)
os.RemoveAll(testDir)
getCurrentBranchCmd = originalGitCmd
})
// mock the git command to return "main" as the current branch
getCurrentBranchCmd = execCmd{cmd: "echo", args: []string{"main"}}
for _, c := range cases {
// make sure we start in the original directory
err = os.Chdir(originalDir)
if err != nil {
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to change working directory: %s", err)
}
// remove old test file if it exists
err = os.Remove(path.Join(testDir, INFISICAL_WORKSPACE_CONFIG_FILE_NAME))
if err != nil && !os.IsNotExist(err) {
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to remove old test file: %s", err)
}
// deploy the test file
copyTestFile(t, c.TestFile, path.Join(testDir, INFISICAL_WORKSPACE_CONFIG_FILE_NAME))
// change the working directory to the tmp directory
err = os.Chdir(testDir)
if err != nil {
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to change working directory: %s", err)
}
// get env from file
env := GetEnvFromWorkspaceFile()
if env != c.ExpectedEnv {
t.Errorf("Test_Read_DefaultEnv_From_File: Expected env to be %s but got %s", c.ExpectedEnv, env)
}
}
}
func copyTestFile(t *testing.T, src, dst string) {
srcFile, err := os.Open(src)
if err != nil {
t.Errorf("Test_Read_Env_From_File_By_Branch: Failed to open source file: %s", err)
}
defer srcFile.Close()
dstFile, err := os.Create(dst)
if err != nil {
t.Errorf("Test_Read_Env_From_File_By_Branch: Failed to create destination file: %s", err)
}
defer dstFile.Close()
_, err = io.Copy(dstFile, srcFile)
if err != nil {
t.Errorf("Test_Read_Env_From_File_By_Branch: Failed to copy file: %s", err)
}
}

View File

@ -0,0 +1,7 @@
{
"workspaceId": "12345678",
"defaultEnvironment": "myDefaultEnv",
"gitBranchToEnvironmentMapping": {
"main": "myMainEnv"
}
}

View File

@ -0,0 +1,5 @@
{
"workspaceId": "12345678",
"defaultEnvironment": "myDefaultEnv",
"gitBranchToEnvironmentMapping": null
}

View File

@ -0,0 +1,7 @@
{
"workspaceId": "12345678",
"defaultEnvironment": "myDefaultEnv",
"gitBranchToEnvironmentMapping": {
"notmain": "myMainEnv"
}
}

View File

@ -1,4 +1,4 @@
version: '3'
version: "3"
services:
nginx:
@ -22,7 +22,6 @@ services:
depends_on:
- mongo
image: infisical/backend
command: npm run start
env_file: .env
environment:
- NODE_ENV=production

View File

@ -18,4 +18,8 @@ If you are still experiencing trouble, please seek support.
<Accordion title="Can I fetch secrets with Infisical if I am offline?">
Yes. If you have previously retrieved secrets for a specific project and environment (such as dev, staging, or prod), the `run`/`secret` command will utilize the saved secrets, even when offline, on subsequent fetch attempts.
</Accordion>
<Accordion title="Can I upload the .infisical.json file that was generated?">
Yes. This is simply a configuration file and contains no sensitive data.
</Accordion>

View File

@ -20,7 +20,7 @@ The Infisical CLI provides a way to inject environment variables from the platfo
### Updates
```bash
brew upgrade infisical
brew update && brew upgrade infisical
```
</Tab>

View File

@ -0,0 +1,44 @@
---
title: "Project config file"
description: "Project config file & customization options"
---
To link your local project on your machine with an Infisical project, we suggest using the `infisical init` CLI command. This will generate a `.infisical.json` file in the root directory of your project.
The `.infisical.json` file specifies various parameters, such as the Infisical project to retrieve secrets from, along with other configuration options. You can also define additional properties in the file to tailor your local development experience.
## Set default environment
If you need to change environments while using the CLI, you can do so by including the `--env` flag in your command.
However, this can be inconvenient if you typically work in just one environment.
To simplify the process, you can establish a default environment, which will be used for every command unless you specify otherwise.
```json .infisical.json
{
"workspaceId": "63ee5410a45f7a1ed39ba118",
"defaultEnvironment": "test",
"gitBranchToEnvironmentMapping": null
}
```
### How it works
If both `defaultEnvironment` and `gitBranchToEnvironmentMapping` are configured, `gitBranchToEnvironmentMapping` will take precedence over `defaultEnvironment`.
However, if `gitBranchToEnvironmentMapping` is not set and `defaultEnvironment` is, then `defaultEnvironment` will be used for your Infisical CLI commands.
If you wish to override the `defaultEnvironment`, you can do so by using the `--env` flag explicitly.
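As a rough illustration of this fallback, here is a minimal sketch in Go that mirrors the flag-handling pattern used by the CLI commands in this changeset; the package name and the `resolveEnvironment` helper are assumptions for illustration, not the actual CLI internals.
```go
package cmd

import (
	"github.com/Infisical/infisical-merge/packages/util"
	"github.com/spf13/cobra"
)

// resolveEnvironment is a hypothetical helper: an explicit --env flag always
// wins; otherwise the environment configured in .infisical.json (default or
// branch-mapped) is used when one is set.
func resolveEnvironment(cmd *cobra.Command) string {
	environmentName, _ := cmd.Flags().GetString("env")
	if !cmd.Flags().Changed("env") {
		if env := util.GetEnvFromWorkspaceFile(); env != "" {
			environmentName = env
		}
	}
	return environmentName
}
```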
## Set Infisical environment based on GitHub branch
When fetching your secrets from Infisical, you can switch between environments by using the `--env` flag. However, in certain cases, you may prefer the environment to be automatically mapped based on the current GitHub branch you are working on.
To achieve this, simply add the `gitBranchToEnvironmentMapping` property to your configuration file, as shown below.
```json .infisical.json
{
"workspaceId": "63ee5410a45f7a1ed39ba118",
"gitBranchToEnvironmentMapping": {
"branchName": "dev",
"anotherBranchName": "staging"
}
}
```
### How it works
After configuring this property, every time you use the CLI with that configuration file, it automatically checks whether the current GitHub branch has a corresponding environment mapping.
If it does, the CLI uses that environment to retrieve secrets. You can override this behavior by explicitly passing the `--env` flag, as illustrated in the sketch below.
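The following is a simplified sketch of the resolution order rather than the actual implementation (the `ResolveEnvironmentForBranch` helper is assumed for illustration): a branch mapping takes priority over `defaultEnvironment`, and an explicit `--env` flag, handled by the calling command, overrides both.
```go
package util

// ResolveEnvironmentForBranch is a hypothetical sketch of the precedence
// described above: a matching entry in gitBranchToEnvironmentMapping takes
// priority; otherwise defaultEnvironment is returned.
func ResolveEnvironmentForBranch(currentBranch, defaultEnv string, branchMapping map[string]string) string {
	if env, ok := branchMapping[currentBranch]; ok && env != "" {
		return env
	}
	return defaultEnv
}
```
With the configuration above and the current branch `branchName`, this resolution would select the `dev` environment.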

View File

@ -5,8 +5,6 @@ description: "How to sync your secrets among various 3rd-party services with Inf
Integrations allow environment variables to be synced across your entire infrastructure from local development to CI/CD and production.
We're still relatively early with integrations. 6+ integrations are already available, with more coming soon.
<Card title="View integrations" icon="link" href="/integrations/overview">
View all available integrations and their guides
</Card>

View File

@ -0,0 +1,18 @@
---
title: "MFA"
description: "Secure your Infisical account with MFA"
---
MFA requires users to provide multiple forms of identification to access their account. Currently, this means logging in with your password and a 6-digit code sent to your email.
## Email 2FA
Check the box in Personal Settings > Two-factor Authentication to enable email-based 2FA.
![Email-based MFA](../../images/mfa-email.png)
<Note>
Infisical currently supports email-based 2FA. We're actively working on
building support for other forms of identification via SMS and Authenticator
App.
</Note>

Binary image files added (not shown): 315 KiB, 468 KiB, 332 KiB, 162 KiB, 180 KiB.
Some files were not shown because too many files have changed in this diff.