Mirror of https://github.com/Infisical/infisical.git (synced 2025-04-10 07:25:40 +00:00)
Compare commits
224 Commits
SHA1 | Author | Date | |
---|---|---|---|
aca6269920 | |||
a4074c9687 | |||
205ec61549 | |||
0d16f707c2 | |||
d3d5ead6ed | |||
1f05d6ea4d | |||
ff82af8358 | |||
a7da858694 | |||
b5c2f6e551 | |||
77226e0924 | |||
0cc4286f5f | |||
99144143ff | |||
efff841121 | |||
2f8d914ecb | |||
a89fccdc1f | |||
40ddd3b2a5 | |||
74d17a20a4 | |||
d537bd2f58 | |||
2f045be8a4 | |||
d948923d95 | |||
fb1085744a | |||
ec22291aca | |||
16c49a9626 | |||
06ea809d60 | |||
12364005c1 | |||
98573e9e05 | |||
c1a4ca6203 | |||
21c2fd8542 | |||
b27bc8fc1b | |||
091115e6ba | |||
d9c055872d | |||
f73d18ddc7 | |||
eb47126f68 | |||
4750767268 | |||
b0ed772885 | |||
7fdab81b5f | |||
c17bf13f8c | |||
0e17c9a6db | |||
1c4dd78dea | |||
23418b3a09 | |||
0f143adbde | |||
1f3f4b7900 | |||
2c5f26380e | |||
8f974fb087 | |||
a0722b4ca5 | |||
41e039578a | |||
c89e8e8a96 | |||
cac83ab927 | |||
0f0b894363 | |||
43f9af1bc6 | |||
f5ed14c84c | |||
2dd57d7c73 | |||
0b1891b64a | |||
5614b0f58a | |||
3bb178976d | |||
1777f98aef | |||
45e3706335 | |||
337ed1fc46 | |||
d1ea76e5a0 | |||
4a72d725b1 | |||
1693db3199 | |||
1ff42991b3 | |||
978423ba5b | |||
4d0dc0d7b7 | |||
3817e666a9 | |||
b61350f6a4 | |||
0fb1a1dc6f | |||
9eefc87b7a | |||
53d35757ee | |||
e80e8e00b1 | |||
0b08e574c7 | |||
499323d0e3 | |||
89ad2f163a | |||
7f04617b7d | |||
44904628bc | |||
fafde7b1ad | |||
7e65314670 | |||
df52c56e83 | |||
4276fb54cc | |||
bb5a0db79c | |||
b906048ea1 | |||
7ce9c816c5 | |||
3fef6e4849 | |||
e7ce1e36e7 | |||
734c915206 | |||
783174adc6 | |||
d769db7668 | |||
00e532fce4 | |||
7cf8cba54b | |||
70b26811d9 | |||
e7aafecbc2 | |||
949fb052cd | |||
fcb1f5a51b | |||
e24f70b891 | |||
bd233ebe9b | |||
f92269f2ec | |||
2143db5eb5 | |||
0c72f50b5e | |||
3c4c616242 | |||
153baad49f | |||
75a2ab636c | |||
05a77e612c | |||
d02bc06dce | |||
e1f88f1a7b | |||
86a2647134 | |||
621b640af4 | |||
40c80f417c | |||
7bb2c1c278 | |||
a5278affe6 | |||
2f953192d6 | |||
af64582efd | |||
6ad70f24a2 | |||
8bf8968588 | |||
7e9ce0360a | |||
1d35c41dcb | |||
824315f773 | |||
8a74799d64 | |||
f0f6e8a988 | |||
89bc9a823c | |||
40250b7ecf | |||
2d6d32923d | |||
7cb6aee3f7 | |||
469d042f4b | |||
c38ccdb915 | |||
baaa92427f | |||
1ff2c61b3a | |||
0b356e0e83 | |||
eb55c053eb | |||
07b307e4b1 | |||
5bee6a5e24 | |||
bdc99e34cc | |||
cee10fb507 | |||
74e78bb967 | |||
ea5811c24c | |||
d31b7ae4af | |||
75eac1b972 | |||
c65ce14de3 | |||
f8c4ccd64c | |||
43ce222725 | |||
c7ebeecb6b | |||
243c6ca22e | |||
66f1c57a2a | |||
c0d1495761 | |||
e5f6ed3dc7 | |||
ab62d91b09 | |||
59beabb445 | |||
d5bc377e3d | |||
2bdb20f42f | |||
0062df58a2 | |||
b6bbfc08ad | |||
5baccc73c9 | |||
20e7eae4fe | |||
8432f71d58 | |||
604c22d64d | |||
c1deb08df8 | |||
66f201746f | |||
1c61ffbd36 | |||
e5ba8eb281 | |||
f542e07c33 | |||
1082d7f869 | |||
4a3adaa347 | |||
1659dab87d | |||
d88599714f | |||
71bf56a2b7 | |||
0fba78ad16 | |||
92560f5e1f | |||
0d484b93eb | |||
5f3b8c55b8 | |||
553416689c | |||
b0744fd21d | |||
be38844a5b | |||
54e2b661bc | |||
b81d8eba25 | |||
dbcd2b0988 | |||
1d11f11eaf | |||
f2d7401d1d | |||
91cb9750b4 | |||
3e0d4cb70a | |||
dab677b360 | |||
625c0785b5 | |||
540a8b4201 | |||
11f86da1f6 | |||
ab5ffa9ee6 | |||
65bec23292 | |||
635ae941d7 | |||
a9753fb784 | |||
b587d9b35a | |||
aa68bc05d9 | |||
66566a401f | |||
5aa75ecd3f | |||
0a77f9a0c8 | |||
b5d4cfed03 | |||
c57394bdab | |||
da857f321b | |||
754ea09400 | |||
f28a2ea151 | |||
c7dd028771 | |||
3c94bacda9 | |||
b710944630 | |||
280f482fc8 | |||
e1ad8fbee8 | |||
56ca6039ba | |||
d3fcb69c50 | |||
2db4a29ad7 | |||
4df82a6ff1 | |||
cdf73043e1 | |||
ca07d1c50e | |||
868011479b | |||
13b1805d04 | |||
c233fd8ed1 | |||
30b2b85446 | |||
e53fd110f6 | |||
17406e413d | |||
9b219f67b0 | |||
669861d7a8 | |||
d7acd7aef6 | |||
860b8efd7d | |||
6ca3fc5ad2 | |||
189af07ff5 | |||
caf7426f86 | |||
bb752863fa | |||
cf5603c8e3 | |||
77b1011207 | |||
5cadb9e2f9 |
@@ -48,10 +48,12 @@ CLIENT_ID_HEROKU=
CLIENT_ID_VERCEL=
CLIENT_ID_NETLIFY=
CLIENT_ID_GITHUB=
CLIENT_ID_GITLAB=
CLIENT_SECRET_HEROKU=
CLIENT_SECRET_VERCEL=
CLIENT_SECRET_NETLIFY=
CLIENT_SECRET_GITHUB=
CLIENT_SECRET_GITLAB=
CLIENT_SLUG_VERCEL=

# Sentry (optional) for monitoring errors
.github/pull_request_template.md (vendored, new file, 22 lines)

@@ -0,0 +1,22 @@
# Description 📣

*Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.*

## Type ✨

- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation

# Tests 🛠️

*Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration. You may want to add screenshots when relevant and possible*

```sh
# Here's some code block to paste some code snippets
```

---

- [ ] I have read the [contributing guide](https://infisical.com/docs/contributing/overview), agreed and acknowledged the [code of conduct](https://infisical.com/docs/contributing/code-of-conduct). 📝
.github/values.yaml (vendored, 86 lines changed)

@@ -1,11 +1,5 @@
#####
# INFISICAL K8 DEFAULT VALUES FILE
# PLEASE REPLACE VALUES/EDIT AS REQUIRED
#####

nameOverride: ""

frontend:
enabled: true
name: frontend
podAnnotations: {}
deploymentAnnotations:
@@ -13,17 +7,18 @@ frontend:
replicaCount: 2
image:
repository: infisical/frontend
pullPolicy: Always
tag: "latest"
pullPolicy: Always
kubeSecretRef: managed-secret-frontend
service:
# type of the frontend service
type: ClusterIP
# define the nodePort if service type is NodePort
# nodePort:
annotations: {}
type: ClusterIP
nodePort: ""

frontendEnvironmentVariables: null

backend:
enabled: true
name: backend
podAnnotations: {}
deploymentAnnotations:
@@ -31,63 +26,46 @@ backend:
replicaCount: 2
image:
repository: infisical/backend
pullPolicy: Always
tag: "latest"
pullPolicy: Always
kubeSecretRef: managed-backend-secret
service:
annotations: {}
type: ClusterIP
nodePort: ""

backendEnvironmentVariables: null

## Mongo DB persistence
mongodb:
name: mongodb
podAnnotations: {}
image:
repository: mongo
pullPolicy: IfNotPresent
tag: "latest"
service:
annotations: {}
enabled: true
persistence:
enabled: false

# By default the backend will be connected to a Mongo instance in the cluster.
# However, it is recommended to add a managed document DB connection string because the DB instance in the cluster does not have persistence yet ( data will be deleted on next deploy).
# Learn about connection string type here https://www.mongodb.com/docs/manual/reference/connection-string/
mongodbConnection: {}
# externalMongoDBConnectionString: <>
## By default the backend will be connected to a Mongo instance within the cluster
## However, it is recommended to add a managed document DB connection string for production-use (DBaaS)
## Learn about connection string type here https://www.mongodb.com/docs/manual/reference/connection-string/
## e.g. "mongodb://<user>:<pass>@<host>:<port>/<database-name>"
mongodbConnection:
externalMongoDBConnectionString: ""

ingress:
enabled: true
annotations:
kubernetes.io/ingress.class: "nginx"
hostName: gamma.infisical.com # replace with your domain
frontend:
# cert-manager.io/issuer: letsencrypt-nginx
hostName: gamma.infisical.com ## <- Replace with your own domain
frontend:
path: /
pathType: Prefix
backend:
path: /api
pathType: Prefix
tls: []
tls:
[]
# - secretName: letsencrypt-nginx
# hosts:
# - infisical.local

## Complete Ingress example
# ingress:
# enabled: true
# annotations:
# kubernetes.io/ingress.class: "nginx"
# cert-manager.io/issuer: letsencrypt-nginx
# hostName: k8.infisical.com
# frontend:
# path: /
# pathType: Prefix
# backend:
# path: /api
# pathType: Prefix
# tls:
# - secretName: letsencrypt-nginx
# hosts:
# - k8.infisical.com

###
### YOU MUST FILL IN ALL SECRETS BELOW
###
backendEnvironmentVariables: {}

frontendEnvironmentVariables: {}
mailhog:
enabled: false
@@ -1,18 +1,27 @@
FROM node:16-bullseye-slim
# Build stage
FROM node:16-alpine AS build

WORKDIR /app

COPY package.json package-lock.json ./

# RUN npm ci --only-production --ignore-scripts
# "prepare": "cd .. && npm install"

COPY package*.json ./
RUN npm ci --only-production

COPY . .
RUN npm run build

# Production stage
FROM node:16-alpine

WORKDIR /app

COPY package*.json ./
RUN npm ci --only-production

COPY --from=build /app .

HEALTHCHECK --interval=10s --timeout=3s --start-period=10s \
CMD node healthcheck.js

EXPOSE 4000

CMD ["npm", "run", "start"]
CMD ["npm", "run", "start"]
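The HEALTHCHECK instruction above runs `node healthcheck.js`, but that script is not part of this diff. A minimal sketch of what such a probe typically looks like, assuming the backend answers on port 4000 (the exposed port) at a hypothetical `/healthcheck` path:

```ts
// Hypothetical healthcheck script: exit 0 only when the backend responds with HTTP 200.
import http from 'http';

const options = { host: 'localhost', port: 4000, path: '/healthcheck', timeout: 2000 };

const request = http.request(options, (res) => {
  process.exit(res.statusCode === 200 ? 0 : 1);
});

request.on('error', () => process.exit(1));   // unreachable server counts as unhealthy
request.on('timeout', () => { request.destroy(); process.exit(1); });
request.end();
```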
backend/environment.d.ts (vendored, 2 lines changed)

@@ -22,10 +22,12 @@ declare global {
CLIENT_ID_VERCEL: string;
CLIENT_ID_NETLIFY: string;
CLIENT_ID_GITHUB: string;
CLIENT_ID_GITLAB: string;
CLIENT_SECRET_HEROKU: string;
CLIENT_SECRET_VERCEL: string;
CLIENT_SECRET_NETLIFY: string;
CLIENT_SECRET_GITHUB: string;
CLIENT_SECRET_GITLAB: string;
CLIENT_SLUG_VERCEL: string;
POSTHOG_HOST: string;
POSTHOG_PROJECT_API_KEY: string;
backend/package-lock.json (generated, 9634 lines changed)
File diff suppressed because it is too large.

@@ -10,6 +10,7 @@
"await-to-js": "^3.0.0",
"aws-sdk": "^2.1311.0",
"axios": "^1.1.3",
"axios-retry": "^3.4.0",
"bcrypt": "^5.1.0",
"bigint-conversion": "^2.2.2",
"builder-pattern": "^2.2.0",
@@ -47,7 +48,7 @@
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "npm run build && node build/index.js",
"start": "node build/index.js",
"dev": "nodemon",
"swagger-autogen": "node ./swagger/index.ts",
"build": "rimraf ./build && tsc && cp -R ./src/templates ./build",
@@ -86,7 +87,7 @@
"@types/supertest": "^2.0.12",
"@types/swagger-jsdoc": "^6.0.1",
"@types/swagger-ui-express": "^4.1.3",
"@typescript-eslint/eslint-plugin": "^5.40.1",
"@typescript-eslint/eslint-plugin": "^5.54.0",
"@typescript-eslint/parser": "^5.40.1",
"cross-env": "^7.0.3",
"eslint": "^8.26.0",
@@ -1,7 +1,7 @@
// eslint-disable-next-line @typescript-eslint/no-var-requires
const { patchRouterParam } = require('./utils/patchAsyncRoutes');

import express, { Request, Response } from 'express';
import express from 'express';
import helmet from 'helmet';
import cors from 'cors';
import cookieParser from 'cookie-parser';
@@ -42,6 +42,8 @@ import {
integrationAuth as v1IntegrationAuthRouter
} from './routes/v1';
import {
signup as v2SignupRouter,
auth as v2AuthRouter,
users as v2UsersRouter,
organizations as v2OrganizationsRouter,
workspace as v2WorkspaceRouter,
@@ -110,6 +112,8 @@ app.use('/api/v1/integration', v1IntegrationRouter);
app.use('/api/v1/integration-auth', v1IntegrationAuthRouter);

// v2 routes
app.use('/api/v2/signup', v2SignupRouter);
app.use('/api/v2/auth', v2AuthRouter);
app.use('/api/v2/users', v2UsersRouter);
app.use('/api/v2/organizations', v2OrganizationsRouter);
app.use('/api/v2/workspace', v2EnvironmentRouter);
@@ -5,6 +5,8 @@ const ENCRYPTION_KEY = process.env.ENCRYPTION_KEY!;
const SALT_ROUNDS = parseInt(process.env.SALT_ROUNDS!) || 10;
const JWT_AUTH_LIFETIME = process.env.JWT_AUTH_LIFETIME! || '10d';
const JWT_AUTH_SECRET = process.env.JWT_AUTH_SECRET!;
const JWT_MFA_LIFETIME = process.env.JWT_MFA_LIFETIME! || '5m';
const JWT_MFA_SECRET = process.env.JWT_MFA_SECRET!;
const JWT_REFRESH_LIFETIME = process.env.JWT_REFRESH_LIFETIME! || '90d';
const JWT_REFRESH_SECRET = process.env.JWT_REFRESH_SECRET!;
const JWT_SERVICE_SECRET = process.env.JWT_SERVICE_SECRET!;
@@ -20,11 +22,13 @@ const CLIENT_ID_HEROKU = process.env.CLIENT_ID_HEROKU!;
const CLIENT_ID_VERCEL = process.env.CLIENT_ID_VERCEL!;
const CLIENT_ID_NETLIFY = process.env.CLIENT_ID_NETLIFY!;
const CLIENT_ID_GITHUB = process.env.CLIENT_ID_GITHUB!;
const CLIENT_ID_GITLAB = process.env.CLIENT_ID_GITLAB!;
const CLIENT_SECRET_AZURE = process.env.CLIENT_SECRET_AZURE!;
const CLIENT_SECRET_HEROKU = process.env.CLIENT_SECRET_HEROKU!;
const CLIENT_SECRET_VERCEL = process.env.CLIENT_SECRET_VERCEL!;
const CLIENT_SECRET_NETLIFY = process.env.CLIENT_SECRET_NETLIFY!;
const CLIENT_SECRET_GITHUB = process.env.CLIENT_SECRET_GITHUB!;
const CLIENT_SECRET_GITLAB = process.env.CLIENT_SECRET_GITLAB;
const CLIENT_SLUG_VERCEL = process.env.CLIENT_SLUG_VERCEL!;
const POSTHOG_HOST = process.env.POSTHOG_HOST! || 'https://app.posthog.com';
const POSTHOG_PROJECT_API_KEY =
@@ -56,6 +60,8 @@ export {
SALT_ROUNDS,
JWT_AUTH_LIFETIME,
JWT_AUTH_SECRET,
JWT_MFA_LIFETIME,
JWT_MFA_SECRET,
JWT_REFRESH_LIFETIME,
JWT_REFRESH_SECRET,
JWT_SERVICE_SECRET,
@@ -71,11 +77,13 @@ export {
CLIENT_ID_VERCEL,
CLIENT_ID_NETLIFY,
CLIENT_ID_GITHUB,
CLIENT_ID_GITLAB,
CLIENT_SECRET_AZURE,
CLIENT_SECRET_HEROKU,
CLIENT_SECRET_VERCEL,
CLIENT_SECRET_NETLIFY,
CLIENT_SECRET_GITHUB,
CLIENT_SECRET_GITLAB,
CLIENT_SLUG_VERCEL,
POSTHOG_HOST,
POSTHOG_PROJECT_API_KEY,
backend/src/config/request.ts (new file, 16 lines)

@@ -0,0 +1,16 @@
import axios from 'axios';
import axiosRetry from 'axios-retry';

const axiosInstance = axios.create();

// add retry functionality to the axios instance
axiosRetry(axiosInstance, {
  retries: 3,
  retryDelay: axiosRetry.exponentialDelay, // exponential back-off delay between retries
  retryCondition: (error) => {
    // only retry if the error is a network error or a 5xx server error
    return axiosRetry.isNetworkError(error) || axiosRetry.isRetryableError(error);
  },
});

export default axiosInstance;
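A usage sketch for the shared instance above. The Loops URL and payload mirror the call made in the v2 signup controller later in this changeset; the relative import path is an assumption that depends on the caller's location.

```ts
import request from './config/request'; // assumed relative path; adjust to the caller's location

const notifySignup = async (email: string, firstName: string, lastName: string) => {
  // failures are retried up to 3 times with exponential back-off before this rejects
  await request.post('https://app.loops.so/api/v1/events/send', {
    email,
    eventName: 'Sign Up',
    firstName,
    lastName
  }, {
    headers: {
      Accept: 'application/json',
      Authorization: 'Bearer ' + process.env.LOOPS_API_KEY
    }
  });
};
```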
@@ -5,7 +5,8 @@ import * as Sentry from '@sentry/node';
import * as bigintConversion from 'bigint-conversion';
const jsrp = require('jsrp');
import { User, LoginSRPDetail } from '../../models';
import { createToken, issueTokens, clearTokens } from '../../helpers/auth';
import { createToken, issueAuthTokens, clearTokens } from '../../helpers/auth';
import { checkUserDevice } from '../../helpers/user';
import {
ACTION_LOGIN,
ACTION_LOGOUT
@@ -111,7 +112,14 @@ export const login2 = async (req: Request, res: Response) => {
// compare server and client shared keys
if (server.checkClientProof(clientProof)) {
// issue tokens
const tokens = await issueTokens({ userId: user._id.toString() });

await checkUserDevice({
user,
ip: req.ip,
userAgent: req.headers['user-agent'] ?? ''
});

const tokens = await issueAuthTokens({ userId: user._id.toString() });

// store (refresh) token in httpOnly cookie
res.cookie('jid', tokens.refreshToken, {
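The diff swaps `issueTokens` for `issueAuthTokens`. Its declaration is not shown here, but the call sites (`tokens.token`, `tokens.refreshToken`, and the `jid` cookie) imply a shape along these lines; treat it as a sketch rather than the actual helper:

```ts
// Sketch inferred from the call sites in this diff; the real declaration in
// helpers/auth is not shown here.
interface AuthTokens {
  token: string;        // short-lived JWT returned in the response body
  refreshToken: string; // long-lived token stored in the httpOnly 'jid' cookie
}

declare function issueAuthTokens(args: { userId: string }): Promise<AuthTokens>;
```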
@@ -1,14 +1,13 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
import { SITE_URL, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, EMAIL_TOKEN_LIFETIME } from '../../config';
import { MembershipOrg, Organization, User, Token } from '../../models';
import { SITE_URL, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET } from '../../config';
import { MembershipOrg, Organization, User } from '../../models';
import { deleteMembershipOrg as deleteMemberFromOrg } from '../../helpers/membershipOrg';
import { checkEmailVerification } from '../../helpers/signup';
import { createToken } from '../../helpers/auth';
import { updateSubscriptionOrgQuantity } from '../../helpers/organization';
import { sendMail } from '../../helpers/nodemailer';
import { OWNER, ADMIN, MEMBER, ACCEPTED, INVITED } from '../../variables';
import { TokenService } from '../../services';
import { OWNER, ADMIN, MEMBER, ACCEPTED, INVITED, TOKEN_EMAIL_ORG_INVITATION } from '../../variables';

/**
* Delete organization membership with id [membershipOrgId] from organization
@@ -163,18 +162,11 @@ export const inviteUserToOrganization = async (req: Request, res: Response) => {
const organization = await Organization.findOne({ _id: organizationId });

if (organization) {
const token = crypto.randomBytes(16).toString('hex');

await Token.findOneAndUpdate(
{ email: inviteeEmail },
{
email: inviteeEmail,
token,
createdAt: new Date(),
ttl: Math.floor(+new Date() / 1000) + EMAIL_TOKEN_LIFETIME // time in seconds, i.e unix
},
{ upsert: true, new: true }
);
const token = await TokenService.createToken({
type: TOKEN_EMAIL_ORG_INVITATION,
email: inviteeEmail,
organizationId: organization._id
});

await sendMail({
template: 'organizationInvitation.handlebars',
@@ -226,10 +218,12 @@ export const verifyUserToOrganization = async (req: Request, res: Response) => {

if (!membershipOrg)
throw new Error('Failed to find any invitations for email');

await checkEmailVerification({

await TokenService.validateToken({
type: TOKEN_EMAIL_ORG_INVITATION,
email,
code
organizationId: membershipOrg.organization,
token: code
});

if (user && user?.publicKey) {
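The invitation flow now delegates token handling to `TokenService`. Its real definition lives in `backend/src/services` and is not part of this diff; the surface implied by the call sites here and in the password-reset controller looks roughly like this:

```ts
// Surface implied by the call sites in this changeset; the actual service in
// backend/src/services may differ.
interface CreateTokenArgs {
  type: string;             // e.g. TOKEN_EMAIL_ORG_INVITATION, TOKEN_EMAIL_PASSWORD_RESET, TOKEN_EMAIL_MFA
  email: string;
  organizationId?: unknown; // supplied for organization invitations
}

declare const TokenService: {
  // resolves to the generated code that gets emailed to the user
  createToken(args: CreateTokenArgs): Promise<string>;
  // rejects when the submitted code does not match or has expired
  validateToken(args: CreateTokenArgs & { token: string }): Promise<void>;
};
```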
@@ -1,14 +1,14 @@
import { Request, Response } from 'express';
import * as Sentry from '@sentry/node';
import crypto from 'crypto';
// eslint-disable-next-line @typescript-eslint/no-var-requires
const jsrp = require('jsrp');
import * as bigintConversion from 'bigint-conversion';
import { User, Token, BackupPrivateKey, LoginSRPDetail } from '../../models';
import { checkEmailVerification } from '../../helpers/signup';
import { User, BackupPrivateKey, LoginSRPDetail } from '../../models';
import { createToken } from '../../helpers/auth';
import { sendMail } from '../../helpers/nodemailer';
import { EMAIL_TOKEN_LIFETIME, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, SITE_URL } from '../../config';
import { TokenService } from '../../services';
import { JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, SITE_URL } from '../../config';
import { TOKEN_EMAIL_PASSWORD_RESET } from '../../variables';
import { BadRequestError } from '../../utils/errors';

/**
@@ -31,20 +31,12 @@ export const emailPasswordReset = async (req: Request, res: Response) => {
error: 'Failed to send email verification for password reset'
});
}

const token = crypto.randomBytes(16).toString('hex');

await Token.findOneAndUpdate(
{ email },
{
email,
token,
createdAt: new Date(),
ttl: Math.floor(+new Date() / 1000) + EMAIL_TOKEN_LIFETIME // time in seconds, i.e unix
},
{ upsert: true, new: true }
);

const token = await TokenService.createToken({
type: TOKEN_EMAIL_PASSWORD_RESET,
email
});

await sendMail({
template: 'passwordReset.handlebars',
subjectLine: 'Infisical password reset',
@@ -55,7 +47,6 @@ export const emailPasswordReset = async (req: Request, res: Response) => {
callback_url: SITE_URL + '/password-reset'
}
});

} catch (err) {
Sentry.setUser(null);
Sentry.captureException(err);
@@ -88,10 +79,11 @@ export const emailPasswordResetVerify = async (req: Request, res: Response) => {
error: 'Failed email verification for password reset'
});
}

await checkEmailVerification({

await TokenService.validateToken({
type: TOKEN_EMAIL_PASSWORD_RESET,
email,
code
token: code
});

// generate temporary password-reset token
@@ -174,8 +166,18 @@ export const srp1 = async (req: Request, res: Response) => {
*/
export const changePassword = async (req: Request, res: Response) => {
try {
const { clientProof, encryptedPrivateKey, iv, tag, salt, verifier } =
req.body;
const {
clientProof,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier
} = req.body;

const user = await User.findOne({
email: req.user.email
}).select('+salt +verifier');
@@ -205,9 +207,13 @@ export const changePassword = async (req: Request, res: Response) => {
await User.findByIdAndUpdate(
req.user._id.toString(),
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
@@ -341,9 +347,12 @@ export const getBackupPrivateKey = async (req: Request, res: Response) => {
export const resetPassword = async (req: Request, res: Response) => {
try {
const {
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
encryptedPrivateKeyIV,
encryptedPrivateKeyTag,
salt,
verifier,
} = req.body;
@@ -351,9 +360,13 @@ export const resetPassword = async (req: Request, res: Response) => {
await User.findByIdAndUpdate(
req.user._id.toString(),
{
encryptionVersion: 2,
protectedKey,
protectedKeyIV,
protectedKeyTag,
encryptedPrivateKey,
iv,
tag,
iv: encryptedPrivateKeyIV,
tag: encryptedPrivateKeyTag,
salt,
verifier
},
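With encryption version 2, `changePassword` and `resetPassword` expect the protected-key fields shown in the destructuring above. An illustrative request body, with placeholder values (the route path is not shown in this diff):

```ts
// Placeholder values only; field names follow the destructuring in changePassword above.
const changePasswordBody = {
  clientProof: '<srp-client-proof>',
  protectedKey: '<ciphertext>',
  protectedKeyIV: '<iv>',
  protectedKeyTag: '<auth-tag>',
  encryptedPrivateKey: '<ciphertext>',
  encryptedPrivateKeyIV: '<iv>',        // persisted as `iv` on the user document
  encryptedPrivateKeyTag: '<auth-tag>', // persisted as `tag` on the user document
  salt: '<srp-salt>',
  verifier: '<srp-verifier>'
};
```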
@ -1,16 +1,12 @@
|
||||
import { Request, Response } from 'express';
|
||||
import * as Sentry from '@sentry/node';
|
||||
import { NODE_ENV, JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, INVITE_ONLY_SIGNUP } from '../../config';
|
||||
import { User, MembershipOrg } from '../../models';
|
||||
import { completeAccount } from '../../helpers/user';
|
||||
import { User } from '../../models';
|
||||
import { JWT_SIGNUP_LIFETIME, JWT_SIGNUP_SECRET, INVITE_ONLY_SIGNUP } from '../../config';
|
||||
import {
|
||||
sendEmailVerification,
|
||||
checkEmailVerification,
|
||||
initializeDefaultOrg
|
||||
} from '../../helpers/signup';
|
||||
import { issueTokens, createToken } from '../../helpers/auth';
|
||||
import { INVITED, ACCEPTED } from '../../variables';
|
||||
import axios from 'axios';
|
||||
import { createToken } from '../../helpers/auth';
|
||||
import { BadRequestError } from '../../utils/errors';
|
||||
|
||||
/**
|
||||
@ -112,201 +108,3 @@ export const verifyEmailSignup = async (req: Request, res: Response) => {
|
||||
token
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* Complete setting up user by adding their personal and auth information as part of the
|
||||
* signup flow
|
||||
* @param req
|
||||
* @param res
|
||||
* @returns
|
||||
*/
|
||||
export const completeAccountSignup = async (req: Request, res: Response) => {
|
||||
let user, token, refreshToken;
|
||||
try {
|
||||
const {
|
||||
email,
|
||||
firstName,
|
||||
lastName,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
iv,
|
||||
tag,
|
||||
salt,
|
||||
verifier,
|
||||
organizationName
|
||||
} = req.body;
|
||||
|
||||
// get user
|
||||
user = await User.findOne({ email });
|
||||
|
||||
if (!user || (user && user?.publicKey)) {
|
||||
// case 1: user doesn't exist.
|
||||
// case 2: user has already completed account
|
||||
return res.status(403).send({
|
||||
error: 'Failed to complete account for complete user'
|
||||
});
|
||||
}
|
||||
|
||||
// complete setting up user's account
|
||||
user = await completeAccount({
|
||||
userId: user._id.toString(),
|
||||
firstName,
|
||||
lastName,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
iv,
|
||||
tag,
|
||||
salt,
|
||||
verifier
|
||||
});
|
||||
|
||||
if (!user)
|
||||
throw new Error('Failed to complete account for non-existent user'); // ensure user is non-null
|
||||
|
||||
// initialize default organization and workspace
|
||||
await initializeDefaultOrg({
|
||||
organizationName,
|
||||
user
|
||||
});
|
||||
|
||||
// update organization membership statuses that are
|
||||
// invited to completed with user attached
|
||||
await MembershipOrg.updateMany(
|
||||
{
|
||||
inviteEmail: email,
|
||||
status: INVITED
|
||||
},
|
||||
{
|
||||
user,
|
||||
status: ACCEPTED
|
||||
}
|
||||
);
|
||||
|
||||
// issue tokens
|
||||
const tokens = await issueTokens({
|
||||
userId: user._id.toString()
|
||||
});
|
||||
|
||||
token = tokens.token;
|
||||
refreshToken = tokens.refreshToken;
|
||||
|
||||
// sending a welcome email to new users
|
||||
if (process.env.LOOPS_API_KEY) {
|
||||
await axios.post("https://app.loops.so/api/v1/events/send", {
|
||||
"email": email,
|
||||
"eventName": "Sign Up",
|
||||
"firstName": firstName,
|
||||
"lastName": lastName
|
||||
}, {
|
||||
headers: {
|
||||
"Accept": "application/json",
|
||||
"Authorization": "Bearer " + process.env.LOOPS_API_KEY
|
||||
},
|
||||
});
|
||||
}
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
return res.status(400).send({
|
||||
message: 'Failed to complete account setup'
|
||||
});
|
||||
}
|
||||
|
||||
return res.status(200).send({
|
||||
message: 'Successfully set up account',
|
||||
user,
|
||||
token,
|
||||
refreshToken
|
||||
});
|
||||
};
|
||||
/**
|
||||
* Complete setting up user by adding their personal and auth information as part of the
|
||||
* invite flow
|
||||
* @param req
|
||||
* @param res
|
||||
* @returns
|
||||
*/
|
||||
export const completeAccountInvite = async (req: Request, res: Response) => {
|
||||
let user, token, refreshToken;
|
||||
try {
|
||||
const {
|
||||
email,
|
||||
firstName,
|
||||
lastName,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
iv,
|
||||
tag,
|
||||
salt,
|
||||
verifier
|
||||
} = req.body;
|
||||
|
||||
// get user
|
||||
user = await User.findOne({ email });
|
||||
|
||||
if (!user || (user && user?.publicKey)) {
|
||||
// case 1: user doesn't exist.
|
||||
// case 2: user has already completed account
|
||||
return res.status(403).send({
|
||||
error: 'Failed to complete account for complete user'
|
||||
});
|
||||
}
|
||||
|
||||
const membershipOrg = await MembershipOrg.findOne({
|
||||
inviteEmail: email,
|
||||
status: INVITED
|
||||
});
|
||||
|
||||
if (!membershipOrg) throw new Error('Failed to find invitations for email');
|
||||
|
||||
// complete setting up user's account
|
||||
user = await completeAccount({
|
||||
userId: user._id.toString(),
|
||||
firstName,
|
||||
lastName,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
iv,
|
||||
tag,
|
||||
salt,
|
||||
verifier
|
||||
});
|
||||
|
||||
if (!user)
|
||||
throw new Error('Failed to complete account for non-existent user');
|
||||
|
||||
// update organization membership statuses that are
|
||||
// invited to completed with user attached
|
||||
await MembershipOrg.updateMany(
|
||||
{
|
||||
inviteEmail: email,
|
||||
status: INVITED
|
||||
},
|
||||
{
|
||||
user,
|
||||
status: ACCEPTED
|
||||
}
|
||||
);
|
||||
|
||||
// issue tokens
|
||||
const tokens = await issueTokens({
|
||||
userId: user._id.toString()
|
||||
});
|
||||
|
||||
token = tokens.token;
|
||||
refreshToken = tokens.refreshToken;
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
return res.status(400).send({
|
||||
message: 'Failed to complete account setup'
|
||||
});
|
||||
}
|
||||
|
||||
return res.status(200).send({
|
||||
message: 'Successfully set up account',
|
||||
user,
|
||||
token,
|
||||
refreshToken
|
||||
});
|
||||
};
|
||||
|
backend/src/controllers/v2/authController.ts (new file, 351 lines)
@ -0,0 +1,351 @@
|
||||
/* eslint-disable @typescript-eslint/no-var-requires */
|
||||
import { Request, Response } from 'express';
|
||||
import jwt from 'jsonwebtoken';
|
||||
import * as Sentry from '@sentry/node';
|
||||
import * as bigintConversion from 'bigint-conversion';
|
||||
const jsrp = require('jsrp');
|
||||
import { User, LoginSRPDetail } from '../../models';
|
||||
import { issueAuthTokens, createToken } from '../../helpers/auth';
|
||||
import { checkUserDevice } from '../../helpers/user';
|
||||
import { sendMail } from '../../helpers/nodemailer';
|
||||
import { TokenService } from '../../services';
|
||||
import { EELogService } from '../../ee/services';
|
||||
import {
|
||||
NODE_ENV,
|
||||
JWT_MFA_LIFETIME,
|
||||
JWT_MFA_SECRET
|
||||
} from '../../config';
|
||||
import { BadRequestError, InternalServerError } from '../../utils/errors';
|
||||
import {
|
||||
TOKEN_EMAIL_MFA,
|
||||
ACTION_LOGIN
|
||||
} from '../../variables';
|
||||
import { getChannelFromUserAgent } from '../../utils/posthog'; // TODO: move this
|
||||
|
||||
declare module 'jsonwebtoken' {
|
||||
export interface UserIDJwtPayload extends jwt.JwtPayload {
|
||||
userId: string;
|
||||
}
|
||||
}
|
||||
|
||||
const clientPublicKeys: any = {};
|
||||
|
||||
/**
|
||||
* Log in user step 1: Return [salt] and [serverPublicKey] as part of step 1 of SRP protocol
|
||||
* @param req
|
||||
* @param res
|
||||
* @returns
|
||||
*/
|
||||
export const login1 = async (req: Request, res: Response) => {
|
||||
try {
|
||||
const {
|
||||
email,
|
||||
clientPublicKey
|
||||
}: { email: string; clientPublicKey: string } = req.body;
|
||||
|
||||
const user = await User.findOne({
|
||||
email
|
||||
}).select('+salt +verifier');
|
||||
|
||||
if (!user) throw new Error('Failed to find user');
|
||||
|
||||
const server = new jsrp.server();
|
||||
server.init(
|
||||
{
|
||||
salt: user.salt,
|
||||
verifier: user.verifier
|
||||
},
|
||||
async () => {
|
||||
// generate server-side public key
|
||||
const serverPublicKey = server.getPublicKey();
|
||||
|
||||
await LoginSRPDetail.findOneAndReplace({ email: email }, {
|
||||
email: email,
|
||||
clientPublicKey: clientPublicKey,
|
||||
serverBInt: bigintConversion.bigintToBuf(server.bInt),
|
||||
}, { upsert: true, returnNewDocument: false });
|
||||
|
||||
return res.status(200).send({
|
||||
serverPublicKey,
|
||||
salt: user.salt
|
||||
});
|
||||
}
|
||||
);
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
return res.status(400).send({
|
||||
message: 'Failed to start authentication process'
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Log in user step 2: complete step 2 of SRP protocol and return token and their (encrypted)
|
||||
* private key
|
||||
* @param req
|
||||
* @param res
|
||||
* @returns
|
||||
*/
|
||||
export const login2 = async (req: Request, res: Response) => {
|
||||
try {
|
||||
|
||||
if (!req.headers['user-agent']) throw InternalServerError({ message: 'User-Agent header is required' });
|
||||
|
||||
const { email, clientProof } = req.body;
|
||||
const user = await User.findOne({
|
||||
email
|
||||
}).select('+salt +verifier +encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +publicKey +encryptedPrivateKey +iv +tag');
|
||||
|
||||
if (!user) throw new Error('Failed to find user');
|
||||
|
||||
const loginSRPDetail = await LoginSRPDetail.findOneAndDelete({ email: email })
|
||||
|
||||
if (!loginSRPDetail) {
|
||||
return BadRequestError(Error("Failed to find login details for SRP"))
|
||||
}
|
||||
|
||||
const server = new jsrp.server();
|
||||
server.init(
|
||||
{
|
||||
salt: user.salt,
|
||||
verifier: user.verifier,
|
||||
b: loginSRPDetail.serverBInt
|
||||
},
|
||||
async () => {
|
||||
server.setClientPublicKey(loginSRPDetail.clientPublicKey);
|
||||
|
||||
// compare server and client shared keys
|
||||
if (server.checkClientProof(clientProof)) {
|
||||
|
||||
if (user.isMfaEnabled) {
|
||||
// case: user has MFA enabled
|
||||
|
||||
// generate temporary MFA token
|
||||
const token = createToken({
|
||||
payload: {
|
||||
userId: user._id.toString()
|
||||
},
|
||||
expiresIn: JWT_MFA_LIFETIME,
|
||||
secret: JWT_MFA_SECRET
|
||||
});
|
||||
|
||||
const code = await TokenService.createToken({
|
||||
type: TOKEN_EMAIL_MFA,
|
||||
email
|
||||
});
|
||||
|
||||
// send MFA code [code] to [email]
|
||||
await sendMail({
|
||||
template: 'emailMfa.handlebars',
|
||||
subjectLine: 'Infisical MFA code',
|
||||
recipients: [email],
|
||||
substitutions: {
|
||||
code
|
||||
}
|
||||
});
|
||||
|
||||
return res.status(200).send({
|
||||
mfaEnabled: true,
|
||||
token
|
||||
});
|
||||
}
|
||||
|
||||
await checkUserDevice({
|
||||
user,
|
||||
ip: req.ip,
|
||||
userAgent: req.headers['user-agent'] ?? ''
|
||||
});
|
||||
|
||||
// issue tokens
|
||||
const tokens = await issueAuthTokens({ userId: user._id.toString() });
|
||||
|
||||
// store (refresh) token in httpOnly cookie
|
||||
res.cookie('jid', tokens.refreshToken, {
|
||||
httpOnly: true,
|
||||
path: '/',
|
||||
sameSite: 'strict',
|
||||
secure: NODE_ENV === 'production' ? true : false
|
||||
});
|
||||
|
||||
// case: user does not have MFA enabled
|
||||
// return (access) token in response
|
||||
|
||||
interface ResponseData {
|
||||
mfaEnabled: boolean;
|
||||
encryptionVersion: any;
|
||||
protectedKey?: string;
|
||||
protectedKeyIV?: string;
|
||||
protectedKeyTag?: string;
|
||||
token: string;
|
||||
publicKey?: string;
|
||||
encryptedPrivateKey?: string;
|
||||
iv?: string;
|
||||
tag?: string;
|
||||
}
|
||||
|
||||
const response: ResponseData = {
|
||||
mfaEnabled: false,
|
||||
encryptionVersion: user.encryptionVersion,
|
||||
token: tokens.token,
|
||||
publicKey: user.publicKey,
|
||||
encryptedPrivateKey: user.encryptedPrivateKey,
|
||||
iv: user.iv,
|
||||
tag: user.tag
|
||||
}
|
||||
|
||||
if (
|
||||
user?.protectedKey &&
|
||||
user?.protectedKeyIV &&
|
||||
user?.protectedKeyTag
|
||||
) {
|
||||
response.protectedKey = user.protectedKey;
|
||||
response.protectedKeyIV = user.protectedKeyIV
|
||||
response.protectedKeyTag = user.protectedKeyTag;
|
||||
}
|
||||
|
||||
const loginAction = await EELogService.createAction({
|
||||
name: ACTION_LOGIN,
|
||||
userId: user._id
|
||||
});
|
||||
|
||||
loginAction && await EELogService.createLog({
|
||||
userId: user._id,
|
||||
actions: [loginAction],
|
||||
channel: getChannelFromUserAgent(req.headers['user-agent']),
|
||||
ipAddress: req.ip
|
||||
});
|
||||
|
||||
return res.status(200).send(response);
|
||||
}
|
||||
|
||||
return res.status(400).send({
|
||||
message: 'Failed to authenticate. Try again?'
|
||||
});
|
||||
}
|
||||
);
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
return res.status(400).send({
|
||||
message: 'Failed to authenticate. Try again?'
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Send MFA token to email [email]
|
||||
* @param req
|
||||
* @param res
|
||||
*/
|
||||
export const sendMfaToken = async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { email } = req.body;
|
||||
|
||||
const code = await TokenService.createToken({
|
||||
type: TOKEN_EMAIL_MFA,
|
||||
email
|
||||
});
|
||||
|
||||
// send MFA code [code] to [email]
|
||||
await sendMail({
|
||||
template: 'emailMfa.handlebars',
|
||||
subjectLine: 'Infisical MFA code',
|
||||
recipients: [email],
|
||||
substitutions: {
|
||||
code
|
||||
}
|
||||
});
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
return res.status(400).send({
|
||||
message: 'Failed to send MFA code'
|
||||
});
|
||||
}
|
||||
|
||||
return res.status(200).send({
|
||||
message: 'Successfully sent new MFA code'
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Verify MFA token [mfaToken] and issue JWT and refresh tokens if the
|
||||
* MFA token [mfaToken] is valid
|
||||
* @param req
|
||||
* @param res
|
||||
*/
|
||||
export const verifyMfaToken = async (req: Request, res: Response) => {
|
||||
const { email, mfaToken } = req.body;
|
||||
|
||||
await TokenService.validateToken({
|
||||
type: TOKEN_EMAIL_MFA,
|
||||
email,
|
||||
token: mfaToken
|
||||
});
|
||||
|
||||
const user = await User.findOne({
|
||||
email
|
||||
}).select('+salt +verifier +encryptionVersion +protectedKey +protectedKeyIV +protectedKeyTag +publicKey +encryptedPrivateKey +iv +tag');
|
||||
|
||||
if (!user) throw new Error('Failed to find user');
|
||||
|
||||
await checkUserDevice({
|
||||
user,
|
||||
ip: req.ip,
|
||||
userAgent: req.headers['user-agent'] ?? ''
|
||||
});
|
||||
|
||||
// issue tokens
|
||||
const tokens = await issueAuthTokens({ userId: user._id.toString() });
|
||||
|
||||
// store (refresh) token in httpOnly cookie
|
||||
res.cookie('jid', tokens.refreshToken, {
|
||||
httpOnly: true,
|
||||
path: '/',
|
||||
sameSite: 'strict',
|
||||
secure: NODE_ENV === 'production' ? true : false
|
||||
});
|
||||
|
||||
interface VerifyMfaTokenRes {
|
||||
encryptionVersion: number;
|
||||
protectedKey?: string;
|
||||
protectedKeyIV?: string;
|
||||
protectedKeyTag?: string;
|
||||
token: string;
|
||||
publicKey: string;
|
||||
encryptedPrivateKey: string;
|
||||
iv: string;
|
||||
tag: string;
|
||||
}
|
||||
|
||||
const resObj: VerifyMfaTokenRes = {
|
||||
encryptionVersion: user.encryptionVersion,
|
||||
token: tokens.token,
|
||||
publicKey: user.publicKey as string,
|
||||
encryptedPrivateKey: user.encryptedPrivateKey as string,
|
||||
iv: user.iv as string,
|
||||
tag: user.tag as string
|
||||
}
|
||||
|
||||
if (user?.protectedKey && user?.protectedKeyIV && user?.protectedKeyTag) {
|
||||
resObj.protectedKey = user.protectedKey;
|
||||
resObj.protectedKeyIV = user.protectedKeyIV;
|
||||
resObj.protectedKeyTag = user.protectedKeyTag;
|
||||
}
|
||||
|
||||
const loginAction = await EELogService.createAction({
|
||||
name: ACTION_LOGIN,
|
||||
userId: user._id
|
||||
});
|
||||
|
||||
loginAction && await EELogService.createLog({
|
||||
userId: user._id,
|
||||
actions: [loginAction],
|
||||
channel: getChannelFromUserAgent(req.headers['user-agent']),
|
||||
ipAddress: req.ip
|
||||
});
|
||||
|
||||
return res.status(200).send(resObj);
|
||||
}
|
||||
|
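The new v2 auth controller returns `mfaEnabled: true` plus a temporary token when email MFA is on, and expects a follow-up call to `verifyMfaToken` with `{ email, mfaToken }`. A client-side sketch of that branch; the concrete sub-paths under `/api/v2/auth` are assumptions, not taken from this diff:

```ts
import axios from 'axios';

// Illustrative endpoints; the real router paths are defined in routes/v2 and are not in this diff.
async function loginWithMfa(email: string, clientProof: string, promptForCode: () => Promise<string>) {
  const login = await axios.post('/api/v2/auth/login2', { email, clientProof });

  if (login.data.mfaEnabled) {
    // an MFA code was emailed out; submit it together with the email address
    const mfaToken = await promptForCode();
    const verified = await axios.post('/api/v2/auth/mfa/verify', { email, mfaToken });
    return verified.data; // token, publicKey, encryptedPrivateKey, iv, tag, ...
  }

  return login.data; // mfaEnabled: false, token issued directly
}
```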
@@ -1,3 +1,5 @@
import * as authController from './authController';
import * as signupController from './signupController';
import * as usersController from './usersController';
import * as organizationsController from './organizationsController';
import * as workspaceController from './workspaceController';
@@ -9,6 +11,8 @@ import * as environmentController from './environmentController';
import * as tagController from './tagController';

export {
authController,
signupController,
usersController,
organizationsController,
workspaceController,
@ -1,7 +1,8 @@
|
||||
import to from 'await-to-js';
|
||||
import { Types } from 'mongoose';
|
||||
import { Request, Response } from 'express';
|
||||
import { ISecret, Membership, Secret, Workspace } from '../../models';
|
||||
import { ISecret, Secret } from '../../models';
|
||||
import { IAction } from '../../ee/models';
|
||||
import {
|
||||
SECRET_PERSONAL,
|
||||
SECRET_SHARED,
|
||||
@ -20,6 +21,252 @@ import { ABILITY_READ, ABILITY_WRITE } from '../../variables/organization';
|
||||
import { userHasNoAbility, userHasWorkspaceAccess, userHasWriteOnlyAbility } from '../../ee/helpers/checkMembershipPermissions';
|
||||
import Tag from '../../models/tag';
|
||||
import _ from 'lodash';
|
||||
import {
|
||||
BatchSecretRequest,
|
||||
BatchSecret
|
||||
} from '../../types/secret';
|
||||
|
||||
/**
|
||||
* Perform a batch of any specified CUD secret operations
|
||||
* @param req
|
||||
* @param res
|
||||
*/
|
||||
export const batchSecrets = async (req: Request, res: Response) => {
|
||||
const channel = getChannelFromUserAgent(req.headers['user-agent']);
|
||||
const {
|
||||
workspaceId,
|
||||
environment,
|
||||
requests
|
||||
}: {
|
||||
workspaceId: string;
|
||||
environment: string;
|
||||
requests: BatchSecretRequest[];
|
||||
} = req.body;
|
||||
|
||||
const createSecrets: BatchSecret[] = [];
|
||||
const updateSecrets: BatchSecret[] = [];
|
||||
const deleteSecrets: Types.ObjectId[] = [];
|
||||
const actions: IAction[] = [];
|
||||
|
||||
requests.forEach((request) => {
|
||||
switch (request.method) {
|
||||
case 'POST':
|
||||
createSecrets.push({
|
||||
...request.secret,
|
||||
version: 1,
|
||||
user: request.secret.type === SECRET_PERSONAL ? req.user : undefined,
|
||||
environment,
|
||||
workspace: new Types.ObjectId(workspaceId)
|
||||
});
|
||||
break;
|
||||
case 'PATCH':
|
||||
updateSecrets.push({
|
||||
...request.secret,
|
||||
_id: new Types.ObjectId(request.secret._id)
|
||||
});
|
||||
break;
|
||||
case 'DELETE':
|
||||
deleteSecrets.push(new Types.ObjectId(request.secret._id));
|
||||
break;
|
||||
}
|
||||
});
|
||||
|
||||
// handle create secrets
|
||||
let createdSecrets: ISecret[] = [];
|
||||
if (createSecrets.length > 0) {
|
||||
createdSecrets = await Secret.insertMany(createSecrets);
|
||||
// (EE) add secret versions for new secrets
|
||||
await EESecretService.addSecretVersions({
|
||||
secretVersions: createdSecrets.map((n: any) => {
|
||||
return ({
|
||||
...n._doc,
|
||||
_id: new Types.ObjectId(),
|
||||
secret: n._id,
|
||||
isDeleted: false
|
||||
});
|
||||
})
|
||||
});
|
||||
|
||||
const addAction = await EELogService.createAction({
|
||||
name: ACTION_ADD_SECRETS,
|
||||
userId: req.user._id,
|
||||
workspaceId: new Types.ObjectId(workspaceId),
|
||||
secretIds: createdSecrets.map((n) => n._id)
|
||||
}) as IAction;
|
||||
actions.push(addAction);
|
||||
|
||||
if (postHogClient) {
|
||||
postHogClient.capture({
|
||||
event: 'secrets added',
|
||||
distinctId: req.user.email,
|
||||
properties: {
|
||||
numberOfSecrets: createdSecrets.length,
|
||||
environment,
|
||||
workspaceId,
|
||||
channel,
|
||||
userAgent: req.headers?.['user-agent']
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// handle update secrets
|
||||
let updatedSecrets: ISecret[] = [];
|
||||
if (updateSecrets.length > 0 && req.secrets) {
|
||||
// construct object containing all secrets
|
||||
let listedSecretsObj: {
|
||||
[key: string]: {
|
||||
version: number;
|
||||
type: string;
|
||||
}
|
||||
} = {};
|
||||
|
||||
listedSecretsObj = req.secrets.reduce((obj: any, secret: ISecret) => ({
|
||||
...obj,
|
||||
[secret._id.toString()]: secret
|
||||
}), {});
|
||||
|
||||
const updateOperations = updateSecrets.map((u) => ({
|
||||
updateOne: {
|
||||
filter: { _id: new Types.ObjectId(u._id) },
|
||||
update: {
|
||||
$inc: {
|
||||
version: 1
|
||||
},
|
||||
...u,
|
||||
_id: new Types.ObjectId(u._id)
|
||||
}
|
||||
}
|
||||
}));
|
||||
|
||||
await Secret.bulkWrite(updateOperations);
|
||||
|
||||
const secretVersions = updateSecrets.map((u) => ({
|
||||
secret: new Types.ObjectId(u._id),
|
||||
version: listedSecretsObj[u._id.toString()].version,
|
||||
workspace: new Types.ObjectId(workspaceId),
|
||||
type: listedSecretsObj[u._id.toString()].type,
|
||||
environment,
|
||||
isDeleted: false,
|
||||
secretKeyCiphertext: u.secretKeyCiphertext,
|
||||
secretKeyIV: u.secretKeyIV,
|
||||
secretKeyTag: u.secretKeyTag,
|
||||
secretValueCiphertext: u.secretValueCiphertext,
|
||||
secretValueIV: u.secretValueIV,
|
||||
secretValueTag: u.secretValueTag,
|
||||
secretCommentCiphertext: u.secretCommentCiphertext,
|
||||
secretCommentIV: u.secretCommentIV,
|
||||
secretCommentTag: u.secretCommentTag,
|
||||
tags: u.tags
|
||||
}));
|
||||
|
||||
await EESecretService.addSecretVersions({
|
||||
secretVersions
|
||||
});
|
||||
|
||||
updatedSecrets = await Secret.find({
|
||||
_id: {
|
||||
$in: updateSecrets.map((u) => new Types.ObjectId(u._id))
|
||||
}
|
||||
});
|
||||
|
||||
const updateAction = await EELogService.createAction({
|
||||
name: ACTION_UPDATE_SECRETS,
|
||||
userId: req.user._id,
|
||||
workspaceId: new Types.ObjectId(workspaceId),
|
||||
secretIds: updatedSecrets.map((u) => u._id)
|
||||
}) as IAction;
|
||||
actions.push(updateAction);
|
||||
|
||||
if (postHogClient) {
|
||||
postHogClient.capture({
|
||||
event: 'secrets modified',
|
||||
distinctId: req.user.email,
|
||||
properties: {
|
||||
numberOfSecrets: updateSecrets.length,
|
||||
environment,
|
||||
workspaceId,
|
||||
channel,
|
||||
userAgent: req.headers?.['user-agent']
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// handle delete secrets
|
||||
if (deleteSecrets.length > 0) {
|
||||
await Secret.deleteMany({
|
||||
_id: {
|
||||
$in: deleteSecrets
|
||||
}
|
||||
});
|
||||
|
||||
await EESecretService.markDeletedSecretVersions({
|
||||
secretIds: deleteSecrets
|
||||
});
|
||||
|
||||
const deleteAction = await EELogService.createAction({
|
||||
name: ACTION_DELETE_SECRETS,
|
||||
userId: req.user._id,
|
||||
workspaceId: new Types.ObjectId(workspaceId),
|
||||
secretIds: deleteSecrets
|
||||
}) as IAction;
|
||||
actions.push(deleteAction);
|
||||
|
||||
if (postHogClient) {
|
||||
postHogClient.capture({
|
||||
event: 'secrets deleted',
|
||||
distinctId: req.user.email,
|
||||
properties: {
|
||||
numberOfSecrets: deleteSecrets.length,
|
||||
environment,
|
||||
workspaceId,
|
||||
channel: channel,
|
||||
userAgent: req.headers?.['user-agent']
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
if (actions.length > 0) {
|
||||
// (EE) create (audit) log
|
||||
await EELogService.createLog({
|
||||
userId: req.user._id.toString(),
|
||||
workspaceId: new Types.ObjectId(workspaceId),
|
||||
actions,
|
||||
channel,
|
||||
ipAddress: req.ip
|
||||
});
|
||||
}
|
||||
|
||||
// // trigger event - push secrets
|
||||
await EventService.handleEvent({
|
||||
event: eventPushSecrets({
|
||||
workspaceId
|
||||
})
|
||||
});
|
||||
|
||||
// (EE) take a secret snapshot
|
||||
await EESecretService.takeSecretSnapshot({
|
||||
workspaceId
|
||||
});
|
||||
|
||||
const resObj: { [key: string]: ISecret[] | string[] } = {}
|
||||
|
||||
if (createSecrets.length > 0) {
|
||||
resObj['createdSecrets'] = createdSecrets;
|
||||
}
|
||||
|
||||
if (updateSecrets.length > 0) {
|
||||
resObj['updatedSecrets'] = updatedSecrets;
|
||||
}
|
||||
|
||||
if (deleteSecrets.length > 0) {
|
||||
resObj['deletedSecrets'] = deleteSecrets.map((d) => d.toString());
|
||||
}
|
||||
|
||||
return res.status(200).send(resObj);
|
||||
}
|
||||
|
||||
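`batchSecrets` fans one request out into create, update, and delete operations based on `request.method`. An illustrative request body, with placeholder IDs and ciphertext values; the `type` literal and the route path itself are assumptions, not taken from this diff:

```ts
// Placeholder IDs and ciphertext; `type` takes one of the SECRET_SHARED / SECRET_PERSONAL values.
const batchBody = {
  workspaceId: '<workspace-id>',
  environment: 'dev',
  requests: [
    {
      method: 'POST', // create: version is set to 1 server-side
      secret: {
        type: 'shared',
        secretKeyCiphertext: '<ciphertext>',
        secretKeyIV: '<iv>',
        secretKeyTag: '<tag>',
        secretValueCiphertext: '<ciphertext>',
        secretValueIV: '<iv>',
        secretValueTag: '<tag>',
        tags: []
      }
    },
    { method: 'PATCH', secret: { _id: '<secret-id>' /* plus the updated ciphertext fields */ } },
    { method: 'DELETE', secret: { _id: '<secret-id>' } }
  ]
};
```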
/**
|
||||
* Create secret(s) for workspace with id [workspaceId] and environment [environment]
|
||||
@ -111,9 +358,25 @@ export const createSecrets = async (req: Request, res: Response) => {
|
||||
tags: string[]
|
||||
}
|
||||
|
||||
const newlyCreatedSecrets = await Secret.insertMany(
|
||||
listOfSecretsToCreate.map(({
|
||||
const secretsToInsert: ISecret[] = listOfSecretsToCreate.map(({
|
||||
type,
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretCommentCiphertext,
|
||||
secretCommentIV,
|
||||
secretCommentTag,
|
||||
tags
|
||||
}: secretsToCreateType) => {
|
||||
return ({
|
||||
version: 1,
|
||||
workspace: new Types.ObjectId(workspaceId),
|
||||
type,
|
||||
user: type === SECRET_PERSONAL ? req.user : undefined,
|
||||
environment,
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
@ -124,26 +387,10 @@ export const createSecrets = async (req: Request, res: Response) => {
|
||||
secretCommentIV,
|
||||
secretCommentTag,
|
||||
tags
|
||||
}: secretsToCreateType) => {
|
||||
return ({
|
||||
version: 1,
|
||||
workspace: new Types.ObjectId(workspaceId),
|
||||
type,
|
||||
user: type === SECRET_PERSONAL ? req.user : undefined,
|
||||
environment,
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretCommentCiphertext,
|
||||
secretCommentIV,
|
||||
secretCommentTag,
|
||||
tags
|
||||
});
|
||||
})
|
||||
);
|
||||
});
|
||||
})
|
||||
|
||||
const newlyCreatedSecrets: ISecret[] = (await Secret.insertMany(secretsToInsert)).map((insertedSecret) => insertedSecret.toObject());
|
||||
|
||||
setTimeout(async () => {
|
||||
// trigger event - push secrets
|
||||
@ -166,11 +413,9 @@ export const createSecrets = async (req: Request, res: Response) => {
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretKeyHash,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretValueHash,
|
||||
secretCommentCiphertext,
|
||||
secretCommentIV,
|
||||
secretCommentTag,
|
||||
@ -187,11 +432,9 @@ export const createSecrets = async (req: Request, res: Response) => {
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretKeyHash,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretValueHash,
|
||||
secretCommentCiphertext,
|
||||
secretCommentIV,
|
||||
secretCommentTag,
|
||||
|
backend/src/controllers/v2/signupController.ts (new file, 250 lines)
@ -0,0 +1,250 @@
|
||||
import { Request, Response } from 'express';
|
||||
import * as Sentry from '@sentry/node';
|
||||
import { User, MembershipOrg } from '../../models';
|
||||
import { completeAccount } from '../../helpers/user';
|
||||
import {
|
||||
initializeDefaultOrg
|
||||
} from '../../helpers/signup';
|
||||
import { issueAuthTokens } from '../../helpers/auth';
|
||||
import { INVITED, ACCEPTED } from '../../variables';
|
||||
import { NODE_ENV } from '../../config';
|
||||
import request from '../../config/request';
|
||||
|
||||
/**
|
||||
* Complete setting up user by adding their personal and auth information as part of the
|
||||
* signup flow
|
||||
* @param req
|
||||
* @param res
|
||||
* @returns
|
||||
*/
|
||||
export const completeAccountSignup = async (req: Request, res: Response) => {
|
||||
let user, token, refreshToken;
|
||||
try {
|
||||
const {
|
||||
email,
|
||||
firstName,
|
||||
lastName,
|
||||
protectedKey,
|
||||
protectedKeyIV,
|
||||
protectedKeyTag,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
encryptedPrivateKeyIV,
|
||||
encryptedPrivateKeyTag,
|
||||
salt,
|
||||
verifier,
|
||||
organizationName
|
||||
}: {
|
||||
email: string;
|
||||
firstName: string;
|
||||
lastName: string;
|
||||
protectedKey: string;
|
||||
protectedKeyIV: string;
|
||||
protectedKeyTag: string;
|
||||
publicKey: string;
|
||||
encryptedPrivateKey: string;
|
||||
encryptedPrivateKeyIV: string;
|
||||
encryptedPrivateKeyTag: string;
|
||||
salt: string;
|
||||
verifier: string;
|
||||
organizationName: string;
|
||||
} = req.body;
|
||||
|
||||
// get user
|
||||
user = await User.findOne({ email });
|
||||
|
||||
if (!user || (user && user?.publicKey)) {
|
||||
// case 1: user doesn't exist.
|
||||
// case 2: user has already completed account
|
||||
return res.status(403).send({
|
||||
error: 'Failed to complete account for complete user'
|
||||
});
|
||||
}
|
||||
|
||||
// complete setting up user's account
|
||||
user = await completeAccount({
|
||||
userId: user._id.toString(),
|
||||
firstName,
|
||||
lastName,
|
||||
encryptionVersion: 2,
|
||||
protectedKey,
|
||||
protectedKeyIV,
|
||||
protectedKeyTag,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
encryptedPrivateKeyIV,
|
||||
encryptedPrivateKeyTag,
|
||||
salt,
|
||||
verifier
|
||||
});
|
||||
|
||||
if (!user)
|
||||
throw new Error('Failed to complete account for non-existent user'); // ensure user is non-null
|
||||
|
||||
// initialize default organization and workspace
|
||||
await initializeDefaultOrg({
|
||||
organizationName,
|
||||
user
|
||||
});
|
||||
|
||||
// update organization membership statuses that are
|
||||
// invited to completed with user attached
|
||||
await MembershipOrg.updateMany(
|
||||
{
|
||||
inviteEmail: email,
|
||||
status: INVITED
|
||||
},
|
||||
{
|
||||
user,
|
||||
status: ACCEPTED
|
||||
}
|
||||
);
|
||||
|
||||
// issue tokens
|
||||
const tokens = await issueAuthTokens({
|
||||
userId: user._id.toString()
|
||||
});
|
||||
|
||||
token = tokens.token;
|
||||
|
||||
// sending a welcome email to new users
|
||||
if (process.env.LOOPS_API_KEY) {
|
||||
await request.post("https://app.loops.so/api/v1/events/send", {
|
||||
"email": email,
|
||||
"eventName": "Sign Up",
|
||||
"firstName": firstName,
|
||||
"lastName": lastName
|
||||
}, {
|
||||
headers: {
|
||||
"Accept": "application/json",
|
||||
"Authorization": "Bearer " + process.env.LOOPS_API_KEY
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
// store (refresh) token in httpOnly cookie
|
||||
res.cookie('jid', tokens.refreshToken, {
|
||||
httpOnly: true,
|
||||
path: '/',
|
||||
sameSite: 'strict',
|
||||
secure: NODE_ENV === 'production' ? true : false
|
||||
});
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
return res.status(400).send({
|
||||
message: 'Failed to complete account setup'
|
||||
});
|
||||
}
|
||||
|
||||
return res.status(200).send({
|
||||
message: 'Successfully set up account',
|
||||
user,
|
||||
token
|
||||
});
|
||||
};
|
||||
|
||||
/**
 * Complete setting up user by adding their personal and auth information as part of the
 * invite flow
 * @param req
 * @param res
 * @returns
 */
export const completeAccountInvite = async (req: Request, res: Response) => {
  let user, token, refreshToken;
  try {
    const {
      email,
      firstName,
      lastName,
      protectedKey,
      protectedKeyIV,
      protectedKeyTag,
      publicKey,
      encryptedPrivateKey,
      encryptedPrivateKeyIV,
      encryptedPrivateKeyTag,
      salt,
      verifier
    } = req.body;

    // get user
    user = await User.findOne({ email });

    if (!user || (user && user?.publicKey)) {
      // case 1: user doesn't exist.
      // case 2: user has already completed account
      return res.status(403).send({
        error: 'Failed to complete account for complete user'
      });
    }

    const membershipOrg = await MembershipOrg.findOne({
      inviteEmail: email,
      status: INVITED
    });

    if (!membershipOrg) throw new Error('Failed to find invitations for email');

    // complete setting up user's account
    user = await completeAccount({
      userId: user._id.toString(),
      firstName,
      lastName,
      encryptionVersion: 2,
      protectedKey,
      protectedKeyIV,
      protectedKeyTag,
      publicKey,
      encryptedPrivateKey,
      encryptedPrivateKeyIV,
      encryptedPrivateKeyTag,
      salt,
      verifier
    });

    if (!user)
      throw new Error('Failed to complete account for non-existent user');

    // update organization membership statuses that are
    // invited to completed with user attached
    await MembershipOrg.updateMany(
      {
        inviteEmail: email,
        status: INVITED
      },
      {
        user,
        status: ACCEPTED
      }
    );

    // issue tokens
    const tokens = await issueAuthTokens({
      userId: user._id.toString()
    });

    token = tokens.token;

    // store (refresh) token in httpOnly cookie
    res.cookie('jid', tokens.refreshToken, {
      httpOnly: true,
      path: '/',
      sameSite: 'strict',
      secure: NODE_ENV === 'production' ? true : false
    });
  } catch (err) {
    Sentry.setUser(null);
    Sentry.captureException(err);
    return res.status(400).send({
      message: 'Failed to complete account setup'
    });
  }

  return res.status(200).send({
    message: 'Successfully set up account',
    user,
    token
  });
};
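The destructuring at the top of completeAccountInvite implies the payload an invited client must send. A hedged sketch of that request with placeholder values; the route path is an assumption, not taken from this diff:

// Illustrative payload only; real values come from client-side key generation and SRP setup.
const payload = {
  email: 'jane@example.com',
  firstName: 'Jane',
  lastName: 'Doe',
  protectedKey: '<ciphertext>',
  protectedKeyIV: '<iv>',
  protectedKeyTag: '<tag>',
  publicKey: '<public key>',
  encryptedPrivateKey: '<ciphertext>',
  encryptedPrivateKeyIV: '<iv>',
  encryptedPrivateKeyTag: '<tag>',
  salt: '<srp salt>',
  verifier: '<srp verifier>'
};

await fetch('/api/v2/signup/complete-account/invite', { // assumed path
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload)
});
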
@@ -55,6 +55,44 @@ export const getMe = async (req: Request, res: Response) => {
  });
}

/**
 * Update the current user's MFA-enabled status [isMfaEnabled].
 * Note: Infisical currently supports email-based 2FA only; this will expand to
 * include SMS and authenticator app modes of authentication in the future.
 * @param req
 * @param res
 * @returns
 */
export const updateMyMfaEnabled = async (req: Request, res: Response) => {
  let user;
  try {
    const { isMfaEnabled }: { isMfaEnabled: boolean } = req.body;
    req.user.isMfaEnabled = isMfaEnabled;

    if (isMfaEnabled) {
      // TODO: adapt this route/controller
      // to work for different forms of MFA
      req.user.mfaMethods = ['email'];
    } else {
      req.user.mfaMethods = [];
    }

    await req.user.save();

    user = req.user;
  } catch (err) {
    Sentry.setUser({ email: req.user.email });
    Sentry.captureException(err);
    return res.status(400).send({
      message: "Failed to update current user's MFA status"
    });
  }

  return res.status(200).send({
    user
  });
}
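A short sketch of toggling MFA from a client, assuming this controller is mounted behind an authenticated PATCH route; the exact path is an assumption, not taken from this diff:

// Enable email-based MFA for the logged-in user; send { isMfaEnabled: false } to disable.
const token = '<access token>'; // JWT issued by issueAuthTokens
await fetch('/api/v2/users/me/mfa', { // assumed path
  method: 'PATCH',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${token}`
  },
  body: JSON.stringify({ isMfaEnabled: true })
});
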

/**
 * Return organizations that the current user is part of.
 * @param req
@ -158,11 +158,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretKeyHash,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretValueHash
|
||||
} = oldSecretVersion;
|
||||
|
||||
// update secret
|
||||
@ -179,11 +177,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretKeyHash,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretValueHash
|
||||
},
|
||||
{
|
||||
new: true
|
||||
@ -204,11 +200,9 @@ export const rollbackSecretVersion = async (req: Request, res: Response) => {
|
||||
secretKeyCiphertext,
|
||||
secretKeyIV,
|
||||
secretKeyTag,
|
||||
secretKeyHash,
|
||||
secretValueCiphertext,
|
||||
secretValueIV,
|
||||
secretValueTag,
|
||||
secretValueHash
|
||||
secretValueTag
|
||||
}).save();
|
||||
|
||||
// take secret snapshot
|
||||
|
@ -5,22 +5,19 @@ import {
|
||||
} from '../../variables';
|
||||
|
||||
export interface ISecretVersion {
|
||||
_id: Types.ObjectId;
|
||||
secret: Types.ObjectId;
|
||||
version: number;
|
||||
workspace: Types.ObjectId; // new
|
||||
type: string; // new
|
||||
user: Types.ObjectId; // new
|
||||
user?: Types.ObjectId; // new
|
||||
environment: string; // new
|
||||
isDeleted: boolean;
|
||||
secretKeyCiphertext: string;
|
||||
secretKeyIV: string;
|
||||
secretKeyTag: string;
|
||||
secretKeyHash: string;
|
||||
secretValueCiphertext: string;
|
||||
secretValueIV: string;
|
||||
secretValueTag: string;
|
||||
secretValueHash: string;
|
||||
tags?: string[];
|
||||
}
|
||||
|
||||
@ -72,9 +69,6 @@ const secretVersionSchema = new Schema<ISecretVersion>(
|
||||
type: String, // symmetric
|
||||
required: true
|
||||
},
|
||||
secretKeyHash: {
|
||||
type: String
|
||||
},
|
||||
secretValueCiphertext: {
|
||||
type: String,
|
||||
required: true
|
||||
@ -87,9 +81,6 @@ const secretVersionSchema = new Schema<ISecretVersion>(
|
||||
type: String, // symmetric
|
||||
required: true
|
||||
},
|
||||
secretValueHash: {
|
||||
type: String
|
||||
},
|
||||
tags: {
|
||||
ref: 'Tag',
|
||||
type: [Schema.Types.ObjectId],
|
||||
|
@ -211,7 +211,7 @@ const getAuthAPIKeyPayload = async ({
|
||||
* @return {String} obj.token - issued JWT token
|
||||
* @return {String} obj.refreshToken - issued refresh token
|
||||
*/
|
||||
const issueTokens = async ({ userId }: { userId: string }) => {
|
||||
const issueAuthTokens = async ({ userId }: { userId: string }) => {
|
||||
let token: string;
|
||||
let refreshToken: string;
|
||||
try {
|
||||
@ -298,6 +298,6 @@ export {
|
||||
getAuthSTDPayload,
|
||||
getAuthAPIKeyPayload,
|
||||
createToken,
|
||||
issueTokens,
|
||||
issueAuthTokens,
|
||||
clearTokens
|
||||
};
|
||||
|
@ -267,7 +267,7 @@ const v1PushSecrets = async ({
|
||||
|
||||
if (toAdd.length > 0) {
|
||||
// add secrets
|
||||
const newSecrets = await Secret.insertMany(
|
||||
const newSecrets: ISecret[] = (await Secret.insertMany(
|
||||
toAdd.map((s, idx) => {
|
||||
const obj: any = {
|
||||
version: 1,
|
||||
@ -294,7 +294,7 @@ const v1PushSecrets = async ({
|
||||
|
||||
return obj;
|
||||
})
|
||||
);
|
||||
)).map((insertedSecret) => insertedSecret.toObject());
|
||||
|
||||
// (EE) add secret versions for new secrets
|
||||
EESecretService.addSecretVersions({
|
||||
|
@ -1,13 +1,11 @@
|
||||
import * as Sentry from '@sentry/node';
|
||||
import crypto from 'crypto';
|
||||
import { Token, IToken, IUser } from '../models';
|
||||
import { IUser } from '../models';
|
||||
import { createOrganization } from './organization';
|
||||
import { addMembershipsOrg } from './membershipOrg';
|
||||
import { createWorkspace } from './workspace';
|
||||
import { addMemberships } from './membership';
|
||||
import { OWNER, ADMIN, ACCEPTED } from '../variables';
|
||||
import { OWNER, ACCEPTED } from '../variables';
|
||||
import { sendMail } from '../helpers/nodemailer';
|
||||
import { EMAIL_TOKEN_LIFETIME } from '../config';
|
||||
import { TokenService } from '../services';
|
||||
import { TOKEN_EMAIL_CONFIRMATION } from '../variables';
|
||||
|
||||
/**
|
||||
* Send magic link to verify email to [email]
|
||||
@ -15,22 +13,13 @@ import { EMAIL_TOKEN_LIFETIME } from '../config';
|
||||
* @param {Object} obj
|
||||
* @param {String} obj.email - email
|
||||
* @returns {Boolean} success - whether or not operation was successful
|
||||
*
|
||||
*/
|
||||
const sendEmailVerification = async ({ email }: { email: string }) => {
|
||||
try {
|
||||
const token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
|
||||
|
||||
await Token.findOneAndUpdate(
|
||||
{ email },
|
||||
{
|
||||
email,
|
||||
token,
|
||||
createdAt: new Date(),
|
||||
ttl: Math.floor(+new Date() / 1000) + EMAIL_TOKEN_LIFETIME // time in seconds, i.e unix
|
||||
},
|
||||
{ upsert: true, new: true }
|
||||
);
|
||||
const token = await TokenService.createToken({
|
||||
type: TOKEN_EMAIL_CONFIRMATION,
|
||||
email
|
||||
});
|
||||
|
||||
// send mail
|
||||
await sendMail({
|
||||
@ -64,21 +53,11 @@ const checkEmailVerification = async ({
|
||||
code: string;
|
||||
}) => {
|
||||
try {
|
||||
const token = await Token.findOne({
|
||||
await TokenService.validateToken({
|
||||
type: TOKEN_EMAIL_CONFIRMATION,
|
||||
email,
|
||||
token: code
|
||||
});
|
||||
|
||||
if (token && Math.floor(Date.now() / 1000) > token.ttl) {
|
||||
await Token.deleteOne({
|
||||
email,
|
||||
token: code
|
||||
});
|
||||
|
||||
throw new Error('Verification token has expired')
|
||||
}
|
||||
|
||||
if (!token) throw new Error('Failed to find email verification token');
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
@ -114,18 +93,6 @@ const initializeDefaultOrg = async ({
|
||||
roles: [OWNER],
|
||||
statuses: [ACCEPTED]
|
||||
});
|
||||
|
||||
// initialize a default workspace inside the new organization
|
||||
const workspace = await createWorkspace({
|
||||
name: `Example Project`,
|
||||
organizationId: organization._id.toString()
|
||||
});
|
||||
|
||||
await addMemberships({
|
||||
userIds: [user._id.toString()],
|
||||
workspaceId: workspace._id.toString(),
|
||||
roles: [ADMIN]
|
||||
});
|
||||
} catch (err) {
|
||||
throw new Error(`Failed to initialize default organization and workspace [err=${err}]`);
|
||||
}
|
||||
|
backend/src/helpers/token.ts (new file, 217 lines)
@@ -0,0 +1,217 @@
import * as Sentry from '@sentry/node';
import { Types } from 'mongoose';
import { TokenData } from '../models';
import crypto from 'crypto';
import bcrypt from 'bcrypt';
import {
  TOKEN_EMAIL_CONFIRMATION,
  TOKEN_EMAIL_MFA,
  TOKEN_EMAIL_ORG_INVITATION,
  TOKEN_EMAIL_PASSWORD_RESET
} from '../variables';
import {
  SALT_ROUNDS
} from '../config';
import { UnauthorizedRequestError } from '../utils/errors';

/**
 * Create and store a token in the database for purpose [type]
 * @param {Object} obj
 * @param {String} obj.type
 * @param {String} obj.email
 * @param {String} obj.phoneNumber
 * @param {Types.ObjectId} obj.organizationId
 * @returns {String} token - the created token
 */
const createTokenHelper = async ({
  type,
  email,
  phoneNumber,
  organizationId
}: {
  type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
  email?: string;
  phoneNumber?: string;
  organizationId?: Types.ObjectId
}) => {
  let token, expiresAt, triesLeft;
  try {
    // generate random token based on specified token use-case
    // type [type]
    switch (type) {
      case TOKEN_EMAIL_CONFIRMATION:
        // generate random 6-digit code
        token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
        expiresAt = new Date((new Date()).getTime() + 86400000);
        break;
      case TOKEN_EMAIL_MFA:
        // generate random 6-digit code
        token = String(crypto.randomInt(Math.pow(10, 5), Math.pow(10, 6) - 1));
        triesLeft = 5;
        expiresAt = new Date((new Date()).getTime() + 300000);
        break;
      case TOKEN_EMAIL_ORG_INVITATION:
        // generate random hex
        token = crypto.randomBytes(16).toString('hex');
        expiresAt = new Date((new Date()).getTime() + 259200000);
        break;
      case TOKEN_EMAIL_PASSWORD_RESET:
        // generate random hex
        token = crypto.randomBytes(16).toString('hex');
        expiresAt = new Date((new Date()).getTime() + 86400000);
        break;
      default:
        token = crypto.randomBytes(16).toString('hex');
        expiresAt = new Date();
        break;
    }

    interface TokenDataQuery {
      type: string;
      email?: string;
      phoneNumber?: string;
      organization?: Types.ObjectId;
    }

    interface TokenDataUpdate {
      type: string;
      email?: string;
      phoneNumber?: string;
      organization?: Types.ObjectId;
      tokenHash: string;
      triesLeft?: number;
      expiresAt: Date;
    }

    const query: TokenDataQuery = { type };
    const update: TokenDataUpdate = {
      type,
      tokenHash: await bcrypt.hash(token, SALT_ROUNDS),
      expiresAt
    }

    if (email) {
      query.email = email;
      update.email = email;
    }
    if (phoneNumber) {
      query.phoneNumber = phoneNumber;
      update.phoneNumber = phoneNumber;
    }
    if (organizationId) {
      query.organization = organizationId
      update.organization = organizationId
    }

    if (triesLeft) {
      update.triesLeft = triesLeft;
    }

    await TokenData.findOneAndUpdate(
      query,
      update,
      {
        new: true,
        upsert: true
      }
    );
  } catch (err) {
    Sentry.setUser(null);
    Sentry.captureException(err);
    throw new Error(
      "Failed to create token"
    );
  }

  return token;
}

/**
 *
 * @param {Object} obj
 * @param {String} obj.email - email associated with the token
 * @param {String} obj.token - value of the token
 */
const validateTokenHelper = async ({
  type,
  email,
  phoneNumber,
  organizationId,
  token
}: {
  type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
  email?: string;
  phoneNumber?: string;
  organizationId?: Types.ObjectId;
  token: string;
}) => {
  interface Query {
    type: string;
    email?: string;
    phoneNumber?: string;
    organization?: Types.ObjectId;
  }

  const query: Query = { type };

  if (email) { query.email = email; }
  if (phoneNumber) { query.phoneNumber = phoneNumber; }
  if (organizationId) { query.organization = organizationId; }

  const tokenData = await TokenData.findOne(query).select('+tokenHash');

  if (!tokenData) throw new Error('Failed to find token to validate');

  if (tokenData.expiresAt < new Date()) {
    // case: token expired
    await TokenData.findByIdAndDelete(tokenData._id);
    throw UnauthorizedRequestError({
      message: 'MFA session expired. Please log in again',
      context: {
        code: 'mfa_expired'
      }
    });
  }

  const isValid = await bcrypt.compare(token, tokenData.tokenHash);
  if (!isValid) {
    // case: token is not valid
    if (tokenData?.triesLeft !== undefined) {
      // case: token has a try-limit
      if (tokenData.triesLeft === 1) {
        // case: token is out of tries
        await TokenData.findByIdAndDelete(tokenData._id);
      } else {
        // case: token has more than 1 try left
        await TokenData.findByIdAndUpdate(tokenData._id, {
          triesLeft: tokenData.triesLeft - 1
        }, {
          new: true
        });
      }

      throw UnauthorizedRequestError({
        message: 'MFA code is invalid',
        context: {
          code: 'mfa_invalid',
          triesLeft: tokenData.triesLeft - 1
        }
      });
    }

    throw UnauthorizedRequestError({
      message: 'MFA code is invalid',
      context: {
        code: 'mfa_invalid'
      }
    });
  }

  // case: token is valid
  await TokenData.findByIdAndDelete(tokenData._id);
}

export {
  createTokenHelper,
  validateTokenHelper
}
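Together the two helpers form a create-then-validate lifecycle: a random code or hex token is generated, only its bcrypt hash is upserted per type/email, and validation deletes the record on success or decrements triesLeft on a wrong guess. A sketch of the email-confirmation flow calling the helpers directly; import paths assume a caller inside backend/src, and in this diff the helpers are reached through TokenService.createToken / validateToken rather than directly:

import { createTokenHelper, validateTokenHelper } from '../helpers/token';
import { TOKEN_EMAIL_CONFIRMATION } from '../variables';

// 1. generate a 6-digit code; only its bcrypt hash is persisted in TokenData
const code = await createTokenHelper({
  type: TOKEN_EMAIL_CONFIRMATION,
  email: 'jane@example.com'
});
// ... email [code] to the user (e.g. via sendMail) ...

// 2. validate the code the user typed in; throws UnauthorizedRequestError if wrong or expired
const submittedCode = '123456'; // e.g. from req.body
await validateTokenHelper({
  type: TOKEN_EMAIL_CONFIRMATION,
  email: 'jane@example.com',
  token: submittedCode
});
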
|
@ -1,5 +1,6 @@
|
||||
import * as Sentry from '@sentry/node';
|
||||
import { User, IUser } from '../models';
|
||||
import { IUser, User } from '../models';
|
||||
import { sendMail } from './nodemailer';
|
||||
|
||||
/**
|
||||
* Initialize a user under email [email]
|
||||
@ -28,10 +29,14 @@ const setupAccount = async ({ email }: { email: string }) => {
|
||||
* @param {String} obj.userId - id of user to finish setting up
|
||||
* @param {String} obj.firstName - first name of user
|
||||
* @param {String} obj.lastName - last name of user
|
||||
* @param {Number} obj.encryptionVersion - version of auth encryption scheme used
|
||||
* @param {String} obj.protectedKey - protected key in encryption version 2
|
||||
* @param {String} obj.protectedKeyIV - IV of protected key in encryption version 2
|
||||
* @param {String} obj.protectedKeyTag - tag of protected key in encryption version 2
|
||||
* @param {String} obj.publicKey - publickey of user
|
||||
* @param {String} obj.encryptedPrivateKey - (encrypted) private key of user
|
||||
* @param {String} obj.iv - iv for (encrypted) private key of user
|
||||
* @param {String} obj.tag - tag for (encrypted) private key of user
|
||||
* @param {String} obj.encryptedPrivateKeyIV - iv for (encrypted) private key of user
|
||||
* @param {String} obj.encryptedPrivateKeyTag - tag for (encrypted) private key of user
|
||||
* @param {String} obj.salt - salt for auth SRP
|
||||
* @param {String} obj.verifier - verifier for auth SRP
|
||||
* @returns {Object} user - the completed user
|
||||
@ -40,20 +45,28 @@ const completeAccount = async ({
|
||||
userId,
|
||||
firstName,
|
||||
lastName,
|
||||
encryptionVersion,
|
||||
protectedKey,
|
||||
protectedKeyIV,
|
||||
protectedKeyTag,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
iv,
|
||||
tag,
|
||||
encryptedPrivateKeyIV,
|
||||
encryptedPrivateKeyTag,
|
||||
salt,
|
||||
verifier
|
||||
}: {
|
||||
userId: string;
|
||||
firstName: string;
|
||||
lastName: string;
|
||||
encryptionVersion: number;
|
||||
protectedKey: string;
|
||||
protectedKeyIV: string;
|
||||
protectedKeyTag: string;
|
||||
publicKey: string;
|
||||
encryptedPrivateKey: string;
|
||||
iv: string;
|
||||
tag: string;
|
||||
encryptedPrivateKeyIV: string;
|
||||
encryptedPrivateKeyTag: string;
|
||||
salt: string;
|
||||
verifier: string;
|
||||
}) => {
|
||||
@ -67,10 +80,14 @@ const completeAccount = async ({
|
||||
{
|
||||
firstName,
|
||||
lastName,
|
||||
encryptionVersion,
|
||||
protectedKey,
|
||||
protectedKeyIV,
|
||||
protectedKeyTag,
|
||||
publicKey,
|
||||
encryptedPrivateKey,
|
||||
iv,
|
||||
tag,
|
||||
iv: encryptedPrivateKeyIV,
|
||||
tag: encryptedPrivateKeyTag,
|
||||
salt,
|
||||
verifier
|
||||
},
|
||||
@ -85,4 +102,48 @@ const completeAccount = async ({
|
||||
return user;
|
||||
};
|
||||
|
||||
export { setupAccount, completeAccount };
|
||||
/**
|
||||
* Check if device with ip [ip] and user-agent [userAgent] has been seen for user [user].
|
||||
* If the device is unseen, then notify the user of the new device
|
||||
* @param {Object} obj
|
||||
* @param {String} obj.ip - login ip address
|
||||
* @param {String} obj.userAgent - login user-agent
|
||||
*/
|
||||
const checkUserDevice = async ({
|
||||
user,
|
||||
ip,
|
||||
userAgent
|
||||
}: {
|
||||
user: IUser;
|
||||
ip: string;
|
||||
userAgent: string;
|
||||
}) => {
|
||||
const isDeviceSeen = user.devices.some((device) => device.ip === ip && device.userAgent === userAgent);
|
||||
|
||||
if (!isDeviceSeen) {
|
||||
// case: unseen login ip detected for user
|
||||
// -> notify user about the sign-in from new ip
|
||||
|
||||
user.devices = user.devices.concat([{
|
||||
ip: String(ip),
|
||||
userAgent
|
||||
}]);
|
||||
|
||||
await user.save();
|
||||
|
||||
// send MFA code [code] to [email]
|
||||
await sendMail({
|
||||
template: 'newDevice.handlebars',
|
||||
subjectLine: `Successful login from new device`,
|
||||
recipients: [user.email],
|
||||
substitutions: {
|
||||
email: user.email,
|
||||
timestamp: new Date().toString(),
|
||||
ip,
|
||||
userAgent
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
export { setupAccount, completeAccount, checkUserDevice };
|
||||
|
@ -1,7 +1,7 @@
|
||||
import axios from "axios";
|
||||
import * as Sentry from "@sentry/node";
|
||||
import { Octokit } from "@octokit/rest";
|
||||
import { IIntegrationAuth } from "../models";
|
||||
import request from '../config/request';
|
||||
import {
|
||||
INTEGRATION_AZURE_KEY_VAULT,
|
||||
INTEGRATION_AWS_PARAMETER_STORE,
|
||||
@ -10,15 +10,19 @@ import {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
INTEGRATION_HEROKU_API_URL,
|
||||
INTEGRATION_GITLAB_API_URL,
|
||||
INTEGRATION_VERCEL_API_URL,
|
||||
INTEGRATION_NETLIFY_API_URL,
|
||||
INTEGRATION_RENDER_API_URL,
|
||||
INTEGRATION_FLYIO_API_URL,
|
||||
INTEGRATION_CIRCLECI_API_URL,
|
||||
INTEGRATION_TRAVISCI_API_URL,
|
||||
} from "../variables";
|
||||
|
||||
/**
|
||||
@ -42,7 +46,7 @@ const getApps = async ({
|
||||
owner?: string;
|
||||
}
|
||||
|
||||
let apps: App[];
|
||||
let apps: App[] = [];
|
||||
try {
|
||||
switch (integrationAuth.integration) {
|
||||
case INTEGRATION_AZURE_KEY_VAULT:
|
||||
@ -75,6 +79,11 @@ const getApps = async ({
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_GITLAB:
|
||||
apps = await getAppsGitlab({
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_RENDER:
|
||||
apps = await getAppsRender({
|
||||
accessToken,
|
||||
@ -90,6 +99,11 @@ const getApps = async ({
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_TRAVISCI:
|
||||
apps = await getAppsTravisCI({
|
||||
accessToken,
|
||||
})
|
||||
break;
|
||||
}
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
@ -111,7 +125,7 @@ const getAppsHeroku = async ({ accessToken }: { accessToken: string }) => {
|
||||
let apps;
|
||||
try {
|
||||
const res = (
|
||||
await axios.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
|
||||
await request.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
|
||||
headers: {
|
||||
Accept: "application/vnd.heroku+json; version=3",
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
@ -148,7 +162,7 @@ const getAppsVercel = async ({
|
||||
let apps;
|
||||
try {
|
||||
const res = (
|
||||
await axios.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
|
||||
await request.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
@ -186,7 +200,7 @@ const getAppsNetlify = async ({ accessToken }: { accessToken: string }) => {
|
||||
let apps;
|
||||
try {
|
||||
const res = (
|
||||
await axios.get(`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`, {
|
||||
await request.get(`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`, {
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
@ -210,9 +224,9 @@ const getAppsNetlify = async ({ accessToken }: { accessToken: string }) => {
|
||||
/**
|
||||
* Return list of repositories for Github integration
|
||||
* @param {Object} obj
|
||||
* @param {String} obj.accessToken - access token for Netlify API
|
||||
* @returns {Object[]} apps - names of Netlify sites
|
||||
* @returns {String} apps.name - name of Netlify site
|
||||
* @param {String} obj.accessToken - access token for Github API
|
||||
* @returns {Object[]} apps - names of Github sites
|
||||
* @returns {String} apps.name - name of Github site
|
||||
*/
|
||||
const getAppsGithub = async ({ accessToken }: { accessToken: string }) => {
|
||||
let apps;
|
||||
@ -257,7 +271,7 @@ const getAppsRender = async ({ accessToken }: { accessToken: string }) => {
|
||||
let apps: any;
|
||||
try {
|
||||
const res = (
|
||||
await axios.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
|
||||
await request.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
Accept: 'application/json',
|
||||
@ -303,23 +317,18 @@ const getAppsFlyio = async ({ accessToken }: { accessToken: string }) => {
|
||||
}
|
||||
`;
|
||||
|
||||
const res = (
|
||||
await axios({
|
||||
url: INTEGRATION_FLYIO_API_URL,
|
||||
method: "post",
|
||||
headers: {
|
||||
Authorization: "Bearer " + accessToken,
|
||||
const res = (await request.post(INTEGRATION_FLYIO_API_URL, {
|
||||
query,
|
||||
variables: {
|
||||
role: null,
|
||||
},
|
||||
}, {
|
||||
headers: {
|
||||
Authorization: "Bearer " + accessToken,
|
||||
'Accept': 'application/json',
|
||||
'Accept-Encoding': 'application/json',
|
||||
},
|
||||
data: {
|
||||
query,
|
||||
variables: {
|
||||
role: null,
|
||||
},
|
||||
},
|
||||
})
|
||||
).data.data.apps.nodes;
|
||||
},
|
||||
})).data.data.apps.nodes;
|
||||
|
||||
apps = res.map((a: any) => ({
|
||||
name: a.name,
|
||||
@ -344,7 +353,7 @@ const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
|
||||
let apps: any;
|
||||
try {
|
||||
const res = (
|
||||
await axios.get(
|
||||
await request.get(
|
||||
`${INTEGRATION_CIRCLECI_API_URL}/v1.1/projects`,
|
||||
{
|
||||
headers: {
|
||||
@ -369,4 +378,84 @@ const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
|
||||
return apps;
|
||||
};
|
||||
|
||||
const getAppsTravisCI = async ({ accessToken }: { accessToken: string }) => {
|
||||
let apps: any;
|
||||
try {
|
||||
const res = (
|
||||
await request.get(
|
||||
`${INTEGRATION_TRAVISCI_API_URL}/repos`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `token ${accessToken}`,
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
).data;
|
||||
|
||||
apps = res?.map((a: any) => {
|
||||
return {
|
||||
name: a?.slug?.split("/")[1],
|
||||
appId: a?.id,
|
||||
}
|
||||
});
|
||||
}catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error("Failed to get TravisCI projects");
|
||||
}
|
||||
|
||||
return apps;
|
||||
}
|
||||
|
||||
/**
|
||||
* Return list of repositories for GitLab integration
|
||||
* @param {Object} obj
|
||||
* @param {String} obj.accessToken - access token for GitLab API
|
||||
* @returns {Object[]} apps - names of GitLab sites
|
||||
* @returns {String} apps.name - name of GitLab site
|
||||
*/
|
||||
const getAppsGitlab = async ({ accessToken }: {accessToken: string}) => {
|
||||
let apps;
|
||||
|
||||
try {
|
||||
const { id } = (
|
||||
await request.get(
|
||||
`${INTEGRATION_GITLAB_API_URL}/v4/user`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `Bearer ${accessToken}`,
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
).data;
|
||||
|
||||
const res = (
|
||||
await request.get(
|
||||
`${INTEGRATION_GITLAB_API_URL}/v4/users/${id}/projects`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `Bearer ${accessToken}`,
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
).data;
|
||||
|
||||
apps = res?.map((a: any) => {
|
||||
return {
|
||||
name: a?.name,
|
||||
appId: `${a?.id}`,
|
||||
}
|
||||
});
|
||||
}catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error("Failed to get GitLab repos");
|
||||
}
|
||||
|
||||
return apps;
|
||||
}
|
||||
|
||||
export { getApps };
|
||||
|
@ -1,4 +1,4 @@
|
||||
import axios from 'axios';
|
||||
import request from '../config/request';
|
||||
import * as Sentry from '@sentry/node';
|
||||
import {
|
||||
INTEGRATION_AZURE_KEY_VAULT,
|
||||
@ -6,11 +6,13 @@ import {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_AZURE_TOKEN_URL,
|
||||
INTEGRATION_HEROKU_TOKEN_URL,
|
||||
INTEGRATION_VERCEL_TOKEN_URL,
|
||||
INTEGRATION_NETLIFY_TOKEN_URL,
|
||||
INTEGRATION_GITHUB_TOKEN_URL
|
||||
INTEGRATION_GITHUB_TOKEN_URL,
|
||||
INTEGRATION_GITLAB_TOKEN_URL,
|
||||
} from '../variables';
|
||||
import {
|
||||
SITE_URL,
|
||||
@ -18,11 +20,13 @@ import {
|
||||
CLIENT_ID_VERCEL,
|
||||
CLIENT_ID_NETLIFY,
|
||||
CLIENT_ID_GITHUB,
|
||||
CLIENT_ID_GITLAB,
|
||||
CLIENT_SECRET_AZURE,
|
||||
CLIENT_SECRET_HEROKU,
|
||||
CLIENT_SECRET_VERCEL,
|
||||
CLIENT_SECRET_NETLIFY,
|
||||
CLIENT_SECRET_GITHUB
|
||||
CLIENT_SECRET_GITHUB,
|
||||
CLIENT_SECRET_GITLAB,
|
||||
} from '../config';
|
||||
|
||||
interface ExchangeCodeAzureResponse {
|
||||
@ -66,6 +70,15 @@ interface ExchangeCodeGithubResponse {
|
||||
token_type: string;
|
||||
}
|
||||
|
||||
interface ExchangeCodeGitlabResponse {
|
||||
access_token: string;
|
||||
token_type: string;
|
||||
expires_in: string;
|
||||
refresh_token: string;
|
||||
scope: string;
|
||||
created_at: number;
|
||||
}
|
||||
|
||||
/**
|
||||
* Return [accessToken], [accessExpiresAt], and [refreshToken] for OAuth2
|
||||
* code-token exchange for integration named [integration]
|
||||
@ -114,6 +127,10 @@ const exchangeCode = async ({
|
||||
code
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_GITLAB:
|
||||
obj = await exchangeCodeGitlab({
|
||||
code
|
||||
});
|
||||
}
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
@ -136,9 +153,9 @@ const exchangeCodeAzure = async ({
|
||||
const accessExpiresAt = new Date();
|
||||
let res: ExchangeCodeAzureResponse;
|
||||
try {
|
||||
res = (await axios.post(
|
||||
res = (await request.post(
|
||||
INTEGRATION_AZURE_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
new URLSearchParams({
|
||||
grant_type: 'authorization_code',
|
||||
code: code,
|
||||
scope: 'https://vault.azure.net/.default openid offline_access',
|
||||
@ -147,16 +164,16 @@ const exchangeCodeAzure = async ({
|
||||
redirect_uri: `${SITE_URL}/integrations/azure-key-vault/oauth2/callback`
|
||||
} as any)
|
||||
)).data;
|
||||
|
||||
|
||||
accessExpiresAt.setSeconds(
|
||||
accessExpiresAt.getSeconds() + res.expires_in
|
||||
accessExpiresAt.getSeconds() + res.expires_in
|
||||
);
|
||||
} catch (err: any) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error('Failed OAuth2 code-token exchange with Azure');
|
||||
}
|
||||
|
||||
|
||||
return ({
|
||||
accessToken: res.access_token,
|
||||
refreshToken: res.refresh_token,
|
||||
@ -175,36 +192,36 @@ const exchangeCodeAzure = async ({
|
||||
* @returns {Date} obj2.accessExpiresAt - date of expiration for access token
|
||||
*/
|
||||
const exchangeCodeHeroku = async ({
|
||||
code
|
||||
code
|
||||
}: {
|
||||
code: string;
|
||||
code: string;
|
||||
}) => {
|
||||
let res: ExchangeCodeHerokuResponse;
|
||||
const accessExpiresAt = new Date();
|
||||
try {
|
||||
res = (await axios.post(
|
||||
INTEGRATION_HEROKU_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
grant_type: 'authorization_code',
|
||||
code: code,
|
||||
client_secret: CLIENT_SECRET_HEROKU
|
||||
} as any)
|
||||
)).data;
|
||||
|
||||
accessExpiresAt.setSeconds(
|
||||
accessExpiresAt.getSeconds() + res.expires_in
|
||||
);
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error('Failed OAuth2 code-token exchange with Heroku');
|
||||
}
|
||||
|
||||
return ({
|
||||
accessToken: res.access_token,
|
||||
refreshToken: res.refresh_token,
|
||||
accessExpiresAt
|
||||
});
|
||||
let res: ExchangeCodeHerokuResponse;
|
||||
const accessExpiresAt = new Date();
|
||||
try {
|
||||
res = (await request.post(
|
||||
INTEGRATION_HEROKU_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
grant_type: 'authorization_code',
|
||||
code: code,
|
||||
client_secret: CLIENT_SECRET_HEROKU
|
||||
} as any)
|
||||
)).data;
|
||||
|
||||
accessExpiresAt.setSeconds(
|
||||
accessExpiresAt.getSeconds() + res.expires_in
|
||||
);
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error('Failed OAuth2 code-token exchange with Heroku');
|
||||
}
|
||||
|
||||
return ({
|
||||
accessToken: res.access_token,
|
||||
refreshToken: res.refresh_token,
|
||||
accessExpiresAt
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
@ -221,7 +238,7 @@ const exchangeCodeVercel = async ({ code }: { code: string }) => {
|
||||
let res: ExchangeCodeVercelResponse;
|
||||
try {
|
||||
res = (
|
||||
await axios.post(
|
||||
await request.post(
|
||||
INTEGRATION_VERCEL_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
code: code,
|
||||
@ -234,7 +251,7 @@ const exchangeCodeVercel = async ({ code }: { code: string }) => {
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error('Failed OAuth2 code-token exchange with Vercel');
|
||||
throw new Error(`Failed OAuth2 code-token exchange with Vercel [err=${err}]`);
|
||||
}
|
||||
|
||||
return {
|
||||
@ -260,7 +277,7 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
|
||||
let accountId;
|
||||
try {
|
||||
res = (
|
||||
await axios.post(
|
||||
await request.post(
|
||||
INTEGRATION_NETLIFY_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
grant_type: 'authorization_code',
|
||||
@ -272,14 +289,14 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
|
||||
)
|
||||
).data;
|
||||
|
||||
const res2 = await axios.get('https://api.netlify.com/api/v1/sites', {
|
||||
const res2 = await request.get('https://api.netlify.com/api/v1/sites', {
|
||||
headers: {
|
||||
Authorization: `Bearer ${res.access_token}`
|
||||
}
|
||||
});
|
||||
|
||||
const res3 = (
|
||||
await axios.get('https://api.netlify.com/api/v1/accounts', {
|
||||
await request.get('https://api.netlify.com/api/v1/accounts', {
|
||||
headers: {
|
||||
Authorization: `Bearer ${res.access_token}`
|
||||
}
|
||||
@ -314,7 +331,7 @@ const exchangeCodeGithub = async ({ code }: { code: string }) => {
|
||||
let res: ExchangeCodeGithubResponse;
|
||||
try {
|
||||
res = (
|
||||
await axios.get(INTEGRATION_GITHUB_TOKEN_URL, {
|
||||
await request.get(INTEGRATION_GITHUB_TOKEN_URL, {
|
||||
params: {
|
||||
client_id: CLIENT_ID_GITHUB,
|
||||
client_secret: CLIENT_SECRET_GITHUB,
|
||||
@ -341,4 +358,48 @@ const exchangeCodeGithub = async ({ code }: { code: string }) => {
|
||||
};
|
||||
};
|
||||
|
||||
/**
|
||||
* Return [accessToken], [accessExpiresAt], and [refreshToken] for Gitlab
|
||||
* code-token exchange
|
||||
* @param {Object} obj1
|
||||
* @param {Object} obj1.code - code for code-token exchange
|
||||
* @returns {Object} obj2
|
||||
* @returns {String} obj2.accessToken - access token for Github API
|
||||
* @returns {String} obj2.refreshToken - refresh token for Github API
|
||||
* @returns {Date} obj2.accessExpiresAt - date of expiration for access token
|
||||
*/
|
||||
const exchangeCodeGitlab = async ({ code }: { code: string }) => {
|
||||
let res: ExchangeCodeGitlabResponse;
|
||||
|
||||
try {
|
||||
res = (
|
||||
await request.post(
|
||||
INTEGRATION_GITLAB_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
grant_type: 'authorization_code',
|
||||
code: code,
|
||||
client_id: CLIENT_ID_GITLAB,
|
||||
client_secret: CLIENT_SECRET_GITLAB,
|
||||
redirect_uri: `${SITE_URL}/integrations/gitlab/oauth2/callback`
|
||||
} as any),
|
||||
{
|
||||
headers: {
|
||||
"Accept-Encoding": "application/json",
|
||||
}
|
||||
}
|
||||
)
|
||||
).data;
|
||||
}catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error('Failed OAuth2 code-token exchange with Gitlab');
|
||||
}
|
||||
|
||||
return {
|
||||
accessToken: res.access_token,
|
||||
refreshToken: null,
|
||||
accessExpiresAt: null
|
||||
};
|
||||
}
|
||||
|
||||
export { exchangeCode };
|
||||
|
@ -1,4 +1,4 @@
|
||||
import axios from 'axios';
|
||||
import request from '../config/request';
|
||||
import * as Sentry from '@sentry/node';
|
||||
import { INTEGRATION_AZURE_KEY_VAULT, INTEGRATION_HEROKU } from '../variables';
|
||||
import {
|
||||
@ -71,7 +71,7 @@ const exchangeRefreshAzure = async ({
|
||||
refreshToken: string;
|
||||
}) => {
|
||||
try {
|
||||
const res: RefreshTokenAzureResponse = (await axios.post(
|
||||
const res: RefreshTokenAzureResponse = (await request.post(
|
||||
INTEGRATION_AZURE_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
client_id: CLIENT_ID_AZURE,
|
||||
@ -105,7 +105,7 @@ const exchangeRefreshHeroku = async ({
|
||||
|
||||
let accessToken;
|
||||
try {
|
||||
const res = await axios.post(
|
||||
const res = await request.post(
|
||||
INTEGRATION_HEROKU_TOKEN_URL,
|
||||
new URLSearchParams({
|
||||
grant_type: 'refresh_token',
|
||||
|
@ -10,7 +10,8 @@ import {
|
||||
INTEGRATION_HEROKU,
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
} from '../variables';
|
||||
|
||||
const revokeAccess = async ({
|
||||
@ -32,6 +33,8 @@ const revokeAccess = async ({
|
||||
break;
|
||||
case INTEGRATION_GITHUB:
|
||||
break;
|
||||
case INTEGRATION_GITLAB:
|
||||
break;
|
||||
}
|
||||
|
||||
deletedIntegrationAuth = await IntegrationAuth.findOneAndDelete({
|
||||
|
@ -1,4 +1,3 @@
|
||||
import axios from "axios";
|
||||
import * as Sentry from "@sentry/node";
|
||||
import _ from 'lodash';
|
||||
import AWS from 'aws-sdk';
|
||||
@ -20,17 +19,21 @@ import {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
INTEGRATION_HEROKU_API_URL,
|
||||
INTEGRATION_GITLAB_API_URL,
|
||||
INTEGRATION_VERCEL_API_URL,
|
||||
INTEGRATION_NETLIFY_API_URL,
|
||||
INTEGRATION_RENDER_API_URL,
|
||||
INTEGRATION_FLYIO_API_URL,
|
||||
INTEGRATION_CIRCLECI_API_URL,
|
||||
INTEGRATION_TRAVISCI_API_URL,
|
||||
} from "../variables";
|
||||
import { access, appendFile } from "fs";
|
||||
import request from '../config/request';
|
||||
|
||||
/**
|
||||
* Sync/push [secrets] to [app] in integration named [integration]
|
||||
@ -109,6 +112,13 @@ const syncSecrets = async ({
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_GITLAB:
|
||||
await syncSecretsGitLab({
|
||||
integration,
|
||||
secrets,
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_RENDER:
|
||||
await syncSecretsRender({
|
||||
integration,
|
||||
@ -129,6 +139,14 @@ const syncSecrets = async ({
|
||||
secrets,
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
case INTEGRATION_TRAVISCI:
|
||||
await syncSecretsTravisCI({
|
||||
integration,
|
||||
secrets,
|
||||
accessToken,
|
||||
});
|
||||
break;
|
||||
}
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
@ -179,9 +197,10 @@ const syncSecretsAzureKeyVault = async ({
|
||||
let result: GetAzureKeyVaultSecret[] = [];
|
||||
|
||||
while (url) {
|
||||
const res = await axios.get(url, {
|
||||
const res = await request.get(url, {
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
}
|
||||
});
|
||||
|
||||
@ -200,9 +219,10 @@ const syncSecretsAzureKeyVault = async ({
|
||||
lastSlashIndex = getAzureKeyVaultSecret.id.lastIndexOf('/');
|
||||
}
|
||||
|
||||
const azureKeyVaultSecret = await axios.get(`${getAzureKeyVaultSecret.id}?api-version=7.3`, {
|
||||
const azureKeyVaultSecret = await request.get(`${getAzureKeyVaultSecret.id}?api-version=7.3`, {
|
||||
headers: {
|
||||
'Authorization': `Bearer ${accessToken}`
|
||||
'Authorization': `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
}
|
||||
});
|
||||
|
||||
@ -252,14 +272,15 @@ const syncSecretsAzureKeyVault = async ({
|
||||
// Sync/push set secrets
|
||||
if (setSecrets.length > 0) {
|
||||
setSecrets.forEach(async ({ key, value }) => {
|
||||
await axios.put(
|
||||
await request.put(
|
||||
`${integration.app}/secrets/${key}?api-version=7.3`,
|
||||
{
|
||||
value
|
||||
},
|
||||
{
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
}
|
||||
}
|
||||
);
|
||||
@ -268,9 +289,10 @@ const syncSecretsAzureKeyVault = async ({
|
||||
|
||||
if (deleteSecrets.length > 0) {
|
||||
deleteSecrets.forEach(async (secret) => {
|
||||
await axios.delete(`${integration.app}/secrets/${secret.key}?api-version=7.3`, {
|
||||
await request.delete(`${integration.app}/secrets/${secret.key}?api-version=7.3`, {
|
||||
headers: {
|
||||
'Authorization': `Bearer ${accessToken}`
|
||||
'Authorization': `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
}
|
||||
});
|
||||
});
|
||||
@ -482,12 +504,13 @@ const syncSecretsHeroku = async ({
|
||||
}) => {
|
||||
try {
|
||||
const herokuSecrets = (
|
||||
await axios.get(
|
||||
await request.get(
|
||||
`${INTEGRATION_HEROKU_API_URL}/apps/${integration.app}/config-vars`,
|
||||
{
|
||||
headers: {
|
||||
Accept: "application/vnd.heroku+json; version=3",
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
)
|
||||
@ -499,13 +522,14 @@ const syncSecretsHeroku = async ({
|
||||
}
|
||||
});
|
||||
|
||||
await axios.patch(
|
||||
await request.patch(
|
||||
`${INTEGRATION_HEROKU_API_URL}/apps/${integration.app}/config-vars`,
|
||||
secrets,
|
||||
{
|
||||
headers: {
|
||||
Accept: "application/vnd.heroku+json; version=3",
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
@ -540,7 +564,7 @@ const syncSecretsVercel = async ({
|
||||
value: string;
|
||||
target: string[];
|
||||
}
|
||||
|
||||
|
||||
try {
|
||||
// Get all (decrypted) secrets back from Vercel in
|
||||
// decrypted format
|
||||
@ -552,39 +576,85 @@ const syncSecretsVercel = async ({
|
||||
}
|
||||
: {}),
|
||||
};
|
||||
|
||||
// const res = (
|
||||
// await Promise.all(
|
||||
// (
|
||||
// await request.get(
|
||||
// `${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
|
||||
// {
|
||||
// params,
|
||||
// headers: {
|
||||
// Authorization: `Bearer ${accessToken}`,
|
||||
// 'Accept-Encoding': 'application/json'
|
||||
// }
|
||||
// }
|
||||
// ))
|
||||
// .data
|
||||
// .envs
|
||||
// .filter((secret: VercelSecret) => secret.target.includes(integration.targetEnvironment))
|
||||
// .map(async (secret: VercelSecret) => {
|
||||
// if (secret.type === 'encrypted') {
|
||||
// // case: secret is encrypted -> need to decrypt
|
||||
// const decryptedSecret = (await request.get(
|
||||
// `${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
|
||||
// {
|
||||
// params,
|
||||
// headers: {
|
||||
// Authorization: `Bearer ${accessToken}`,
|
||||
// 'Accept-Encoding': 'application/json'
|
||||
// }
|
||||
// }
|
||||
// )).data;
|
||||
|
||||
const res = (
|
||||
await Promise.all(
|
||||
(
|
||||
await axios.get(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
|
||||
// return decryptedSecret;
|
||||
// }
|
||||
|
||||
// return secret;
|
||||
// }))).reduce((obj: any, secret: any) => ({
|
||||
// ...obj,
|
||||
// [secret.key]: secret
|
||||
// }), {});
|
||||
|
||||
const vercelSecrets: VercelSecret[] = (await request.get(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
}
|
||||
}
|
||||
))
|
||||
.data
|
||||
.envs
|
||||
.filter((secret: VercelSecret) => secret.target.includes(integration.targetEnvironment));
|
||||
|
||||
const res: { [key: string]: VercelSecret } = {};
|
||||
|
||||
for await (const vercelSecret of vercelSecrets) {
|
||||
if (vercelSecret.type === 'encrypted') {
|
||||
// case: secret is encrypted -> need to decrypt
|
||||
const decryptedSecret = (await request.get(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${vercelSecret.id}`,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
}
|
||||
}
|
||||
))
|
||||
.data
|
||||
.envs
|
||||
.filter((secret: VercelSecret) => secret.target.includes(integration.targetEnvironment))
|
||||
.map(async (secret: VercelSecret) => (await axios.get(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`
|
||||
}
|
||||
}
|
||||
)).data)
|
||||
)).reduce((obj: any, secret: any) => ({
|
||||
...obj,
|
||||
[secret.key]: secret
|
||||
}), {});
|
||||
}
|
||||
)).data;
|
||||
|
||||
const updateSecrets: VercelSecret[] = [];
|
||||
const deleteSecrets: VercelSecret[] = [];
|
||||
const newSecrets: VercelSecret[] = [];
|
||||
res[vercelSecret.key] = decryptedSecret;
|
||||
} else {
|
||||
res[vercelSecret.key] = vercelSecret;
|
||||
}
|
||||
}
|
||||
|
||||
const updateSecrets: VercelSecret[] = [];
|
||||
const deleteSecrets: VercelSecret[] = [];
|
||||
const newSecrets: VercelSecret[] = [];
|
||||
|
||||
// Identify secrets to create
|
||||
Object.keys(secrets).map((key) => {
|
||||
@ -608,8 +678,10 @@ const syncSecretsVercel = async ({
|
||||
id: res[key].id,
|
||||
key: key,
|
||||
value: secrets[key],
|
||||
type: "encrypted",
|
||||
target: [integration.targetEnvironment],
|
||||
type: res[key].type,
|
||||
target: res[key].target.includes(integration.targetEnvironment)
|
||||
? [...res[key].target]
|
||||
: [...res[key].target, integration.targetEnvironment]
|
||||
});
|
||||
}
|
||||
} else {
|
||||
@ -618,7 +690,7 @@ const syncSecretsVercel = async ({
|
||||
id: res[key].id,
|
||||
key: key,
|
||||
value: res[key].value,
|
||||
type: "encrypted",
|
||||
type: "encrypted", // value doesn't matter
|
||||
target: [integration.targetEnvironment],
|
||||
});
|
||||
}
|
||||
@ -626,48 +698,47 @@ const syncSecretsVercel = async ({
|
||||
|
||||
// Sync/push new secrets
|
||||
if (newSecrets.length > 0) {
|
||||
await axios.post(
|
||||
await request.post(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v10/projects/${integration.app}/env`,
|
||||
newSecrets,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// Sync/push updated secrets
|
||||
if (updateSecrets.length > 0) {
|
||||
updateSecrets.forEach(async (secret: VercelSecret) => {
|
||||
|
||||
for await (const secret of updateSecrets) {
|
||||
if (secret.type !== 'sensitive') {
|
||||
const { id, ...updatedSecret } = secret;
|
||||
await axios.patch(
|
||||
await request.patch(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
|
||||
updatedSecret,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Delete secrets
|
||||
if (deleteSecrets.length > 0) {
|
||||
deleteSecrets.forEach(async (secret: VercelSecret) => {
|
||||
await axios.delete(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
},
|
||||
}
|
||||
);
|
||||
});
|
||||
for await (const secret of deleteSecrets) {
|
||||
await request.delete(
|
||||
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
|
||||
{
|
||||
params,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
}
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
@ -717,12 +788,13 @@ const syncSecretsNetlify = async ({
|
||||
});
|
||||
|
||||
const res = (
|
||||
await axios.get(
|
||||
await request.get(
|
||||
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env`,
|
||||
{
|
||||
params: getParams,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
)
|
||||
@ -830,13 +902,14 @@ const syncSecretsNetlify = async ({
|
||||
});
|
||||
|
||||
if (newSecrets.length > 0) {
|
||||
await axios.post(
|
||||
await request.post(
|
||||
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env`,
|
||||
newSecrets,
|
||||
{
|
||||
params: syncParams,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
@ -844,7 +917,7 @@ const syncSecretsNetlify = async ({
|
||||
|
||||
if (updateSecrets.length > 0) {
|
||||
updateSecrets.forEach(async (secret: NetlifySecret) => {
|
||||
await axios.patch(
|
||||
await request.patch(
|
||||
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${secret.key}`,
|
||||
{
|
||||
context: secret.values[0].context,
|
||||
@ -854,6 +927,7 @@ const syncSecretsNetlify = async ({
|
||||
params: syncParams,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
@ -862,12 +936,13 @@ const syncSecretsNetlify = async ({
|
||||
|
||||
if (deleteSecrets.length > 0) {
|
||||
deleteSecrets.forEach(async (key: string) => {
|
||||
await axios.delete(
|
||||
await request.delete(
|
||||
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${key}`,
|
||||
{
|
||||
params: syncParams,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
@ -876,12 +951,13 @@ const syncSecretsNetlify = async ({
|
||||
|
||||
if (deleteSecretValues.length > 0) {
|
||||
deleteSecretValues.forEach(async (secret: NetlifySecret) => {
|
||||
await axios.delete(
|
||||
await request.delete(
|
||||
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${secret.key}/value/${secret.values[0].id}`,
|
||||
{
|
||||
params: syncParams,
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
@ -1026,7 +1102,7 @@ const syncSecretsRender = async ({
|
||||
accessToken: string;
|
||||
}) => {
|
||||
try {
|
||||
await axios.put(
|
||||
await request.put(
|
||||
`${INTEGRATION_RENDER_API_URL}/v1/services/${integration.appId}/env-vars`,
|
||||
Object.keys(secrets).map((key) => ({
|
||||
key,
|
||||
@ -1035,6 +1111,7 @@ const syncSecretsRender = async ({
|
||||
{
|
||||
headers: {
|
||||
Authorization: `Bearer ${accessToken}`,
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
}
|
||||
);
|
||||
@ -1083,23 +1160,21 @@ const syncSecretsFlyio = async ({
|
||||
}
|
||||
`;
|
||||
|
||||
await axios({
|
||||
url: INTEGRATION_FLYIO_API_URL,
|
||||
method: "post",
|
||||
await request.post(INTEGRATION_FLYIO_API_URL, {
|
||||
query: SetSecrets,
|
||||
variables: {
|
||||
input: {
|
||||
appId: integration.app,
|
||||
secrets: Object.entries(secrets).map(([key, value]) => ({
|
||||
key,
|
||||
value,
|
||||
})),
|
||||
},
|
||||
},
|
||||
}, {
|
||||
headers: {
|
||||
Authorization: "Bearer " + accessToken,
|
||||
},
|
||||
data: {
|
||||
query: SetSecrets,
|
||||
variables: {
|
||||
input: {
|
||||
appId: integration.app,
|
||||
secrets: Object.entries(secrets).map(([key, value]) => ({
|
||||
key,
|
||||
value,
|
||||
})),
|
||||
},
|
||||
},
|
||||
'Accept-Encoding': 'application/json',
|
||||
},
|
||||
});
|
||||
|
||||
@ -1120,23 +1195,18 @@ const syncSecretsFlyio = async ({
|
||||
}
|
||||
}`;
|
||||
|
||||
const getSecretsRes = (
|
||||
await axios({
|
||||
method: "post",
|
||||
url: INTEGRATION_FLYIO_API_URL,
|
||||
headers: {
|
||||
'Authorization': 'Bearer ' + accessToken,
|
||||
'Content-Type': 'application/json',
|
||||
'Accept-Encoding': 'application/json'
|
||||
},
|
||||
data: {
|
||||
query: GetSecrets,
|
||||
variables: {
|
||||
appName: integration.app,
|
||||
},
|
||||
},
|
||||
})
|
||||
).data.data.app.secrets;
|
||||
const getSecretsRes = (await request.post(INTEGRATION_FLYIO_API_URL, {
|
||||
query: GetSecrets,
|
||||
variables: {
|
||||
appName: integration.app,
|
||||
},
|
||||
}, {
|
||||
headers: {
|
||||
Authorization: "Bearer " + accessToken,
|
||||
'Content-Type': 'application/json',
|
||||
'Accept-Encoding': 'application/json',
|
||||
},
|
||||
})).data.data.app.secrets;
|
||||
|
||||
const deleteSecretsKeys = getSecretsRes
|
||||
.filter((secret: FlyioSecret) => !(secret.name in secrets))
|
||||
@ -1161,23 +1231,22 @@ const syncSecretsFlyio = async ({
|
||||
}
|
||||
}`;
|
||||
|
||||
await axios({
|
||||
method: "post",
|
||||
url: INTEGRATION_FLYIO_API_URL,
|
||||
await request.post(INTEGRATION_FLYIO_API_URL, {
|
||||
query: DeleteSecrets,
|
||||
variables: {
|
||||
input: {
|
||||
appId: integration.app,
|
||||
keys: deleteSecretsKeys,
|
||||
},
|
||||
},
|
||||
}, {
|
||||
headers: {
|
||||
Authorization: "Bearer " + accessToken,
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
data: {
|
||||
query: DeleteSecrets,
|
||||
variables: {
|
||||
input: {
|
||||
appId: integration.app,
|
||||
keys: deleteSecretsKeys,
|
||||
},
|
||||
},
|
||||
'Accept-Encoding': 'application/json',
|
||||
},
|
||||
});
|
||||
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
@ -1203,7 +1272,7 @@ const syncSecretsCircleCI = async ({
|
||||
}) => {
|
||||
try {
|
||||
const circleciOrganizationDetail = (
|
||||
await axios.get(`${INTEGRATION_CIRCLECI_API_URL}/v2/me/collaborations`, {
|
||||
await request.get(`${INTEGRATION_CIRCLECI_API_URL}/v2/me/collaborations`, {
|
||||
headers: {
|
||||
"Circle-Token": accessToken,
|
||||
"Accept-Encoding": "application/json",
|
||||
@ -1216,7 +1285,7 @@ const syncSecretsCircleCI = async ({
|
||||
// sync secrets to CircleCI
|
||||
Object.keys(secrets).forEach(
|
||||
async (key) =>
|
||||
await axios.post(
|
||||
await request.post(
|
||||
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar`,
|
||||
{
|
||||
name: key,
|
||||
@ -1233,7 +1302,7 @@ const syncSecretsCircleCI = async ({
|
||||
|
||||
// get secrets from CircleCI
|
||||
const getSecretsRes = (
|
||||
await axios.get(
|
||||
await request.get(
|
||||
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar`,
|
||||
{
|
||||
headers: {
|
||||
@ -1247,7 +1316,7 @@ const syncSecretsCircleCI = async ({
|
||||
// delete secrets from CircleCI
|
||||
getSecretsRes.forEach(async (sec: any) => {
|
||||
if (!(sec.name in secrets)) {
|
||||
await axios.delete(
|
||||
await request.delete(
|
||||
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar/${sec.name}`,
|
||||
{
|
||||
headers: {
|
||||
@ -1265,4 +1334,196 @@ const syncSecretsCircleCI = async ({
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Sync/push [secrets] to TravisCI project
|
||||
* @param {Object} obj
|
||||
* @param {IIntegration} obj.integration - integration details
|
||||
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
|
||||
* @param {String} obj.accessToken - access token for TravisCI integration
|
||||
*/
|
||||
const syncSecretsTravisCI = async ({
|
||||
integration,
|
||||
secrets,
|
||||
accessToken,
|
||||
}: {
|
||||
integration: IIntegration;
|
||||
secrets: any;
|
||||
accessToken: string;
|
||||
}) => {
|
||||
try {
|
||||
// get secrets from travis-ci
|
||||
const getSecretsRes = (
|
||||
await request.get(
|
||||
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars?repository_id=${integration.appId}`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `token ${accessToken}`,
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
)
|
||||
.data
|
||||
?.env_vars
|
||||
.reduce((obj: any, secret: any) => ({
|
||||
...obj,
|
||||
[secret.name]: secret
|
||||
}), {});
|
||||
|
||||
// add secrets
|
||||
for await (const key of Object.keys(secrets)) {
|
||||
if (!(key in getSecretsRes)) {
|
||||
// case: secret does not exist in travis ci
|
||||
// -> add secret
|
||||
await request.post(
|
||||
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars?repository_id=${integration.appId}`,
|
||||
{
|
||||
env_var: {
|
||||
name: key,
|
||||
value: secrets[key]
|
||||
}
|
||||
},
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `token ${accessToken}`,
|
||||
"Content-Type": "application/json",
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
);
|
||||
} else {
|
||||
// case: secret exists in travis ci
|
||||
// -> update/set secret
|
||||
await request.patch(
|
||||
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars/${getSecretsRes[key].id}?repository_id=${getSecretsRes[key].repository_id}`,
|
||||
{
|
||||
env_var: {
|
||||
name: key,
|
||||
value: secrets[key],
|
||||
}
|
||||
},
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `token ${accessToken}`,
|
||||
"Content-Type": "application/json",
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
for await (const key of Object.keys(getSecretsRes)) {
|
||||
if (!(key in secrets)){
|
||||
// delete secret
|
||||
await request.delete(
|
||||
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars/${getSecretsRes[key].id}?repository_id=${getSecretsRes[key].repository_id}`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `token ${accessToken}`,
|
||||
"Content-Type": "application/json",
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error("Failed to sync secrets to GitLab");
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Sync/push [secrets] to GitLab repo with name [integration.app]
|
||||
* @param {Object} obj
|
||||
* @param {IIntegration} obj.integration - integration details
|
||||
* @param {IIntegrationAuth} obj.integrationAuth - integration auth details
|
||||
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
|
||||
* @param {String} obj.accessToken - access token for GitLab integration
|
||||
*/
|
||||
const syncSecretsGitLab = async ({
|
||||
integration,
|
||||
secrets,
|
||||
accessToken,
|
||||
}: {
|
||||
integration: IIntegration;
|
||||
secrets: any;
|
||||
accessToken: string;
|
||||
}) => {
|
||||
try {
|
||||
// get secrets from gitlab
|
||||
const getSecretsRes = (
|
||||
await request.get(
|
||||
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `Bearer ${accessToken}`,
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
).data;
|
||||
|
||||
for (const key of Object.keys(secrets)) {
|
||||
const existingSecret = getSecretsRes.find((s: any) => s.key == key);
|
||||
if (!existingSecret) {
|
||||
await request.post(
|
||||
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables`,
|
||||
{
|
||||
key: key,
|
||||
value: secrets[key],
|
||||
protected: false,
|
||||
masked: false,
|
||||
raw: false,
|
||||
environment_scope:'*'
|
||||
},
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `Bearer ${accessToken}`,
|
||||
"Content-Type": "application/json",
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
}else {
|
||||
// udpate secret
|
||||
await request.put(
|
||||
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables/${existingSecret.key}`,
|
||||
{
|
||||
...existingSecret,
|
||||
value: secrets[existingSecret.key]
|
||||
},
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `Bearer ${accessToken}`,
|
||||
"Content-Type": "application/json",
|
||||
"Accept-Encoding": "application/json",
|
||||
},
|
||||
}
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// delete secrets
|
||||
for (const sec of getSecretsRes) {
|
||||
if (!(sec.key in secrets)) {
|
||||
await request.delete(
|
||||
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables/${sec.key}`,
|
||||
{
|
||||
headers: {
|
||||
"Authorization": `Bearer ${accessToken}`,
|
||||
},
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
}catch (err) {
|
||||
Sentry.setUser(null);
|
||||
Sentry.captureException(err);
|
||||
throw new Error("Failed to sync secrets to GitLab");
|
||||
}
|
||||
}
|
||||
|
||||
export { syncSecrets };
|
||||
|
@ -1,4 +1,5 @@
|
||||
import requireAuth from './requireAuth';
|
||||
import requireMfaAuth from './requireMfaAuth';
|
||||
import requireBotAuth from './requireBotAuth';
|
||||
import requireSignupAuth from './requireSignupAuth';
|
||||
import requireWorkspaceAuth from './requireWorkspaceAuth';
|
||||
@ -15,6 +16,7 @@ import validateRequest from './validateRequest';
|
||||
|
||||
export {
|
||||
requireAuth,
|
||||
requireMfaAuth,
|
||||
requireBotAuth,
|
||||
requireSignupAuth,
|
||||
requireWorkspaceAuth,
|
||||
|
@ -1,11 +1,12 @@
|
||||
import { ErrorRequestHandler } from "express";
|
||||
|
||||
import * as Sentry from '@sentry/node';
|
||||
import { InternalServerError } from "../utils/errors";
|
||||
import { InternalServerError, UnauthorizedRequestError } from "../utils/errors";
|
||||
import { getLogger } from "../utils/logger";
|
||||
import RequestError, { LogLevel } from "../utils/requestError";
|
||||
import { NODE_ENV } from "../config";
|
||||
|
||||
import { TokenExpiredError } from 'jsonwebtoken';
|
||||
|
||||
export const requestErrorHandler: ErrorRequestHandler = (error: RequestError | Error, req, res, next) => {
|
||||
if (res.headersSent) return next();
|
||||
|
43
backend/src/middleware/requireMfaAuth.ts
Normal file
43
backend/src/middleware/requireMfaAuth.ts
Normal file
@ -0,0 +1,43 @@
|
||||
import jwt from 'jsonwebtoken';
|
||||
import { Request, Response, NextFunction } from 'express';
|
||||
import { User } from '../models';
|
||||
import { JWT_MFA_SECRET } from '../config';
|
||||
import { BadRequestError, UnauthorizedRequestError } from '../utils/errors';
|
||||
|
||||
declare module 'jsonwebtoken' {
|
||||
export interface UserIDJwtPayload extends jwt.JwtPayload {
|
||||
userId: string;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Validate if (MFA) JWT temporary token on request is valid (e.g. not expired)
|
||||
* and if there is an associated user.
|
||||
*/
|
||||
const requireMfaAuth = async (
|
||||
req: Request,
|
||||
res: Response,
|
||||
next: NextFunction
|
||||
) => {
|
||||
// JWT (temporary) authentication middleware for complete signup
|
||||
const [ AUTH_TOKEN_TYPE, AUTH_TOKEN_VALUE ] = <[string, string]>req.headers['authorization']?.split(' ', 2) ?? [null, null]
|
||||
if(AUTH_TOKEN_TYPE === null) return next(BadRequestError({message: `Missing Authorization Header in the request header.`}))
|
||||
if(AUTH_TOKEN_TYPE.toLowerCase() !== 'bearer') return next(BadRequestError({message: `The provided authentication type '${AUTH_TOKEN_TYPE}' is not supported.`}))
|
||||
if(AUTH_TOKEN_VALUE === null) return next(BadRequestError({message: 'Missing Authorization Body in the request header'}))
|
||||
|
||||
const decodedToken = <jwt.UserIDJwtPayload>(
|
||||
jwt.verify(AUTH_TOKEN_VALUE, JWT_MFA_SECRET)
|
||||
);
|
||||
|
||||
const user = await User.findOne({
|
||||
_id: decodedToken.userId
|
||||
}).select('+publicKey');
|
||||
|
||||
if (!user)
|
||||
return next(UnauthorizedRequestError({message: 'Unable to authenticate for User account completion. Try logging in again'}))
|
||||
|
||||
req.user = user;
|
||||
return next();
|
||||
};
|
||||
|
||||
export default requireMfaAuth;
|
@ -10,7 +10,7 @@ import MembershipOrg, { IMembershipOrg } from './membershipOrg';
|
||||
import Organization, { IOrganization } from './organization';
|
||||
import Secret, { ISecret } from './secret';
|
||||
import ServiceToken, { IServiceToken } from './serviceToken';
|
||||
import Token, { IToken } from './token';
|
||||
import TokenData, { ITokenData } from './tokenData';
|
||||
import User, { IUser } from './user';
|
||||
import UserAction, { IUserAction } from './userAction';
|
||||
import Workspace, { IWorkspace } from './workspace';
|
||||
@ -43,8 +43,8 @@ export {
|
||||
ISecret,
|
||||
ServiceToken,
|
||||
IServiceToken,
|
||||
Token,
|
||||
IToken,
|
||||
TokenData,
|
||||
ITokenData,
|
||||
User,
|
||||
IUser,
|
||||
UserAction,
|
||||
|
@ -7,9 +7,11 @@ import {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
} from "../variables";
|
||||
|
||||
export interface IIntegration {
|
||||
@ -30,10 +32,12 @@ export interface IIntegration {
|
||||
| 'heroku'
|
||||
| 'vercel'
|
||||
| 'netlify'
|
||||
| 'github'
|
||||
| 'github'
|
||||
| 'gitlab'
|
||||
| 'render'
|
||||
| 'flyio'
|
||||
| 'circleci';
|
||||
| 'circleci'
|
||||
| 'travisci';
|
||||
integrationAuth: Types.ObjectId;
|
||||
}
|
||||
|
||||
@ -94,9 +98,11 @@ const integrationSchema = new Schema<IIntegration>(
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
],
|
||||
required: true,
|
||||
},
|
||||
|
@ -7,15 +7,17 @@ import {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
} from "../variables";
|
||||
|
||||
export interface IIntegrationAuth {
|
||||
_id: Types.ObjectId;
|
||||
workspace: Types.ObjectId;
|
||||
integration: 'heroku' | 'vercel' | 'netlify' | 'github' | 'render' | 'flyio' | 'azure-key-vault' | 'circleci' | 'aws-parameter-store' | 'aws-secret-manager';
|
||||
integration: 'heroku' | 'vercel' | 'netlify' | 'github' | 'gitlab' | 'render' | 'flyio' | 'azure-key-vault' | 'circleci' | 'travisci' | 'aws-parameter-store' | 'aws-secret-manager';
|
||||
teamId: string;
|
||||
accountId: string;
|
||||
refreshCiphertext?: string;
|
||||
@ -47,9 +49,11 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
],
|
||||
required: true,
|
||||
},
|
||||
|
@ -5,7 +5,7 @@ export interface IToken {
|
||||
email: string;
|
||||
token: string;
|
||||
createdAt: Date;
|
||||
ttl: Number;
|
||||
ttl: number;
|
||||
}
|
||||
|
||||
const tokenSchema = new Schema<IToken>({
|
||||
|
55
backend/src/models/tokenData.ts
Normal file
55
backend/src/models/tokenData.ts
Normal file
@ -0,0 +1,55 @@
|
||||
import { Schema, Types, model } from 'mongoose';
|
||||
|
||||
export interface ITokenData {
|
||||
type: string;
|
||||
email?: string;
|
||||
phoneNumber?: string;
|
||||
organization?: Types.ObjectId;
|
||||
tokenHash: string;
|
||||
triesLeft?: number;
|
||||
expiresAt: Date;
|
||||
createdAt: Date;
|
||||
updatedAt: Date;
|
||||
}
|
||||
|
||||
const tokenDataSchema = new Schema<ITokenData>({
|
||||
type: {
|
||||
type: String,
|
||||
enum: [
|
||||
'emailConfirmation',
|
||||
'emailMfa',
|
||||
'organizationInvitation',
|
||||
'passwordReset'
|
||||
],
|
||||
required: true
|
||||
},
|
||||
email: {
|
||||
type: String
|
||||
},
|
||||
phoneNumber: {
|
||||
type: String
|
||||
},
|
||||
organization: { // organizationInvitation-specific field
|
||||
type: Schema.Types.ObjectId,
|
||||
ref: 'Organization'
|
||||
},
|
||||
tokenHash: {
|
||||
type: String,
|
||||
select: false,
|
||||
required: true
|
||||
},
|
||||
triesLeft: {
|
||||
type: Number
|
||||
},
|
||||
expiresAt: {
|
||||
type: Date,
|
||||
expires: 0,
|
||||
required: true
|
||||
}
|
||||
}, {
|
||||
timestamps: true
|
||||
});
|
||||
|
||||
const TokenData = model<ITokenData>('TokenData', tokenDataSchema);
|
||||
|
||||
export default TokenData;
|
@ -1,10 +1,14 @@
|
||||
import { Schema, model, Types } from 'mongoose';
|
||||
import { Schema, model, Types, Document } from 'mongoose';
|
||||
|
||||
export interface IUser {
|
||||
export interface IUser extends Document {
|
||||
_id: Types.ObjectId;
|
||||
email: string;
|
||||
firstName?: string;
|
||||
lastName?: string;
|
||||
encryptionVersion: number;
|
||||
protectedKey: string;
|
||||
protectedKeyIV: string;
|
||||
protectedKeyTag: string;
|
||||
publicKey?: string;
|
||||
encryptedPrivateKey?: string;
|
||||
iv?: string;
|
||||
@ -12,7 +16,12 @@ export interface IUser {
|
||||
salt?: string;
|
||||
verifier?: string;
|
||||
refreshVersion?: number;
|
||||
seenIps: [string];
|
||||
isMfaEnabled: boolean;
|
||||
mfaMethods: boolean;
|
||||
devices: {
|
||||
ip: string;
|
||||
userAgent: string;
|
||||
}[];
|
||||
}
|
||||
|
||||
const userSchema = new Schema<IUser>(
|
||||
@ -27,6 +36,23 @@ const userSchema = new Schema<IUser>(
|
||||
lastName: {
|
||||
type: String
|
||||
},
|
||||
encryptionVersion: {
|
||||
type: Number,
|
||||
select: false,
|
||||
default: 1 // to resolve backward-compatibility issues
|
||||
},
|
||||
protectedKey: { // introduced as part of encryption version 2
|
||||
type: String,
|
||||
select: false
|
||||
},
|
||||
protectedKeyIV: { // introduced as part of encryption version 2
|
||||
type: String,
|
||||
select: false
|
||||
},
|
||||
protectedKeyTag: { // introduced as part of encryption version 2
|
||||
type: String,
|
||||
select: false
|
||||
},
|
||||
publicKey: {
|
||||
type: String,
|
||||
select: false
|
||||
@ -35,11 +61,11 @@ const userSchema = new Schema<IUser>(
|
||||
type: String,
|
||||
select: false
|
||||
},
|
||||
iv: {
|
||||
iv: { // iv of [encryptedPrivateKey]
|
||||
type: String,
|
||||
select: false
|
||||
},
|
||||
tag: {
|
||||
tag: { // tag of [encryptedPrivateKey]
|
||||
type: String,
|
||||
select: false
|
||||
},
|
||||
@ -56,8 +82,21 @@ const userSchema = new Schema<IUser>(
|
||||
default: 0,
|
||||
select: false
|
||||
},
|
||||
seenIps: [String]
|
||||
},
|
||||
isMfaEnabled: {
|
||||
type: Boolean,
|
||||
default: false
|
||||
},
|
||||
mfaMethods: [{
|
||||
type: String
|
||||
}],
|
||||
devices: {
|
||||
type: [{
|
||||
ip: String,
|
||||
userAgent: String
|
||||
}],
|
||||
default: []
|
||||
}
|
||||
},
|
||||
{
|
||||
timestamps: true
|
||||
}
|
||||
|
@ -7,7 +7,7 @@ import { authLimiter } from '../../helpers/rateLimiter';
|
||||
|
||||
router.post('/token', validateRequest, authController.getNewToken);
|
||||
|
||||
router.post(
|
||||
router.post( // deprecated (moved to api/v2/auth/login1)
|
||||
'/login1',
|
||||
authLimiter,
|
||||
body('email').exists().trim().notEmpty(),
|
||||
@ -16,7 +16,7 @@ router.post(
|
||||
authController.login1
|
||||
);
|
||||
|
||||
router.post(
|
||||
router.post( // deprecated (moved to api/v2/auth/login2)
|
||||
'/login2',
|
||||
authLimiter,
|
||||
body('email').exists().trim().notEmpty(),
|
||||
|
@ -31,7 +31,7 @@ router.patch(
|
||||
requireBotAuth({
|
||||
acceptedRoles: [ADMIN, MEMBER]
|
||||
}),
|
||||
body('isActive').isBoolean(),
|
||||
body('isActive').exists().isBoolean(),
|
||||
body('botKey'),
|
||||
validateRequest,
|
||||
botController.setBotActiveState
|
||||
|
@ -10,7 +10,7 @@ router.post(
|
||||
requireAuth({
|
||||
acceptedAuthModes: ['jwt']
|
||||
}),
|
||||
body('clientPublicKey').exists().trim().notEmpty(),
|
||||
body('clientPublicKey').exists().isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
passwordController.srp1
|
||||
);
|
||||
@ -22,11 +22,14 @@ router.post(
|
||||
acceptedAuthModes: ['jwt']
|
||||
}),
|
||||
body('clientProof').exists().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().trim().notEmpty().notEmpty(), // private key encrypted under new pwd
|
||||
body('iv').exists().trim().notEmpty(), // new iv for private key
|
||||
body('tag').exists().trim().notEmpty(), // new tag for private key
|
||||
body('salt').exists().trim().notEmpty(), // part of new pwd
|
||||
body('verifier').exists().trim().notEmpty(), // part of new pwd
|
||||
body('protectedKey').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyIV').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyTag').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // private key encrypted under new pwd
|
||||
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(), // new iv for private key
|
||||
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(), // new tag for private key
|
||||
body('salt').exists().isString().trim().notEmpty(), // part of new pwd
|
||||
body('verifier').exists().isString().trim().notEmpty(), // part of new pwd
|
||||
validateRequest,
|
||||
passwordController.changePassword
|
||||
);
|
||||
@ -34,7 +37,7 @@ router.post(
|
||||
router.post(
|
||||
'/email/password-reset',
|
||||
passwordLimiter,
|
||||
body('email').exists().trim().notEmpty(),
|
||||
body('email').exists().isString().trim().notEmpty().isEmail(),
|
||||
validateRequest,
|
||||
passwordController.emailPasswordReset
|
||||
);
|
||||
@ -42,8 +45,8 @@ router.post(
|
||||
router.post(
|
||||
'/email/password-reset-verify',
|
||||
passwordLimiter,
|
||||
body('email').exists().trim().notEmpty().isEmail(),
|
||||
body('code').exists().trim().notEmpty(),
|
||||
body('email').exists().isString().trim().notEmpty().isEmail(),
|
||||
body('code').exists().isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
passwordController.emailPasswordResetVerify
|
||||
);
|
||||
@ -61,12 +64,12 @@ router.post(
|
||||
requireAuth({
|
||||
acceptedAuthModes: ['jwt']
|
||||
}),
|
||||
body('clientProof').exists().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().trim().notEmpty(), // (backup) private key encrypted under a strong key
|
||||
body('iv').exists().trim().notEmpty(), // new iv for (backup) private key
|
||||
body('tag').exists().trim().notEmpty(), // new tag for (backup) private key
|
||||
body('salt').exists().trim().notEmpty(), // salt generated from strong key
|
||||
body('verifier').exists().trim().notEmpty(), // salt generated from strong key
|
||||
body('clientProof').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // (backup) private key encrypted under a strong key
|
||||
body('iv').exists().isString().trim().notEmpty(), // new iv for (backup) private key
|
||||
body('tag').exists().isString().trim().notEmpty(), // new tag for (backup) private key
|
||||
body('salt').exists().isString().trim().notEmpty(), // salt generated from strong key
|
||||
body('verifier').exists().isString().trim().notEmpty(), // salt generated from strong key
|
||||
validateRequest,
|
||||
passwordController.createBackupPrivateKey
|
||||
);
|
||||
@ -74,11 +77,14 @@ router.post(
|
||||
router.post(
|
||||
'/password-reset',
|
||||
requireSignupAuth,
|
||||
body('encryptedPrivateKey').exists().trim().notEmpty(), // private key encrypted under new pwd
|
||||
body('iv').exists().trim().notEmpty(), // new iv for private key
|
||||
body('tag').exists().trim().notEmpty(), // new tag for private key
|
||||
body('salt').exists().trim().notEmpty(), // part of new pwd
|
||||
body('verifier').exists().trim().notEmpty(), // part of new pwd
|
||||
body('protectedKey').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyIV').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyTag').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().isString().trim().notEmpty(), // private key encrypted under new pwd
|
||||
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(), // new iv for private key
|
||||
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(), // new tag for private key
|
||||
body('salt').exists().isString().trim().notEmpty(), // part of new pwd
|
||||
body('verifier').exists().isString().trim().notEmpty(), // part of new pwd
|
||||
validateRequest,
|
||||
passwordController.resetPassword
|
||||
);
|
||||
|
@ -1,7 +1,7 @@
|
||||
import express from 'express';
|
||||
const router = express.Router();
|
||||
import { body } from 'express-validator';
|
||||
import { requireSignupAuth, validateRequest } from '../../middleware';
|
||||
import { validateRequest } from '../../middleware';
|
||||
import { signupController } from '../../controllers/v1';
|
||||
import { authLimiter } from '../../helpers/rateLimiter';
|
||||
|
||||
@ -22,39 +22,4 @@ router.post(
|
||||
signupController.verifyEmailSignup
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/complete-account/signup',
|
||||
authLimiter,
|
||||
requireSignupAuth,
|
||||
body('email').exists().trim().notEmpty().isEmail(),
|
||||
body('firstName').exists().trim().notEmpty(),
|
||||
body('lastName').exists().trim().notEmpty(),
|
||||
body('publicKey').exists().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().trim().notEmpty(),
|
||||
body('iv').exists().trim().notEmpty(),
|
||||
body('tag').exists().trim().notEmpty(),
|
||||
body('salt').exists().trim().notEmpty(),
|
||||
body('verifier').exists().trim().notEmpty(),
|
||||
body('organizationName').exists().trim().notEmpty(),
|
||||
validateRequest,
|
||||
signupController.completeAccountSignup
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/complete-account/invite',
|
||||
authLimiter,
|
||||
requireSignupAuth,
|
||||
body('email').exists().trim().notEmpty().isEmail(),
|
||||
body('firstName').exists().trim().notEmpty(),
|
||||
body('lastName').exists().trim().notEmpty(),
|
||||
body('publicKey').exists().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().trim().notEmpty(),
|
||||
body('iv').exists().trim().notEmpty(),
|
||||
body('tag').exists().trim().notEmpty(),
|
||||
body('salt').exists().trim().notEmpty(),
|
||||
body('verifier').exists().trim().notEmpty(),
|
||||
validateRequest,
|
||||
signupController.completeAccountInvite
|
||||
);
|
||||
|
||||
export default router;
|
||||
|
44
backend/src/routes/v2/auth.ts
Normal file
44
backend/src/routes/v2/auth.ts
Normal file
@ -0,0 +1,44 @@
|
||||
import express from 'express';
|
||||
const router = express.Router();
|
||||
import { body } from 'express-validator';
|
||||
import { requireMfaAuth, validateRequest } from '../../middleware';
|
||||
import { authController } from '../../controllers/v2';
|
||||
import { authLimiter } from '../../helpers/rateLimiter';
|
||||
|
||||
router.post(
|
||||
'/login1',
|
||||
authLimiter,
|
||||
body('email').isString().trim().notEmpty(),
|
||||
body('clientPublicKey').isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
authController.login1
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/login2',
|
||||
authLimiter,
|
||||
body('email').isString().trim().notEmpty(),
|
||||
body('clientProof').isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
authController.login2
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/mfa/send',
|
||||
authLimiter,
|
||||
body('email').isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
authController.sendMfaToken
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/mfa/verify',
|
||||
authLimiter,
|
||||
requireMfaAuth,
|
||||
body('email').isString().trim().notEmpty(),
|
||||
body('mfaToken').isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
authController.verifyMfaToken
|
||||
);
|
||||
|
||||
export default router;
|
@ -1,3 +1,5 @@
|
||||
import auth from './auth';
|
||||
import signup from './signup';
|
||||
import users from './users';
|
||||
import organizations from './organizations';
|
||||
import workspace from './workspace';
|
||||
@ -9,6 +11,8 @@ import environment from "./environment"
|
||||
import tags from "./tags"
|
||||
|
||||
export {
|
||||
auth,
|
||||
signup,
|
||||
users,
|
||||
organizations,
|
||||
workspace,
|
||||
|
@ -6,14 +6,52 @@ import {
|
||||
requireSecretsAuth,
|
||||
validateRequest
|
||||
} from '../../middleware';
|
||||
import { query, check, body } from 'express-validator';
|
||||
import { query, body } from 'express-validator';
|
||||
import { secretsController } from '../../controllers/v2';
|
||||
import { validateSecrets } from '../../helpers/secret';
|
||||
import {
|
||||
ADMIN,
|
||||
MEMBER,
|
||||
SECRET_PERSONAL,
|
||||
SECRET_SHARED
|
||||
} from '../../variables';
|
||||
import {
|
||||
BatchSecretRequest
|
||||
} from '../../types/secret';
|
||||
|
||||
router.post(
|
||||
'/batch',
|
||||
requireAuth({
|
||||
acceptedAuthModes: ['jwt', 'apiKey']
|
||||
}),
|
||||
requireWorkspaceAuth({
|
||||
acceptedRoles: [ADMIN, MEMBER],
|
||||
location: 'body'
|
||||
}),
|
||||
body('workspaceId').exists().isString().trim(),
|
||||
body('environment').exists().isString().trim(),
|
||||
body('requests')
|
||||
.exists()
|
||||
.custom(async (requests: BatchSecretRequest[], { req }) => {
|
||||
if (Array.isArray(requests)) {
|
||||
const secretIds = requests
|
||||
.map((request) => request.secret._id)
|
||||
.filter((secretId) => secretId !== undefined)
|
||||
|
||||
if (secretIds.length > 0) {
|
||||
const relevantSecrets = await validateSecrets({
|
||||
userId: req.user._id.toString(),
|
||||
secretIds
|
||||
});
|
||||
|
||||
req.secrets = relevantSecrets;
|
||||
}
|
||||
}
|
||||
return true;
|
||||
}),
|
||||
validateRequest,
|
||||
secretsController.batchSecrets
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/',
|
||||
|
49
backend/src/routes/v2/signup.ts
Normal file
49
backend/src/routes/v2/signup.ts
Normal file
@ -0,0 +1,49 @@
|
||||
import express from 'express';
|
||||
const router = express.Router();
|
||||
import { body } from 'express-validator';
|
||||
import { requireSignupAuth, validateRequest } from '../../middleware';
|
||||
import { signupController } from '../../controllers/v2';
|
||||
import { authLimiter } from '../../helpers/rateLimiter';
|
||||
|
||||
router.post(
|
||||
'/complete-account/signup',
|
||||
authLimiter,
|
||||
requireSignupAuth,
|
||||
body('email').exists().isString().trim().notEmpty().isEmail(),
|
||||
body('firstName').exists().isString().trim().notEmpty(),
|
||||
body('lastName').exists().isString().trim().notEmpty(),
|
||||
body('protectedKey').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyIV').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyTag').exists().isString().trim().notEmpty(),
|
||||
body('publicKey').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(),
|
||||
body('salt').exists().isString().trim().notEmpty(),
|
||||
body('verifier').exists().isString().trim().notEmpty(),
|
||||
body('organizationName').exists().isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
signupController.completeAccountSignup
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/complete-account/invite',
|
||||
authLimiter,
|
||||
requireSignupAuth,
|
||||
body('email').exists().isString().trim().notEmpty().isEmail(),
|
||||
body('firstName').exists().isString().trim().notEmpty(),
|
||||
body('lastName').exists().isString().trim().notEmpty(),
|
||||
body('protectedKey').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyIV').exists().isString().trim().notEmpty(),
|
||||
body('protectedKeyTag').exists().isString().trim().notEmpty(),
|
||||
body('publicKey').exists().trim().notEmpty(),
|
||||
body('encryptedPrivateKey').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKeyIV').exists().isString().trim().notEmpty(),
|
||||
body('encryptedPrivateKeyTag').exists().isString().trim().notEmpty(),
|
||||
body('salt').exists().isString().trim().notEmpty(),
|
||||
body('verifier').exists().isString().trim().notEmpty(),
|
||||
validateRequest,
|
||||
signupController.completeAccountInvite
|
||||
);
|
||||
|
||||
export default router;
|
@ -1,8 +1,10 @@
|
||||
import express from 'express';
|
||||
const router = express.Router();
|
||||
import {
|
||||
requireAuth
|
||||
requireAuth,
|
||||
validateRequest
|
||||
} from '../../middleware';
|
||||
import { body } from 'express-validator';
|
||||
import { usersController } from '../../controllers/v2';
|
||||
|
||||
router.get(
|
||||
@ -13,6 +15,16 @@ router.get(
|
||||
usersController.getMe
|
||||
);
|
||||
|
||||
router.patch(
|
||||
'/me/mfa',
|
||||
requireAuth({
|
||||
acceptedAuthModes: ['jwt', 'apiKey']
|
||||
}),
|
||||
body('isMfaEnabled').exists().isBoolean(),
|
||||
validateRequest,
|
||||
usersController.updateMyMfaEnabled
|
||||
);
|
||||
|
||||
router.get(
|
||||
'/me/organizations',
|
||||
requireAuth({
|
||||
|
@ -118,7 +118,6 @@ router.delete( // TODO - rewire dashboard to this route
|
||||
workspaceController.deleteWorkspaceMembership
|
||||
);
|
||||
|
||||
|
||||
router.patch(
|
||||
'/:workspaceId/auto-capitalization',
|
||||
requireAuth({
|
||||
|
@ -1,5 +1,3 @@
|
||||
import { Bot, IBot } from '../models';
|
||||
import * as Sentry from '@sentry/node';
|
||||
import { handleEventHelper } from '../helpers/event';
|
||||
|
||||
interface Event {
|
||||
|
69
backend/src/services/TokenService.ts
Normal file
69
backend/src/services/TokenService.ts
Normal file
@ -0,0 +1,69 @@
|
||||
import { Types } from 'mongoose';
|
||||
import { createTokenHelper, validateTokenHelper } from '../helpers/token';
|
||||
|
||||
/**
|
||||
* Class to handle token actions
|
||||
* TODO: elaborate more on this class
|
||||
*/
|
||||
class TokenService {
|
||||
/**
|
||||
* Create a token [token] for type [type] with associated details
|
||||
* @param {Object} obj
|
||||
* @param {String} obj.type - type or context of token (e.g. emailConfirmation)
|
||||
* @param {String} obj.email - email associated with the token
|
||||
* @param {String} obj.phoneNumber - phone number associated with the token
|
||||
* @param {Types.ObjectId} obj.organizationId - id of organization associated with the token
|
||||
* @returns {String} token - the token to create
|
||||
*/
|
||||
static async createToken({
|
||||
type,
|
||||
email,
|
||||
phoneNumber,
|
||||
organizationId
|
||||
}: {
|
||||
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
|
||||
email?: string;
|
||||
phoneNumber?: string;
|
||||
organizationId?: Types.ObjectId;
|
||||
}) {
|
||||
return await createTokenHelper({
|
||||
type,
|
||||
email,
|
||||
phoneNumber,
|
||||
organizationId
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Validate whether or not token [token] and its associated details match a token in the DB
|
||||
* @param {Object} obj
|
||||
* @param {String} obj.type - type or context of token (e.g. emailConfirmation)
|
||||
* @param {String} obj.email - email associated with the token
|
||||
* @param {String} obj.phoneNumber - phone number associated with the token
|
||||
* @param {Types.ObjectId} obj.organizationId - id of organization associated with the token
|
||||
* @param {String} obj.token - the token to validate
|
||||
*/
|
||||
static async validateToken({
|
||||
type,
|
||||
email,
|
||||
phoneNumber,
|
||||
organizationId,
|
||||
token
|
||||
}: {
|
||||
type: 'emailConfirmation' | 'emailMfa' | 'organizationInvitation' | 'passwordReset';
|
||||
email?: string;
|
||||
phoneNumber?: string;
|
||||
organizationId?: Types.ObjectId;
|
||||
token: string;
|
||||
}) {
|
||||
return await validateTokenHelper({
|
||||
type,
|
||||
email,
|
||||
phoneNumber,
|
||||
organizationId,
|
||||
token
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
export default TokenService;
|
@ -3,11 +3,13 @@ import postHogClient from './PostHogClient';
|
||||
import BotService from './BotService';
|
||||
import EventService from './EventService';
|
||||
import IntegrationService from './IntegrationService';
|
||||
import TokenService from './TokenService';
|
||||
|
||||
export {
|
||||
DatabaseService,
|
||||
postHogClient,
|
||||
BotService,
|
||||
EventService,
|
||||
IntegrationService
|
||||
IntegrationService,
|
||||
TokenService
|
||||
}
|
@ -1,6 +1,17 @@
|
||||
import nodemailer from 'nodemailer';
|
||||
import { SMTP_HOST, SMTP_PORT, SMTP_USERNAME, SMTP_PASSWORD, SMTP_SECURE } from '../config';
|
||||
import { SMTP_HOST_SENDGRID, SMTP_HOST_MAILGUN } from '../variables';
|
||||
import {
|
||||
SMTP_HOST,
|
||||
SMTP_PORT,
|
||||
SMTP_USERNAME,
|
||||
SMTP_PASSWORD,
|
||||
SMTP_SECURE
|
||||
} from '../config';
|
||||
import {
|
||||
SMTP_HOST_SENDGRID,
|
||||
SMTP_HOST_MAILGUN,
|
||||
SMTP_HOST_SOCKETLABS,
|
||||
SMTP_HOST_ZOHOMAIL
|
||||
} from '../variables';
|
||||
import SMTPConnection from 'nodemailer/lib/smtp-connection';
|
||||
import * as Sentry from '@sentry/node';
|
||||
|
||||
@ -27,6 +38,18 @@ if (SMTP_SECURE) {
|
||||
ciphers: 'TLSv1.2'
|
||||
}
|
||||
break;
|
||||
case SMTP_HOST_SOCKETLABS:
|
||||
mailOpts.requireTLS = true;
|
||||
mailOpts.tls = {
|
||||
ciphers: 'TLSv1.2'
|
||||
}
|
||||
break;
|
||||
case SMTP_HOST_ZOHOMAIL:
|
||||
mailOpts.requireTLS = true;
|
||||
mailOpts.tls = {
|
||||
ciphers: 'TLSv1.2'
|
||||
}
|
||||
break;
|
||||
default:
|
||||
if (SMTP_HOST.includes('amazonaws.com')) {
|
||||
mailOpts.tls = {
|
||||
|
19
backend/src/templates/emailMfa.handlebars
Normal file
19
backend/src/templates/emailMfa.handlebars
Normal file
@ -0,0 +1,19 @@
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta http-equiv="x-ua-compatible" content="ie=edge">
|
||||
<title>MFA Code</title>
|
||||
</head>
|
||||
|
||||
<body>
|
||||
<h2>Infisical</h2>
|
||||
<h2>Sign in attempt requires further verification</h2>
|
||||
<p>Your MFA code is below — enter it where you started signing in to Infisical.</p>
|
||||
<h2>{{code}}</h2>
|
||||
<p>The MFA code will be valid for 2 minutes.</p>
|
||||
<p>Not you? Contact Infisical or your administrator immediately.</p>
|
||||
</body>
|
||||
|
||||
</html>
|
@ -1,14 +1,17 @@
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta http-equiv="x-ua-compatible" content="ie=edge">
|
||||
<title>Email Verification</title>
|
||||
<title>Code</title>
|
||||
</head>
|
||||
|
||||
<body>
|
||||
<h2>Confirm your email address</h2>
|
||||
<p>Your confirmation code is below — enter it in the browser window where you've started signing up for Infisical.</p>
|
||||
<h1>{{code}}</h1>
|
||||
<p>Questions about setting up Infisical? Email us at support@infisical.com</p>
|
||||
</body>
|
||||
|
||||
</html>
|
19
backend/src/templates/newDevice.handlebars
Normal file
19
backend/src/templates/newDevice.handlebars
Normal file
@ -0,0 +1,19 @@
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta http-equiv="x-ua-compatible" content="ie=edge">
|
||||
<title>Successful login for {{email}} from new device</title>
|
||||
</head>
|
||||
|
||||
<body>
|
||||
<h2>Infisical</h2>
|
||||
<p>We're verifying a recent login for {{email}}:</p>
|
||||
<p><strong>Timestamp</strong>: {{timestamp}}</p>
|
||||
<p><strong>IP address</strong>: {{ip}}</p>
|
||||
<p><strong>User agent</strong>: {{userAgent}}</p>
|
||||
<p>If you believe that this login is suspicious, please contact Infisical or reset your password immediately.</p>
|
||||
</body>
|
||||
|
||||
</html>
|
38
backend/src/types/secret/index.d.ts
vendored
38
backend/src/types/secret/index.d.ts
vendored
@ -1,5 +1,7 @@
|
||||
import { Types } from 'mongoose';
|
||||
import { Assign, Omit } from 'utility-types';
|
||||
import { ISecret } from '../../models';
|
||||
import { mongo } from 'mongoose';
|
||||
|
||||
// Everything is required, except the omitted types
|
||||
export type CreateSecretRequestBody = Omit<ISecret, "user" | "version" | "environment" | "workspace">;
|
||||
@ -12,3 +14,39 @@ export type SanitizedSecretModify = Partial<Omit<ISecret, "user" | "version" | "
|
||||
|
||||
// Everything is required, except the omitted types
|
||||
export type SanitizedSecretForCreate = Omit<ISecret, "version" | "_id">;
|
||||
|
||||
export interface BatchSecretRequest {
|
||||
id: string;
|
||||
method: 'POST' | 'PATCH' | 'DELETE';
|
||||
secret: Secret;
|
||||
}
|
||||
|
||||
export interface BatchSecret {
|
||||
_id: string;
|
||||
type: 'shared' | 'personal',
|
||||
secretKeyCiphertext: string;
|
||||
secretKeyIV: string;
|
||||
secretKeyTag: string;
|
||||
secretValueCiphertext: string;
|
||||
secretValueIV: string;
|
||||
secretValueTag: string;
|
||||
secretCommentCiphertext: string;
|
||||
secretCommentIV: string;
|
||||
secretCommentTag: string;
|
||||
tags: string[];
|
||||
}
|
||||
|
||||
export interface BatchSecret {
|
||||
_id: string;
|
||||
type: 'shared' | 'personal',
|
||||
secretKeyCiphertext: string;
|
||||
secretKeyIV: string;
|
||||
secretKeyTag: string;
|
||||
secretValueCiphertext: string;
|
||||
secretValueIV: string;
|
||||
secretValueTag: string;
|
||||
secretCommentCiphertext: string;
|
||||
secretCommentIV: string;
|
||||
secretCommentTag: string;
|
||||
tags: string[];
|
||||
}
|
@ -13,9 +13,11 @@ import {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
INTEGRATION_SET,
|
||||
INTEGRATION_OAUTH2,
|
||||
INTEGRATION_AZURE_TOKEN_URL,
|
||||
@ -23,12 +25,15 @@ import {
|
||||
INTEGRATION_VERCEL_TOKEN_URL,
|
||||
INTEGRATION_NETLIFY_TOKEN_URL,
|
||||
INTEGRATION_GITHUB_TOKEN_URL,
|
||||
INTEGRATION_GITLAB_TOKEN_URL,
|
||||
INTEGRATION_HEROKU_API_URL,
|
||||
INTEGRATION_GITLAB_API_URL,
|
||||
INTEGRATION_VERCEL_API_URL,
|
||||
INTEGRATION_NETLIFY_API_URL,
|
||||
INTEGRATION_RENDER_API_URL,
|
||||
INTEGRATION_FLYIO_API_URL,
|
||||
INTEGRATION_CIRCLECI_API_URL,
|
||||
INTEGRATION_TRAVISCI_API_URL,
|
||||
INTEGRATION_OPTIONS,
|
||||
} from "./integration";
|
||||
import { OWNER, ADMIN, MEMBER, INVITED, ACCEPTED } from "./organization";
|
||||
@ -40,10 +45,24 @@ import {
|
||||
ACTION_ADD_SECRETS,
|
||||
ACTION_UPDATE_SECRETS,
|
||||
ACTION_DELETE_SECRETS,
|
||||
ACTION_READ_SECRETS,
|
||||
} from "./action";
|
||||
import { SMTP_HOST_SENDGRID, SMTP_HOST_MAILGUN } from "./smtp";
|
||||
import { PLAN_STARTER, PLAN_PRO } from "./stripe";
|
||||
ACTION_READ_SECRETS
|
||||
} from './action';
|
||||
import {
|
||||
SMTP_HOST_SENDGRID,
|
||||
SMTP_HOST_MAILGUN,
|
||||
SMTP_HOST_SOCKETLABS,
|
||||
SMTP_HOST_ZOHOMAIL
|
||||
} from './smtp';
|
||||
import { PLAN_STARTER, PLAN_PRO } from './stripe';
|
||||
import {
|
||||
MFA_METHOD_EMAIL
|
||||
} from './user';
|
||||
import {
|
||||
TOKEN_EMAIL_CONFIRMATION,
|
||||
TOKEN_EMAIL_MFA,
|
||||
TOKEN_EMAIL_ORG_INVITATION,
|
||||
TOKEN_EMAIL_PASSWORD_RESET
|
||||
} from './token';
|
||||
|
||||
export {
|
||||
OWNER,
|
||||
@ -65,9 +84,11 @@ export {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
INTEGRATION_SET,
|
||||
INTEGRATION_OAUTH2,
|
||||
INTEGRATION_AZURE_TOKEN_URL,
|
||||
@ -75,12 +96,15 @@ export {
|
||||
INTEGRATION_VERCEL_TOKEN_URL,
|
||||
INTEGRATION_NETLIFY_TOKEN_URL,
|
||||
INTEGRATION_GITHUB_TOKEN_URL,
|
||||
INTEGRATION_GITLAB_TOKEN_URL,
|
||||
INTEGRATION_HEROKU_API_URL,
|
||||
INTEGRATION_GITLAB_API_URL,
|
||||
INTEGRATION_VERCEL_API_URL,
|
||||
INTEGRATION_NETLIFY_API_URL,
|
||||
INTEGRATION_RENDER_API_URL,
|
||||
INTEGRATION_FLYIO_API_URL,
|
||||
INTEGRATION_CIRCLECI_API_URL,
|
||||
INTEGRATION_TRAVISCI_API_URL,
|
||||
EVENT_PUSH_SECRETS,
|
||||
EVENT_PULL_SECRETS,
|
||||
ACTION_LOGIN,
|
||||
@ -92,6 +116,13 @@ export {
|
||||
INTEGRATION_OPTIONS,
|
||||
SMTP_HOST_SENDGRID,
|
||||
SMTP_HOST_MAILGUN,
|
||||
SMTP_HOST_SOCKETLABS,
|
||||
SMTP_HOST_ZOHOMAIL,
|
||||
PLAN_STARTER,
|
||||
PLAN_PRO,
|
||||
MFA_METHOD_EMAIL,
|
||||
TOKEN_EMAIL_CONFIRMATION,
|
||||
TOKEN_EMAIL_MFA,
|
||||
TOKEN_EMAIL_ORG_INVITATION,
|
||||
TOKEN_EMAIL_PASSWORD_RESET
|
||||
};
|
||||
|
@ -1,5 +1,6 @@
|
||||
import {
|
||||
CLIENT_ID_AZURE,
|
||||
CLIENT_ID_GITLAB,
|
||||
TENANT_ID_AZURE
|
||||
} from '../config';
|
||||
import {
|
||||
@ -7,6 +8,7 @@ import {
|
||||
CLIENT_ID_NETLIFY,
|
||||
CLIENT_ID_GITHUB,
|
||||
CLIENT_SLUG_VERCEL,
|
||||
CLIENT_SECRET_GITLAB,
|
||||
} from "../config";
|
||||
|
||||
// integrations
|
||||
@ -17,18 +19,22 @@ const INTEGRATION_HEROKU = "heroku";
|
||||
const INTEGRATION_VERCEL = "vercel";
|
||||
const INTEGRATION_NETLIFY = "netlify";
|
||||
const INTEGRATION_GITHUB = "github";
|
||||
const INTEGRATION_GITLAB = "gitlab";
|
||||
const INTEGRATION_RENDER = "render";
|
||||
const INTEGRATION_FLYIO = "flyio";
|
||||
const INTEGRATION_CIRCLECI = "circleci";
|
||||
const INTEGRATION_TRAVISCI = "travisci";
|
||||
const INTEGRATION_SET = new Set([
|
||||
INTEGRATION_AZURE_KEY_VAULT,
|
||||
INTEGRATION_HEROKU,
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
]);
|
||||
|
||||
// integration types
|
||||
@ -42,14 +48,18 @@ const INTEGRATION_VERCEL_TOKEN_URL =
|
||||
const INTEGRATION_NETLIFY_TOKEN_URL = "https://api.netlify.com/oauth/token";
|
||||
const INTEGRATION_GITHUB_TOKEN_URL =
|
||||
"https://github.com/login/oauth/access_token";
|
||||
const INTEGRATION_GITLAB_TOKEN_URL = "https://gitlab.com/oauth/token";
|
||||
|
||||
|
||||
// integration apps endpoints
|
||||
const INTEGRATION_HEROKU_API_URL = "https://api.heroku.com";
|
||||
const INTEGRATION_GITLAB_API_URL = "https://gitlab.com/api";
|
||||
const INTEGRATION_VERCEL_API_URL = "https://api.vercel.com";
|
||||
const INTEGRATION_NETLIFY_API_URL = "https://api.netlify.com";
|
||||
const INTEGRATION_RENDER_API_URL = "https://api.render.com";
|
||||
const INTEGRATION_FLYIO_API_URL = "https://api.fly.io/graphql";
|
||||
const INTEGRATION_CIRCLECI_API_URL = "https://circleci.com/api";
|
||||
const INTEGRATION_TRAVISCI_API_URL = "https://api.travis-ci.com";
|
||||
|
||||
const INTEGRATION_OPTIONS = [
|
||||
{
|
||||
@ -89,6 +99,15 @@ const INTEGRATION_OPTIONS = [
|
||||
clientId: CLIENT_ID_GITHUB,
|
||||
docsLink: ''
|
||||
},
|
||||
{
|
||||
name: 'GitLab',
|
||||
slug: 'gitlab',
|
||||
image: 'GitLab.png',
|
||||
isAvailable: true,
|
||||
type: 'oauth',
|
||||
clientId: CLIENT_ID_GITLAB,
|
||||
docsLink: ''
|
||||
},
|
||||
{
|
||||
name: 'Render',
|
||||
slug: 'render',
|
||||
@ -134,6 +153,15 @@ const INTEGRATION_OPTIONS = [
|
||||
clientId: '',
|
||||
docsLink: ''
|
||||
},
|
||||
{
|
||||
name: 'Travis CI',
|
||||
slug: 'travisci',
|
||||
image: 'Travis CI.png',
|
||||
isAvailable: true,
|
||||
type: 'pat',
|
||||
clientId: '',
|
||||
docsLink: ''
|
||||
},
|
||||
{
|
||||
name: 'Azure Key Vault',
|
||||
slug: 'azure-key-vault',
|
||||
@ -152,15 +180,6 @@ const INTEGRATION_OPTIONS = [
|
||||
type: '',
|
||||
clientId: '',
|
||||
docsLink: ''
|
||||
},
|
||||
{
|
||||
name: 'Travis CI',
|
||||
slug: 'travisci',
|
||||
image: 'Travis CI.png',
|
||||
isAvailable: false,
|
||||
type: '',
|
||||
clientId: '',
|
||||
docsLink: ''
|
||||
}
|
||||
]
|
||||
|
||||
@ -172,9 +191,11 @@ export {
|
||||
INTEGRATION_VERCEL,
|
||||
INTEGRATION_NETLIFY,
|
||||
INTEGRATION_GITHUB,
|
||||
INTEGRATION_GITLAB,
|
||||
INTEGRATION_RENDER,
|
||||
INTEGRATION_FLYIO,
|
||||
INTEGRATION_CIRCLECI,
|
||||
INTEGRATION_TRAVISCI,
|
||||
INTEGRATION_SET,
|
||||
INTEGRATION_OAUTH2,
|
||||
INTEGRATION_AZURE_TOKEN_URL,
|
||||
@ -182,11 +203,14 @@ export {
|
||||
INTEGRATION_VERCEL_TOKEN_URL,
|
||||
INTEGRATION_NETLIFY_TOKEN_URL,
|
||||
INTEGRATION_GITHUB_TOKEN_URL,
|
||||
INTEGRATION_GITLAB_API_URL,
|
||||
INTEGRATION_HEROKU_API_URL,
|
||||
INTEGRATION_GITLAB_TOKEN_URL,
|
||||
INTEGRATION_VERCEL_API_URL,
|
||||
INTEGRATION_NETLIFY_API_URL,
|
||||
INTEGRATION_RENDER_API_URL,
|
||||
INTEGRATION_FLYIO_API_URL,
|
||||
INTEGRATION_CIRCLECI_API_URL,
|
||||
INTEGRATION_TRAVISCI_API_URL,
|
||||
INTEGRATION_OPTIONS,
|
||||
};
|
||||
|
@ -1,7 +1,11 @@
|
||||
const SMTP_HOST_SENDGRID = 'smtp.sendgrid.net';
|
||||
const SMTP_HOST_MAILGUN = 'smtp.mailgun.org';
|
||||
const SMTP_HOST_SOCKETLABS = 'smtp.socketlabs.com';
|
||||
const SMTP_HOST_ZOHOMAIL = 'smtp.zoho.com';
|
||||
|
||||
export {
|
||||
SMTP_HOST_SENDGRID,
|
||||
SMTP_HOST_MAILGUN
|
||||
SMTP_HOST_MAILGUN,
|
||||
SMTP_HOST_SOCKETLABS,
|
||||
SMTP_HOST_ZOHOMAIL
|
||||
}
|
11
backend/src/variables/token.ts
Normal file
11
backend/src/variables/token.ts
Normal file
@ -0,0 +1,11 @@
|
||||
const TOKEN_EMAIL_CONFIRMATION = 'emailConfirmation';
|
||||
const TOKEN_EMAIL_MFA = 'emailMfa';
|
||||
const TOKEN_EMAIL_ORG_INVITATION = 'organizationInvitation';
|
||||
const TOKEN_EMAIL_PASSWORD_RESET = 'passwordReset';
|
||||
|
||||
export {
|
||||
TOKEN_EMAIL_CONFIRMATION,
|
||||
TOKEN_EMAIL_MFA,
|
||||
TOKEN_EMAIL_ORG_INVITATION,
|
||||
TOKEN_EMAIL_PASSWORD_RESET
|
||||
}
|
5
backend/src/variables/user.ts
Normal file
5
backend/src/variables/user.ts
Normal file
@ -0,0 +1,5 @@
|
||||
const MFA_METHOD_EMAIL = 'email';
|
||||
|
||||
export {
|
||||
MFA_METHOD_EMAIL
|
||||
}
|
@ -7,8 +7,8 @@ require (
|
||||
github.com/muesli/mango-cobra v1.2.0
|
||||
github.com/muesli/roff v0.1.0
|
||||
github.com/spf13/cobra v1.6.1
|
||||
golang.org/x/crypto v0.3.0
|
||||
golang.org/x/term v0.3.0
|
||||
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d
|
||||
golang.org/x/term v0.5.0
|
||||
)
|
||||
|
||||
require (
|
||||
@ -31,8 +31,8 @@ require (
|
||||
github.com/oklog/ulid v1.3.1 // indirect
|
||||
github.com/rivo/uniseg v0.2.0 // indirect
|
||||
go.mongodb.org/mongo-driver v1.10.0 // indirect
|
||||
golang.org/x/net v0.2.0 // indirect
|
||||
golang.org/x/sys v0.3.0 // indirect
|
||||
golang.org/x/net v0.7.0 // indirect
|
||||
golang.org/x/sys v0.5.0 // indirect
|
||||
)
|
||||
|
||||
require (
|
||||
|
16
cli/go.sum
16
cli/go.sum
@ -103,13 +103,13 @@ github.com/xdg-go/stringprep v1.0.3/go.mod h1:W3f5j4i+9rC0kuIEJL0ky1VpHXQU3ocBgk
|
||||
github.com/youmark/pkcs8 v0.0.0-20181117223130-1be2e3e5546d/go.mod h1:rHwXgn7JulP+udvsHwJoVG1YGAP6VLg4y9I5dyZdqmA=
|
||||
go.mongodb.org/mongo-driver v1.10.0 h1:UtV6N5k14upNp4LTduX0QCufG124fSu25Wz9tu94GLg=
|
||||
go.mongodb.org/mongo-driver v1.10.0/go.mod h1:wsihk0Kdgv8Kqu1Anit4sfK+22vSFbUrAVEYRhCXrA8=
|
||||
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d h1:sK3txAijHtOK88l68nt020reeT1ZdKLIYetKl95FzVY=
|
||||
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
|
||||
golang.org/x/crypto v0.3.0 h1:a06MkbcxBrEFc0w0QIZWXrH/9cCX6KJyWbBOIwAn+7A=
|
||||
golang.org/x/crypto v0.3.0/go.mod h1:hebNnKkNXi2UzZN1eVRvBB7co0a+JxK6XbPiWVs/3J4=
|
||||
golang.org/x/net v0.0.0-20211029224645-99673261e6eb/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
|
||||
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2 h1:CIJ76btIcR3eFI5EgSo6k1qKw9KJexJuRLI9G7Hp5wE=
|
||||
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
|
||||
golang.org/x/net v0.2.0 h1:sZfSu1wtKLGlWI4ZZayP0ck9Y73K1ynO6gqzTdBVdPU=
|
||||
golang.org/x/net v0.2.0/go.mod h1:KqCZLdyyvdV855qA2rE3GC2aiw5xGR5TEjj8smXukLY=
|
||||
golang.org/x/net v0.7.0 h1:rJrUqqhjsgNp7KqAIc25s9pZnjU7TUcSY7HcVZjdn1g=
|
||||
golang.org/x/net v0.7.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
|
||||
golang.org/x/sync v0.0.0-20210220032951-036812b2e83c/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
|
||||
golang.org/x/sys v0.0.0-20181122145206-62eef0e2fa9b/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
|
||||
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
@ -121,11 +121,11 @@ golang.org/x/sys v0.0.0-20210630005230-0f9fa26af87c/go.mod h1:oPkhp1MJrh7nUepCBc
|
||||
golang.org/x/sys v0.0.0-20210819135213-f52c844e1c1c/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.0.0-20220310020820-b874c991c1a5/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.3.0 h1:w8ZOecv6NaNa/zC8944JTU3vz4u6Lagfk4RPQxv92NQ=
|
||||
golang.org/x/sys v0.3.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.5.0 h1:MUK/U/4lj1t1oPg0HfuXDN/Z1wv31ZJ/YcPiGccS4DU=
|
||||
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
|
||||
golang.org/x/term v0.3.0 h1:qoo4akIqOcDME5bhc/NgxUdovd6BSS2uMsVjB56q1xI=
|
||||
golang.org/x/term v0.3.0/go.mod h1:q750SLmJuPmVoN1blW3UFBPREJfb1KmY3vwxfr+nFDA=
|
||||
golang.org/x/term v0.5.0 h1:n2a8QNdAb0sZNpU9R1ALUXBbY+w51fCQDN+7EdxNBsY=
|
||||
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
|
||||
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
|
||||
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
|
||||
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
|
||||
|
@ -128,6 +128,68 @@ func CallGetSecretsV2(httpClient *resty.Client, request GetEncryptedSecretsV2Req
|
||||
return secretsResponse, nil
|
||||
}
|
||||
|
||||
func CallLogin1V2(httpClient *resty.Client, request GetLoginOneV2Request) (GetLoginOneV2Response, error) {
|
||||
var loginOneV2Response GetLoginOneV2Response
|
||||
response, err := httpClient.
|
||||
R().
|
||||
SetResult(&loginOneV2Response).
|
||||
SetHeader("User-Agent", USER_AGENT).
|
||||
SetBody(request).
|
||||
Post(fmt.Sprintf("%v/v2/auth/login1", config.INFISICAL_URL))
|
||||
|
||||
if err != nil {
|
||||
return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V2: Unable to complete api request [err=%s]", err)
|
||||
}
|
||||
|
||||
if response.IsError() {
|
||||
return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V2: Unsuccessful response: [response=%s]", response)
|
||||
}
|
||||
|
||||
return loginOneV2Response, nil
|
||||
}
|
||||
|
||||
func CallVerifyMfaToken(httpClient *resty.Client, request VerifyMfaTokenRequest) (*VerifyMfaTokenResponse, *VerifyMfaTokenErrorResponse, error) {
|
||||
var verifyMfaTokenResponse VerifyMfaTokenResponse
|
||||
var responseError VerifyMfaTokenErrorResponse
|
||||
response, err := httpClient.
|
||||
R().
|
||||
SetResult(&verifyMfaTokenResponse).
|
||||
SetHeader("User-Agent", USER_AGENT).
|
||||
SetError(&responseError).
|
||||
SetBody(request).
|
||||
Post(fmt.Sprintf("%v/v2/auth/mfa/verify", config.INFISICAL_URL))
|
||||
|
||||
if err != nil {
|
||||
return nil, nil, fmt.Errorf("CallVerifyMfaToken: Unable to complete api request [err=%s]", err)
|
||||
}
|
||||
|
||||
if response.IsError() {
|
||||
return nil, &responseError, nil
|
||||
}
|
||||
|
||||
return &verifyMfaTokenResponse, nil, nil
|
||||
}
|
||||
|
||||
func CallLogin2V2(httpClient *resty.Client, request GetLoginTwoV2Request) (GetLoginTwoV2Response, error) {
|
||||
var loginTwoV2Response GetLoginTwoV2Response
|
||||
response, err := httpClient.
|
||||
R().
|
||||
SetResult(&loginTwoV2Response).
|
||||
SetHeader("User-Agent", USER_AGENT).
|
||||
SetBody(request).
|
||||
Post(fmt.Sprintf("%v/v2/auth/login2", config.INFISICAL_URL))
|
||||
|
||||
if err != nil {
|
||||
return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V2: Unable to complete api request [err=%s]", err)
|
||||
}
|
||||
|
||||
if response.IsError() {
|
||||
return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V2: Unsuccessful response: [response=%s]", response)
|
||||
}
|
||||
|
||||
return loginTwoV2Response, nil
|
||||
}
|
||||
|
||||
func CallGetAllWorkSpacesUserBelongsTo(httpClient *resty.Client) (GetWorkSpacesResponse, error) {
|
||||
var workSpacesResponse GetWorkSpacesResponse
|
||||
response, err := httpClient.
|
||||
|
@ -263,3 +263,63 @@ type GetAccessibleEnvironmentsResponse struct {
|
||||
IsWriteDenied bool `json:"isWriteDenied"`
|
||||
} `json:"accessibleEnvironments"`
|
||||
}
|
||||
|
||||
type GetLoginOneV2Request struct {
|
||||
Email string `json:"email"`
|
||||
ClientPublicKey string `json:"clientPublicKey"`
|
||||
}
|
||||
|
||||
type GetLoginOneV2Response struct {
|
||||
ServerPublicKey string `json:"serverPublicKey"`
|
||||
Salt string `json:"salt"`
|
||||
}
|
||||
|
||||
type GetLoginTwoV2Request struct {
|
||||
Email string `json:"email"`
|
||||
ClientProof string `json:"clientProof"`
|
||||
}
|
||||
|
||||
type GetLoginTwoV2Response struct {
|
||||
MfaEnabled bool `json:"mfaEnabled"`
|
||||
EncryptionVersion int `json:"encryptionVersion"`
|
||||
Token string `json:"token"`
|
||||
PublicKey string `json:"publicKey"`
|
||||
EncryptedPrivateKey string `json:"encryptedPrivateKey"`
|
||||
Iv string `json:"iv"`
|
||||
Tag string `json:"tag"`
|
||||
ProtectedKey string `json:"protectedKey"`
|
||||
ProtectedKeyIV string `json:"protectedKeyIV"`
|
||||
ProtectedKeyTag string `json:"protectedKeyTag"`
|
||||
}
|
||||
|
||||
type VerifyMfaTokenRequest struct {
|
||||
Email string `json:"email"`
|
||||
MFAToken string `json:"mfaToken"`
|
||||
}
|
||||
|
||||
type VerifyMfaTokenResponse struct {
|
||||
EncryptionVersion int `json:"encryptionVersion"`
|
||||
Token string `json:"token"`
|
||||
PublicKey string `json:"publicKey"`
|
||||
EncryptedPrivateKey string `json:"encryptedPrivateKey"`
|
||||
Iv string `json:"iv"`
|
||||
Tag string `json:"tag"`
|
||||
ProtectedKey string `json:"protectedKey"`
|
||||
ProtectedKeyIV string `json:"protectedKeyIV"`
|
||||
ProtectedKeyTag string `json:"protectedKeyTag"`
|
||||
}
|
||||
|
||||
type VerifyMfaTokenErrorResponse struct {
|
||||
Type string `json:"type"`
|
||||
Message string `json:"message"`
|
||||
Context struct {
|
||||
Code string `json:"code"`
|
||||
TriesLeft int `json:"triesLeft"`
|
||||
} `json:"context"`
|
||||
Level int `json:"level"`
|
||||
LevelName string `json:"level_name"`
|
||||
StatusCode int `json:"status_code"`
|
||||
DatetimeIso time.Time `json:"datetime_iso"`
|
||||
Application string `json:"application"`
|
||||
Extra []interface{} `json:"extra"`
|
||||
}
|
||||
|
49
cli/packages/cmd/cmd_test.go
Normal file
49
cli/packages/cmd/cmd_test.go
Normal file
@ -0,0 +1,49 @@
|
||||
package cmd
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/Infisical/infisical-merge/packages/models"
|
||||
)
|
||||
|
||||
func TestFilterReservedEnvVars(t *testing.T) {
|
||||
|
||||
// some test env vars.
|
||||
// HOME and PATH are reserved key words and should be filtered out
|
||||
// XDG_SESSION_ID and LC_CTYPE are reserved key word prefixes and should be filtered out
|
||||
// The filter function only checks the keys of the env map, so we dont need to set any values
|
||||
env := map[string]models.SingleEnvironmentVariable{
|
||||
"test": {},
|
||||
"test2": {},
|
||||
"HOME": {},
|
||||
"PATH": {},
|
||||
"XDG_SESSION_ID": {},
|
||||
"LC_CTYPE": {},
|
||||
}
|
||||
|
||||
// check to see if there are any reserved key words in secrets to inject
|
||||
filterReservedEnvVars(env)
|
||||
|
||||
if len(env) != 2 {
|
||||
t.Errorf("Expected 2 secrets to be returned, got %d", len(env))
|
||||
}
|
||||
if _, ok := env["test"]; !ok {
|
||||
t.Errorf("Expected test to be returned")
|
||||
}
|
||||
if _, ok := env["test2"]; !ok {
|
||||
t.Errorf("Expected test2 to be returned")
|
||||
}
|
||||
if _, ok := env["HOME"]; ok {
|
||||
t.Errorf("Expected HOME to be filtered out")
|
||||
}
|
||||
if _, ok := env["PATH"]; ok {
|
||||
t.Errorf("Expected PATH to be filtered out")
|
||||
}
|
||||
if _, ok := env["XDG_SESSION_ID"]; ok {
|
||||
t.Errorf("Expected XDG_SESSION_ID to be filtered out")
|
||||
}
|
||||
if _, ok := env["LC_CTYPE"]; ok {
|
||||
t.Errorf("Expected LC_CTYPE to be filtered out")
|
||||
}
|
||||
|
||||
}
|
@ -36,12 +36,20 @@ var exportCmd = &cobra.Command{
|
||||
// util.RequireLocalWorkspaceFile()
|
||||
},
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
envName, err := cmd.Flags().GetString("env")
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
shouldExpandSecrets, err := cmd.Flags().GetBool("expand")
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
shouldExpandSecrets, err := cmd.Flags().GetBool("expand")
|
||||
projectId, err := cmd.Flags().GetString("projectId")
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
@ -66,7 +74,7 @@ var exportCmd = &cobra.Command{
|
||||
util.HandleError(err, "Unable to parse flag")
|
||||
}
|
||||
|
||||
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: envName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
|
||||
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs, WorkspaceId: projectId})
|
||||
if err != nil {
|
||||
util.HandleError(err, "Unable to fetch secrets")
|
||||
}
|
||||
@ -103,6 +111,7 @@ func init() {
|
||||
exportCmd.Flags().Bool("secret-overriding", true, "Prioritizes personal secrets, if any, with the same name over shared secrets")
|
||||
exportCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")
|
||||
exportCmd.Flags().StringP("tags", "t", "", "filter secrets by tag slugs")
|
||||
exportCmd.Flags().String("projectId", "", "manually set the projectId to fetch secrets from")
|
||||
}
|
||||
|
||||
// Format according to the format flag
|
||||
|
@ -6,7 +6,6 @@ package cmd
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"os"
|
||||
|
||||
"github.com/Infisical/infisical-merge/packages/api"
|
||||
"github.com/Infisical/infisical-merge/packages/models"
|
||||
@ -91,12 +90,12 @@ func writeWorkspaceFile(selectedWorkspace models.Workspace) error {
|
||||
WorkspaceId: selectedWorkspace.ID,
|
||||
}
|
||||
|
||||
marshalledWorkspaceFile, err := json.Marshal(workspaceFileToSave)
|
||||
marshalledWorkspaceFile, err := json.MarshalIndent(workspaceFileToSave, "", " ")
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
err = util.WriteToFile(util.INFISICAL_WORKSPACE_CONFIG_FILE_NAME, marshalledWorkspaceFile, os.ModePerm)
|
||||
err = util.WriteToFile(util.INFISICAL_WORKSPACE_CONFIG_FILE_NAME, marshalledWorkspaceFile, 0600)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
@ -13,7 +13,6 @@ import (
|
||||
"regexp"
|
||||
|
||||
"github.com/Infisical/infisical-merge/packages/api"
|
||||
"github.com/Infisical/infisical-merge/packages/config"
|
||||
"github.com/Infisical/infisical-merge/packages/crypto"
|
||||
"github.com/Infisical/infisical-merge/packages/models"
|
||||
"github.com/Infisical/infisical-merge/packages/srp"
|
||||
@ -23,8 +22,17 @@ import (
|
||||
"github.com/manifoldco/promptui"
|
||||
log "github.com/sirupsen/logrus"
|
||||
"github.com/spf13/cobra"
|
||||
"golang.org/x/crypto/argon2"
|
||||
)
|
||||
|
||||
type params struct {
|
||||
memory uint32
|
||||
iterations uint32
|
||||
parallelism uint8
|
||||
saltLength uint32
|
||||
keyLength uint32
|
||||
}
|
||||
|
||||
// loginCmd represents the login command
|
||||
var loginCmd = &cobra.Command{
|
||||
Use: "login",
|
||||
@ -33,13 +41,14 @@ var loginCmd = &cobra.Command{
|
||||
PreRun: toggleDebug,
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
currentLoggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
|
||||
if err != nil && (strings.Contains(err.Error(), "The specified item could not be found in the keyring") || strings.Contains(err.Error(), "unable to get key from Keyring")) { // if the key can't be found allow them to override
|
||||
// if the key can't be found or there is an error getting current credentials from key ring, allow them to override
|
||||
if err != nil && (strings.Contains(err.Error(), "The specified item could not be found in the keyring") || strings.Contains(err.Error(), "unable to get key from Keyring") || strings.Contains(err.Error(), "GetUserCredsFromKeyRing")) {
|
||||
log.Debug(err)
|
||||
} else if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
if currentLoggedInUserDetails.IsUserLoggedIn && !currentLoggedInUserDetails.LoginExpired { // if you are logged in but not expired
|
||||
if currentLoggedInUserDetails.IsUserLoggedIn && !currentLoggedInUserDetails.LoginExpired && len(currentLoggedInUserDetails.UserCredentials.PrivateKey) != 0 {
|
||||
shouldOverride, err := shouldOverrideLoginPrompt(currentLoggedInUserDetails.UserCredentials.Email)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
@ -55,36 +64,158 @@ var loginCmd = &cobra.Command{
|
||||
util.HandleError(err, "Unable to parse email and password for authentication")
|
||||
}
|
||||
|
||||
userCredentials, err := getFreshUserCredentials(email, password)
|
||||
loginOneResponse, loginTwoResponse, err := getFreshUserCredentials(email, password)
|
||||
if err != nil {
|
||||
log.Infoln("Unable to authenticate with the provided credentials, please try again")
|
||||
log.Debugln(err)
|
||||
return
|
||||
}
|
||||
|
||||
encryptedPrivateKey, _ := base64.StdEncoding.DecodeString(userCredentials.EncryptedPrivateKey)
|
||||
tag, err := base64.StdEncoding.DecodeString(userCredentials.Tag)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
if loginTwoResponse.MfaEnabled {
|
||||
i := 1
|
||||
for i < 6 {
|
||||
mfaVerifyCode := askForMFACode()
|
||||
|
||||
httpClient := resty.New()
|
||||
httpClient.SetAuthToken(loginTwoResponse.Token)
|
||||
verifyMFAresponse, mfaErrorResponse, requestError := api.CallVerifyMfaToken(httpClient, api.VerifyMfaTokenRequest{
|
||||
Email: email,
|
||||
MFAToken: mfaVerifyCode,
|
||||
})
|
||||
|
||||
if requestError != nil {
|
||||
util.HandleError(requestError)
|
||||
break
|
||||
} else if mfaErrorResponse != nil {
|
||||
if mfaErrorResponse.Context.Code == "mfa_invalid" {
|
||||
msg := fmt.Sprintf("Incorrect, verification code. You have %v attempts left", 5-i)
|
||||
fmt.Println(msg)
|
||||
if i == 5 {
|
||||
util.PrintErrorMessageAndExit("No tries left, please try again in a bit")
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if mfaErrorResponse.Context.Code == "mfa_expired" {
|
||||
util.PrintErrorMessageAndExit("Your 2FA verification code has expired, please try logging in again")
|
||||
break
|
||||
}
|
||||
i++
|
||||
} else {
|
||||
loginTwoResponse.EncryptedPrivateKey = verifyMFAresponse.EncryptedPrivateKey
|
||||
loginTwoResponse.EncryptionVersion = verifyMFAresponse.EncryptionVersion
|
||||
loginTwoResponse.Iv = verifyMFAresponse.Iv
|
||||
loginTwoResponse.ProtectedKey = verifyMFAresponse.ProtectedKey
|
||||
loginTwoResponse.ProtectedKeyIV = verifyMFAresponse.ProtectedKeyIV
|
||||
loginTwoResponse.ProtectedKeyTag = verifyMFAresponse.ProtectedKeyTag
|
||||
loginTwoResponse.PublicKey = verifyMFAresponse.PublicKey
|
||||
loginTwoResponse.Tag = verifyMFAresponse.Tag
|
||||
loginTwoResponse.Token = verifyMFAresponse.Token
|
||||
loginTwoResponse.EncryptionVersion = verifyMFAresponse.EncryptionVersion
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
IV, err := base64.StdEncoding.DecodeString(userCredentials.IV)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
var decryptedPrivateKey []byte
|
||||
|
||||
if loginTwoResponse.EncryptionVersion == 1 {
|
||||
log.Debug("Login version 1")
|
||||
encryptedPrivateKey, _ := base64.StdEncoding.DecodeString(loginTwoResponse.EncryptedPrivateKey)
|
||||
tag, err := base64.StdEncoding.DecodeString(loginTwoResponse.Tag)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
IV, err := base64.StdEncoding.DecodeString(loginTwoResponse.Iv)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
paddedPassword := fmt.Sprintf("%032s", password)
|
||||
key := []byte(paddedPassword)
|
||||
|
||||
computedDecryptedPrivateKey, err := crypto.DecryptSymmetric(key, encryptedPrivateKey, tag, IV)
|
||||
if err != nil || len(computedDecryptedPrivateKey) == 0 {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
decryptedPrivateKey = computedDecryptedPrivateKey
|
||||
|
||||
} else if loginTwoResponse.EncryptionVersion == 2 {
|
||||
log.Debug("Login version 2")
|
||||
protectedKey, err := base64.StdEncoding.DecodeString(loginTwoResponse.ProtectedKey)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
protectedKeyTag, err := base64.StdEncoding.DecodeString(loginTwoResponse.ProtectedKeyTag)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
protectedKeyIV, err := base64.StdEncoding.DecodeString(loginTwoResponse.ProtectedKeyIV)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
nonProtectedTag, err := base64.StdEncoding.DecodeString(loginTwoResponse.Tag)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
nonProtectedIv, err := base64.StdEncoding.DecodeString(loginTwoResponse.Iv)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
parameters := &params{
|
||||
memory: 64 * 1024,
|
||||
iterations: 3,
|
||||
parallelism: 1,
|
||||
keyLength: 32,
|
||||
}
|
||||
|
||||
derivedKey, err := generateFromPassword(password, []byte(loginOneResponse.Salt), parameters)
|
||||
if err != nil {
|
||||
util.HandleError(fmt.Errorf("unable to generate argon hash from password [err=%s]", err))
|
||||
}
|
||||
|
||||
decryptedProtectedKey, err := crypto.DecryptSymmetric(derivedKey, protectedKey, protectedKeyTag, protectedKeyIV)
|
||||
if err != nil {
|
||||
util.HandleError(fmt.Errorf("unable to get decrypted protected key [err=%s]", err))
|
||||
}
|
||||
|
||||
encryptedPrivateKey, err := base64.StdEncoding.DecodeString(loginTwoResponse.EncryptedPrivateKey)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
decryptedProtectedKeyInHex, err := hex.DecodeString(string(decryptedProtectedKey))
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
computedDecryptedPrivateKey, err := crypto.DecryptSymmetric(decryptedProtectedKeyInHex, encryptedPrivateKey, nonProtectedTag, nonProtectedIv)
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
decryptedPrivateKey = computedDecryptedPrivateKey
|
||||
} else {
|
||||
util.PrintErrorMessageAndExit("Insufficient details to decrypt private key")
|
||||
}
|
||||
|
||||
paddedPassword := fmt.Sprintf("%032s", password)
|
||||
key := []byte(paddedPassword)
|
||||
|
||||
decryptedPrivateKey, err := crypto.DecryptSymmetric(key, encryptedPrivateKey, tag, IV)
|
||||
if err != nil || len(decryptedPrivateKey) == 0 {
|
||||
util.HandleError(err)
|
||||
if string(decryptedPrivateKey) == "" || email == "" || loginTwoResponse.Token == "" {
|
||||
log.Debugf("[decryptedPrivateKey=%s] [email=%s] [loginTwoResponse.Token=%s]", string(decryptedPrivateKey), email, loginTwoResponse.Token)
|
||||
util.PrintErrorMessageAndExit("We were unable to fetch required details to complete your login. Run with -d to see more info")
|
||||
}
|
||||
|
||||
userCredentialsToBeStored := &models.UserCredentials{
|
||||
Email: email,
|
||||
PrivateKey: string(decryptedPrivateKey),
|
||||
JTWToken: userCredentials.JTWToken,
|
||||
JTWToken: loginTwoResponse.Token,
|
||||
}
|
||||
|
||||
err = util.StoreUserCredsInKeyRing(userCredentialsToBeStored)
|
||||
@ -104,8 +235,16 @@ var loginCmd = &cobra.Command{
|
||||
// clear backed up secrets from prev account
|
||||
util.DeleteBackupSecrets()
|
||||
|
||||
color.Green("Nice! You are logged in as: %v", email)
|
||||
white := color.New(color.FgGreen)
|
||||
boldWhite := white.Add(color.Bold)
|
||||
boldWhite.Printf(">>>> Welcome to Infisical!")
|
||||
boldWhite.Printf(" You are now logged in as %v <<<< \n", email)
|
||||
|
||||
plainBold := color.New(color.Bold)
|
||||
|
||||
plainBold.Println("\nQuick links")
|
||||
fmt.Println("- Learn to inject secrets into your application at https://infisical.com/docs/cli/usage")
|
||||
fmt.Println("- Stuck? Join our slack for quick support https://infisical.com/slack")
|
||||
},
|
||||
}
|
||||
|
||||
@ -155,7 +294,7 @@ func askForLoginCredentials() (email string, password string, err error) {
|
||||
return userEmail, userPassword, nil
|
||||
}
|
||||
|
||||
func getFreshUserCredentials(email string, password string) (*api.LoginTwoResponse, error) {
|
||||
func getFreshUserCredentials(email string, password string) (*api.GetLoginOneV2Response, *api.GetLoginTwoV2Response, error) {
|
||||
log.Debugln("getFreshUserCredentials:", "email", email, "password", password)
|
||||
httpClient := resty.New()
|
||||
httpClient.SetRetryCount(5)
|
||||
@ -166,36 +305,24 @@ func getFreshUserCredentials(email string, password string) (*api.LoginTwoRespon
|
||||
srpA := hex.EncodeToString(srpClient.ComputeA())
|
||||
|
||||
// ** Login one
|
||||
loginOneRequest := api.LoginOneRequest{
|
||||
loginOneResponseResult, err := api.CallLogin1V2(httpClient, api.GetLoginOneV2Request{
|
||||
Email: email,
|
||||
ClientPublicKey: srpA,
|
||||
}
|
||||
|
||||
var loginOneResponseResult api.LoginOneResponse
|
||||
|
||||
loginOneResponse, err := httpClient.
|
||||
R().
|
||||
SetBody(loginOneRequest).
|
||||
SetResult(&loginOneResponseResult).
|
||||
Post(fmt.Sprintf("%v/v1/auth/login1", config.INFISICAL_URL))
|
||||
})
|
||||
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
if loginOneResponse.StatusCode() > 299 {
|
||||
return nil, fmt.Errorf("ops, unsuccessful response code. [response=%v]", loginOneResponse)
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
// **** Login 2
|
||||
serverPublicKey_bytearray, err := hex.DecodeString(loginOneResponseResult.ServerPublicKey)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, nil, err
|
||||
}
|
||||
|
||||
userSalt, err := hex.DecodeString(loginOneResponseResult.ServerSalt)
|
||||
userSalt, err := hex.DecodeString(loginOneResponseResult.Salt)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return nil, nil, err
|
||||
}
|
||||
|
||||
srpClient.SetSalt(userSalt, []byte(email), []byte(password))
|
||||
@ -203,27 +330,16 @@ func getFreshUserCredentials(email string, password string) (*api.LoginTwoRespon
|
||||
|
||||
srpM1 := srpClient.ComputeM1()
|
||||
|
||||
LoginTwoRequest := api.LoginTwoRequest{
|
||||
loginTwoResponseResult, err := api.CallLogin2V2(httpClient, api.GetLoginTwoV2Request{
|
||||
Email: email,
|
||||
ClientProof: hex.EncodeToString(srpM1),
|
||||
}
|
||||
|
||||
var loginTwoResponseResult api.LoginTwoResponse
|
||||
loginTwoResponse, err := httpClient.
|
||||
R().
|
||||
SetBody(LoginTwoRequest).
|
||||
SetResult(&loginTwoResponseResult).
|
||||
Post(fmt.Sprintf("%v/v1/auth/login2", config.INFISICAL_URL))
|
||||
})
|
||||
|
||||
if err != nil {
|
||||
return nil, err
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
if loginTwoResponse.StatusCode() > 299 {
|
||||
return nil, fmt.Errorf("ops, unsuccessful response code. [response=%v]", loginTwoResponse)
|
||||
}
|
||||
|
||||
return &loginTwoResponseResult, nil
|
||||
return &loginOneResponseResult, &loginTwoResponseResult, nil
|
||||
}
|
||||
|
||||
func shouldOverrideLoginPrompt(currentLoggedInUserEmail string) (bool, error) {
|
||||
@ -237,3 +353,21 @@ func shouldOverrideLoginPrompt(currentLoggedInUserEmail string) (bool, error) {
|
||||
}
|
||||
return result == "Yes", err
|
||||
}
|
||||
|
||||
func generateFromPassword(password string, salt []byte, p *params) (hash []byte, err error) {
|
||||
hash = argon2.IDKey([]byte(password), salt, p.iterations, p.memory, p.parallelism, p.keyLength)
|
||||
return hash, nil
|
||||
}
|
||||
|
||||
func askForMFACode() string {
|
||||
mfaCodePromptUI := promptui.Prompt{
|
||||
Label: "Enter the 2FA verification code sent to your email",
|
||||
}
|
||||
|
||||
mfaVerifyCode, err := mfaCodePromptUI.Run()
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
}
|
||||
|
||||
return mfaVerifyCode
|
||||
}
|
||||
|
@ -36,6 +36,9 @@ var resetCmd = &cobra.Command{
|
||||
|
||||
keyringInstance.Remove(util.KEYRING_SERVICE_NAME)
|
||||
|
||||
// delete secrets backup
|
||||
util.DeleteBackupSecrets()
|
||||
|
||||
util.PrintSuccessMessage("Reset successful")
|
||||
},
|
||||
}
|
||||
|
@ -54,9 +54,12 @@ var runCmd = &cobra.Command{
|
||||
return nil
|
||||
},
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
envName, err := cmd.Flags().GetString("env")
|
||||
if err != nil {
|
||||
util.HandleError(err, "Unable to parse flag")
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
infisicalToken, err := cmd.Flags().GetString("token")
|
||||
@ -79,7 +82,7 @@ var runCmd = &cobra.Command{
|
||||
util.HandleError(err, "Unable to parse flag")
|
||||
}
|
||||
|
||||
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: envName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
|
||||
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
|
||||
|
||||
if err != nil {
|
||||
util.HandleError(err, "Could not fetch secrets", "If you are using a service token to fetch secrets, please ensure it is valid")
|
||||
@ -107,13 +110,7 @@ var runCmd = &cobra.Command{
|
||||
}
|
||||
|
||||
// check to see if there are any reserved key words in secrets to inject
|
||||
reservedEnvironmentVariables := []string{"HOME", "PATH", "PS1", "PS2"}
|
||||
for _, reservedEnvName := range reservedEnvironmentVariables {
|
||||
if _, ok := secretsByKey[reservedEnvName]; ok {
|
||||
delete(secretsByKey, reservedEnvName)
|
||||
util.PrintWarning(fmt.Sprintf("Infisical secret named [%v] has been removed because it is a reserved secret name", reservedEnvName))
|
||||
}
|
||||
}
|
||||
filterReservedEnvVars(secretsByKey)
|
||||
|
||||
// now add infisical secrets
|
||||
for k, v := range secretsByKey {
|
||||
@ -146,6 +143,37 @@ var runCmd = &cobra.Command{
|
||||
},
|
||||
}
|
||||
|
||||
var (
|
||||
reservedEnvVars = []string{
|
||||
"HOME", "PATH", "PS1", "PS2",
|
||||
"PWD", "EDITOR", "XAUTHORITY", "USER",
|
||||
"TERM", "TERMINFO", "SHELL", "MAIL",
|
||||
}
|
||||
|
||||
reservedEnvVarPrefixes = []string{
|
||||
"XDG_",
|
||||
"LC_",
|
||||
}
|
||||
)
|
||||
|
||||
func filterReservedEnvVars(env map[string]models.SingleEnvironmentVariable) {
|
||||
for _, reservedEnvName := range reservedEnvVars {
|
||||
if _, ok := env[reservedEnvName]; ok {
|
||||
delete(env, reservedEnvName)
|
||||
util.PrintWarning(fmt.Sprintf("Infisical secret named [%v] has been removed because it is a reserved secret name", reservedEnvName))
|
||||
}
|
||||
}
|
||||
|
||||
for _, reservedEnvPrefix := range reservedEnvVarPrefixes {
|
||||
for envName := range env {
|
||||
if strings.HasPrefix(envName, reservedEnvPrefix) {
|
||||
delete(env, envName)
|
||||
util.PrintWarning(fmt.Sprintf("Infisical secret named [%v] has been removed because it contains a reserved prefix", envName))
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func init() {
|
||||
rootCmd.AddCommand(runCmd)
|
||||
runCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")
|
||||
|
@ -31,9 +31,12 @@ var secretsCmd = &cobra.Command{
|
||||
PreRun: toggleDebug,
|
||||
Args: cobra.NoArgs,
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
environmentName, err := cmd.Flags().GetString("env")
|
||||
if err != nil {
|
||||
util.HandleError(err)
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
infisicalToken, err := cmd.Flags().GetString("token")
|
||||
@ -94,9 +97,12 @@ var secretsSetCmd = &cobra.Command{
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
util.RequireLocalWorkspaceFile()
|
||||
|
||||
environmentName, err := cmd.Flags().GetString("env")
|
||||
if err != nil {
|
||||
util.HandleError(err, "Unable to parse flag")
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
workspaceFile, err := util.GetWorkSpaceFromFile()
|
||||
@ -127,6 +133,11 @@ var secretsSetCmd = &cobra.Command{
|
||||
encryptedWorkspaceKeyNonce, _ := base64.StdEncoding.DecodeString(workspaceKeyResponse.Nonce)
|
||||
currentUsersPrivateKey, _ := base64.StdEncoding.DecodeString(loggedInUserDetails.UserCredentials.PrivateKey)
|
||||
|
||||
if len(currentUsersPrivateKey) == 0 || len(encryptedWorkspaceKeySenderPublicKey) == 0 {
|
||||
log.Debugf("Missing credentials for generating plainTextEncryptionKey: [currentUsersPrivateKey=%s] [encryptedWorkspaceKeySenderPublicKey=%s]", currentUsersPrivateKey, encryptedWorkspaceKeySenderPublicKey)
|
||||
util.PrintErrorMessageAndExit("Some required user credentials are missing to generate your [plainTextEncryptionKey]. Please run [infisical login] then try again")
|
||||
}
|
||||
|
||||
// decrypt workspace key
|
||||
plainTextEncryptionKey := crypto.DecryptAsymmetric(encryptedWorkspaceKey, encryptedWorkspaceKeyNonce, encryptedWorkspaceKeySenderPublicKey, currentUsersPrivateKey)
|
||||
|
||||
@ -270,11 +281,12 @@ var secretsDeleteCmd = &cobra.Command{
|
||||
PreRun: toggleDebug,
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
environmentName, err := cmd.Flags().GetString("env")
|
||||
if err != nil {
|
||||
log.Errorln("Unable to parse the environment name flag")
|
||||
log.Debugln(err)
|
||||
return
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
|
||||
@ -330,9 +342,12 @@ var secretsDeleteCmd = &cobra.Command{
|
||||
}
|
||||
|
||||
func getSecretsByNames(cmd *cobra.Command, args []string) {
|
||||
environmentName, err := cmd.Flags().GetString("env")
|
||||
if err != nil {
|
||||
util.HandleError(err, "Unable to parse flag")
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
infisicalToken, err := cmd.Flags().GetString("token")
|
||||
@ -352,10 +367,7 @@ func getSecretsByNames(cmd *cobra.Command, args []string) {
|
||||
|
||||
requestedSecrets := []models.SingleEnvironmentVariable{}
|
||||
|
||||
secretsMap := make(map[string]models.SingleEnvironmentVariable)
|
||||
for _, secret := range secrets {
|
||||
secretsMap[secret.Key] = secret
|
||||
}
|
||||
secretsMap := getSecretsByKeys(secrets)
|
||||
|
||||
for _, secretKeyFromArg := range args {
|
||||
if value, ok := secretsMap[strings.ToUpper(secretKeyFromArg)]; ok {
|
||||
@ -373,9 +385,12 @@ func getSecretsByNames(cmd *cobra.Command, args []string) {
|
||||
}
|
||||
|
||||
func generateExampleEnv(cmd *cobra.Command, args []string) {
|
||||
environmentName, err := cmd.Flags().GetString("env")
|
||||
if err != nil {
|
||||
util.HandleError(err, "Unable to parse flag")
|
||||
environmentName, _ := cmd.Flags().GetString("env")
|
||||
if !cmd.Flags().Changed("env") {
|
||||
environmentFromWorkspace := util.GetEnvFromWorkspaceFile()
|
||||
if environmentFromWorkspace != "" {
|
||||
environmentName = environmentFromWorkspace
|
||||
}
|
||||
}
|
||||
|
||||
infisicalToken, err := cmd.Flags().GetString("token")
|
||||
@ -575,7 +590,7 @@ func addHash(input string) string {
|
||||
}
|
||||
|
||||
func getSecretsByKeys(secrets []models.SingleEnvironmentVariable) map[string]models.SingleEnvironmentVariable {
|
||||
secretMapByName := make(map[string]models.SingleEnvironmentVariable)
|
||||
secretMapByName := make(map[string]models.SingleEnvironmentVariable, len(secrets))
|
||||
|
||||
for _, secret := range secrets {
|
||||
secretMapByName[secret.Key] = secret
|
||||
|
@ -39,7 +39,9 @@ type Workspace struct {
|
||||
}
|
||||
|
||||
type WorkspaceConfigFile struct {
|
||||
WorkspaceId string `json:"workspaceId"`
|
||||
WorkspaceId string `json:"workspaceId"`
|
||||
DefaultEnvironment string `json:"defaultEnvironment"`
|
||||
GitBranchToEnvironmentMapping map[string]string `json:"gitBranchToEnvironmentMapping"`
|
||||
}
|
||||
|
||||
type SymmetricEncryptionResult struct {
|
||||
@ -49,7 +51,9 @@ type SymmetricEncryptionResult struct {
|
||||
}
|
||||
|
||||
type GetAllSecretsParameters struct {
|
||||
Environment string
|
||||
InfisicalToken string
|
||||
TagSlugs string
|
||||
Environment string
|
||||
EnvironmentPassedViaFlag bool
|
||||
InfisicalToken string
|
||||
TagSlugs string
|
||||
WorkspaceId string
|
||||
}
|
||||
|
@ -6,6 +6,11 @@ import (
|
||||
"fmt"
|
||||
"io/ioutil"
|
||||
"net/http"
|
||||
"os"
|
||||
"os/exec"
|
||||
"runtime"
|
||||
|
||||
"github.com/fatih/color"
|
||||
)
|
||||
|
||||
func CheckForUpdate() {
|
||||
@ -15,7 +20,26 @@ func CheckForUpdate() {
|
||||
return
|
||||
}
|
||||
if latestVersion != CLI_VERSION {
|
||||
PrintWarning(fmt.Sprintf("Please update your CLI. You are running version %s but the latest version is %s", CLI_VERSION, latestVersion))
|
||||
yellow := color.New(color.FgYellow).SprintFunc()
|
||||
blue := color.New(color.FgCyan).SprintFunc()
|
||||
black := color.New(color.FgBlack).SprintFunc()
|
||||
|
||||
msg := fmt.Sprintf("%s %s %s %s",
|
||||
yellow("A new release of infisical is available:"),
|
||||
blue(CLI_VERSION),
|
||||
black("->"),
|
||||
blue(latestVersion),
|
||||
)
|
||||
|
||||
fmt.Fprintln(os.Stderr, msg)
|
||||
|
||||
updateInstructions := GetUpdateInstructions()
|
||||
|
||||
if updateInstructions != "" {
|
||||
msg = fmt.Sprintf("\n%s\n", updateInstructions)
|
||||
fmt.Fprintln(os.Stderr, msg)
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
|
||||
@ -26,7 +50,7 @@ func getLatestTag(repoOwner string, repoName string) (string, error) {
|
||||
return "", err
|
||||
}
|
||||
if resp.StatusCode != 200 {
|
||||
return "", errors.New(fmt.Sprintf("GitHub API returned status code %d", resp.StatusCode))
|
||||
return "", errors.New(fmt.Sprintf("gitHub API returned status code %d", resp.StatusCode))
|
||||
}
|
||||
|
||||
defer resp.Body.Close()
|
||||
@ -44,3 +68,53 @@ func getLatestTag(repoOwner string, repoName string) (string, error) {
|
||||
|
||||
return tags[0].Name[1:], nil
|
||||
}
|
||||
|
||||
func GetUpdateInstructions() string {
|
||||
os := runtime.GOOS
|
||||
switch os {
|
||||
case "darwin":
|
||||
return "To update, run: brew update && brew upgrade infisical"
|
||||
case "windows":
|
||||
return "To update, run: scoop update infisical"
|
||||
case "linux":
|
||||
pkgManager := getLinuxPackageManager()
|
||||
switch pkgManager {
|
||||
case "apt-get":
|
||||
return "To update, run: sudo apt-get update && sudo apt-get install infisical"
|
||||
case "yum":
|
||||
return "To update, run: sudo yum update infisical"
|
||||
case "apk":
|
||||
return "To update, run: sudo apk update && sudo apk upgrade infisical"
|
||||
case "yay":
|
||||
return "To update, run: yay -Syu infisical"
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
}
|
||||
|
||||
func getLinuxPackageManager() string {
|
||||
cmd := exec.Command("apt-get", "--version")
|
||||
if err := cmd.Run(); err == nil {
|
||||
return "apt-get"
|
||||
}
|
||||
|
||||
cmd = exec.Command("yum", "--version")
|
||||
if err := cmd.Run(); err == nil {
|
||||
return "yum"
|
||||
}
|
||||
|
||||
cmd = exec.Command("yay", "--version")
|
||||
if err := cmd.Run(); err == nil {
|
||||
return "yay"
|
||||
}
|
||||
|
||||
cmd = exec.Command("apk", "--version")
|
||||
if err := cmd.Run(); err == nil {
|
||||
return "apk"
|
||||
}
|
||||
|
||||
return ""
|
||||
}
|
||||
|
@ -5,6 +5,7 @@ import (
|
||||
"errors"
|
||||
"fmt"
|
||||
"os"
|
||||
"path/filepath"
|
||||
|
||||
"github.com/Infisical/infisical-merge/packages/models"
|
||||
log "github.com/sirupsen/logrus"
|
||||
@ -41,7 +42,7 @@ func WriteInitalConfig(userCredentials *models.UserCredentials) error {
|
||||
}
|
||||
|
||||
// Create file in directory
|
||||
err = WriteToFile(fullConfigFilePath, configFileMarshalled, os.ModePerm)
|
||||
err = WriteToFile(fullConfigFilePath, configFileMarshalled, 0600)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
@ -73,7 +74,12 @@ func WorkspaceConfigFileExistsInCurrentPath() bool {
|
||||
}
|
||||
|
||||
func GetWorkSpaceFromFile() (models.WorkspaceConfigFile, error) {
|
||||
configFileAsBytes, err := os.ReadFile(INFISICAL_WORKSPACE_CONFIG_FILE_NAME)
|
||||
cfgFile, err := FindWorkspaceConfigFile()
|
||||
if err != nil {
|
||||
return models.WorkspaceConfigFile{}, err
|
||||
}
|
||||
|
||||
configFileAsBytes, err := os.ReadFile(cfgFile)
|
||||
if err != nil {
|
||||
return models.WorkspaceConfigFile{}, err
|
||||
}
|
||||
@ -87,6 +93,37 @@ func GetWorkSpaceFromFile() (models.WorkspaceConfigFile, error) {
|
||||
return workspaceConfigFile, nil
|
||||
}
|
||||
|
||||
// FindWorkspaceConfigFile searches for a .infisical.json file in the current directory and all parent directories.
|
||||
func FindWorkspaceConfigFile() (string, error) {
|
||||
dir, err := os.Getwd()
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
for {
|
||||
path := filepath.Join(dir, INFISICAL_WORKSPACE_CONFIG_FILE_NAME)
|
||||
_, err := os.Stat(path)
|
||||
if err == nil {
|
||||
// file found
|
||||
log.Debugf("FindWorkspaceConfigFile: workspace file found at [path=%s]", path)
|
||||
|
||||
return path, nil
|
||||
}
|
||||
|
||||
// check if we have reached the root directory
|
||||
if dir == filepath.Dir(dir) {
|
||||
break
|
||||
}
|
||||
|
||||
// move up one directory
|
||||
dir = filepath.Dir(dir)
|
||||
}
|
||||
|
||||
// file not found
|
||||
return "", fmt.Errorf("file not found: %s", INFISICAL_WORKSPACE_CONFIG_FILE_NAME)
|
||||
|
||||
}
|
||||
|
||||
func GetFullConfigFilePath() (fullPathToFile string, fullPathToDirectory string, err error) {
|
||||
homeDir, err := GetHomeDir()
|
||||
if err != nil {
|
||||
@ -114,52 +151,6 @@ func GetWorkspaceConfigByPath(path string) (workspaceConfig models.WorkspaceConf
|
||||
return workspaceConfigFile, nil
|
||||
}
|
||||
|
||||
// Will get the list of .infisical.json files that are located
|
||||
// within the root of each sub folder from where the CLI is ran from
|
||||
func GetAllWorkSpaceConfigsStartingFromCurrentPath() (workspaces []models.WorkspaceConfigFile, err error) {
|
||||
currentDir, err := os.Getwd()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("GetAllProjectConfigs: unable to get the current directory because [%s]", err)
|
||||
}
|
||||
|
||||
files, err := os.ReadDir(currentDir)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("GetAllProjectConfigs: unable to read the contents of the current directory because [%s]", err)
|
||||
}
|
||||
|
||||
listOfWorkSpaceConfigs := []models.WorkspaceConfigFile{}
|
||||
for _, file := range files {
|
||||
if !file.IsDir() && file.Name() == INFISICAL_WORKSPACE_CONFIG_FILE_NAME {
|
||||
pathToWorkspaceConfigFile := currentDir + "/" + INFISICAL_WORKSPACE_CONFIG_FILE_NAME
|
||||
|
||||
workspaceConfig, err := GetWorkspaceConfigByPath(pathToWorkspaceConfigFile)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("GetAllProjectConfigs: Unable to get config file because [%s]", err)
|
||||
}
|
||||
|
||||
listOfWorkSpaceConfigs = append(listOfWorkSpaceConfigs, workspaceConfig)
|
||||
|
||||
} else if file.IsDir() {
|
||||
pathToSubFolder := currentDir + "/" + file.Name()
|
||||
pathToMaybeWorkspaceConfigFile := pathToSubFolder + "/" + INFISICAL_WORKSPACE_CONFIG_FILE_NAME
|
||||
|
||||
_, err := os.Stat(pathToMaybeWorkspaceConfigFile)
|
||||
if err != nil {
|
||||
continue // workspace config file doesn't exist
|
||||
}
|
||||
|
||||
workspaceConfig, err := GetWorkspaceConfigByPath(pathToMaybeWorkspaceConfigFile)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("GetAllProjectConfigs: Unable to get config file because [%s]", err)
|
||||
}
|
||||
|
||||
listOfWorkSpaceConfigs = append(listOfWorkSpaceConfigs, workspaceConfig)
|
||||
}
|
||||
}
|
||||
|
||||
return listOfWorkSpaceConfigs, nil
|
||||
}
|
||||
|
||||
// Get the infisical config file and if it doesn't exist, return empty config model, otherwise raise error
|
||||
func GetConfigFile() (models.ConfigFile, error) {
|
||||
fullConfigFilePath, _, err := GetFullConfigFilePath()
|
||||
@ -206,7 +197,7 @@ func WriteConfigFile(configFile *models.ConfigFile) error {
|
||||
}
|
||||
|
||||
// Create file in directory
|
||||
err = os.WriteFile(fullConfigFilePath, configFileMarshalled, os.ModePerm)
|
||||
err = os.WriteFile(fullConfigFilePath, configFileMarshalled, 0600)
|
||||
if err != nil {
|
||||
return fmt.Errorf("writeConfigFile: Unable to write to file [err=%s]", err)
|
||||
}
|
||||
|
@ -1,10 +1,14 @@
|
||||
package util
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"crypto/sha256"
|
||||
"encoding/base64"
|
||||
"fmt"
|
||||
"os"
|
||||
"os/exec"
|
||||
"path"
|
||||
"strings"
|
||||
)
|
||||
|
||||
type DecodedSymmetricEncryptionDetails = struct {
|
||||
@ -85,8 +89,8 @@ func RequireServiceToken() {
|
||||
}
|
||||
|
||||
func RequireLocalWorkspaceFile() {
|
||||
workspaceFileExists := WorkspaceConfigFileExistsInCurrentPath()
|
||||
if !workspaceFileExists {
|
||||
workspaceFilePath, _ := FindWorkspaceConfigFile()
|
||||
if workspaceFilePath == "" {
|
||||
PrintErrorMessageAndExit("It looks you have not yet connected this project to Infisical", "To do so, run [infisical init] then run your command again")
|
||||
}
|
||||
|
||||
@ -110,3 +114,26 @@ func GetHashFromStringList(list []string) string {
|
||||
sum := sha256.Sum256(hash.Sum(nil))
|
||||
return fmt.Sprintf("%x", sum)
|
||||
}
|
||||
|
||||
// execCmd is a struct that holds the command and arguments to be executed.
|
||||
// By using this struct, we can easily mock the command and arguments.
|
||||
type execCmd struct {
|
||||
cmd string
|
||||
args []string
|
||||
}
|
||||
|
||||
var getCurrentBranchCmd = execCmd{
|
||||
cmd: "git",
|
||||
args: []string{"symbolic-ref", "--short", "HEAD"},
|
||||
}
|
||||
|
||||
func getCurrentBranch() (string, error) {
|
||||
cmd := exec.Command(getCurrentBranchCmd.cmd, getCurrentBranchCmd.args...)
|
||||
var out bytes.Buffer
|
||||
cmd.Stdout = &out
|
||||
err := cmd.Run()
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return path.Base(strings.TrimSpace(out.String())), nil
|
||||
}
|
||||
|
@ -20,6 +20,9 @@ func PrintErrorAndExit(exitCode int, err error, messages ...string) {
|
||||
}
|
||||
}
|
||||
|
||||
supportMsg := fmt.Sprintf("\n\nIf this issue continues, get support at https://infisical.com/slack")
|
||||
fmt.Fprintln(os.Stderr, supportMsg)
|
||||
|
||||
os.Exit(exitCode)
|
||||
}
|
||||
|
||||
|
@ -76,10 +76,31 @@ func GetPlainTextSecretsViaJTW(JTWToken string, receiversPrivateKey string, work
|
||||
return nil, fmt.Errorf("unable to get your encrypted workspace key. [err=%v]", err)
|
||||
}
|
||||
|
||||
encryptedWorkspaceKey, _ := base64.StdEncoding.DecodeString(workspaceKeyResponse.EncryptedKey)
|
||||
encryptedWorkspaceKeySenderPublicKey, _ := base64.StdEncoding.DecodeString(workspaceKeyResponse.Sender.PublicKey)
|
||||
encryptedWorkspaceKeyNonce, _ := base64.StdEncoding.DecodeString(workspaceKeyResponse.Nonce)
|
||||
currentUsersPrivateKey, _ := base64.StdEncoding.DecodeString(receiversPrivateKey)
|
||||
encryptedWorkspaceKey, err := base64.StdEncoding.DecodeString(workspaceKeyResponse.EncryptedKey)
|
||||
if err != nil {
|
||||
HandleError(err, "Unable to get bytes represented by the base64 for encryptedWorkspaceKey")
|
||||
}
|
||||
|
||||
encryptedWorkspaceKeySenderPublicKey, err := base64.StdEncoding.DecodeString(workspaceKeyResponse.Sender.PublicKey)
|
||||
if err != nil {
|
||||
HandleError(err, "Unable to get bytes represented by the base64 for encryptedWorkspaceKeySenderPublicKey")
|
||||
}
|
||||
|
||||
encryptedWorkspaceKeyNonce, err := base64.StdEncoding.DecodeString(workspaceKeyResponse.Nonce)
|
||||
if err != nil {
|
||||
HandleError(err, "Unable to get bytes represented by the base64 for encryptedWorkspaceKeyNonce")
|
||||
}
|
||||
|
||||
currentUsersPrivateKey, err := base64.StdEncoding.DecodeString(receiversPrivateKey)
|
||||
if err != nil {
|
||||
HandleError(err, "Unable to get bytes represented by the base64 for currentUsersPrivateKey")
|
||||
}
|
||||
|
||||
if len(currentUsersPrivateKey) == 0 || len(encryptedWorkspaceKeySenderPublicKey) == 0 {
|
||||
log.Debugf("Missing credentials for generating plainTextEncryptionKey: [currentUsersPrivateKey=%s] [encryptedWorkspaceKeySenderPublicKey=%s]", currentUsersPrivateKey, encryptedWorkspaceKeySenderPublicKey)
|
||||
PrintErrorMessageAndExit("Some required user credentials are missing to generate your [plainTextEncryptionKey]. Please run [infisical login] then try again")
|
||||
}
|
||||
|
||||
plainTextWorkspaceKey := crypto.DecryptAsymmetric(encryptedWorkspaceKey, encryptedWorkspaceKeyNonce, encryptedWorkspaceKeySenderPublicKey, currentUsersPrivateKey)
|
||||
|
||||
encryptedSecrets, err := api.CallGetSecretsV2(httpClient, api.GetEncryptedSecretsV2Request{
|
||||
@ -131,6 +152,10 @@ func GetAllEnvironmentVariables(params models.GetAllSecretsParameters) ([]models
|
||||
return nil, err
|
||||
}
|
||||
|
||||
if params.WorkspaceId != "" {
|
||||
workspaceFile.WorkspaceId = params.WorkspaceId
|
||||
}
|
||||
|
||||
// Verify environment
|
||||
err = ValidateEnvironmentName(params.Environment, workspaceFile.WorkspaceId, loggedInUserDetails.UserCredentials)
|
||||
if err != nil {
|
||||
@ -415,7 +440,7 @@ func WriteBackupSecrets(workspace string, environment string, encryptionKey []by
|
||||
}
|
||||
|
||||
listOfSecretsMarshalled, _ := json.Marshal(encryptedSecrets)
|
||||
err = os.WriteFile(fmt.Sprintf("%s/%s", fullPathToSecretsBackupFolder, fileName), listOfSecretsMarshalled, os.ModePerm)
|
||||
err = os.WriteFile(fmt.Sprintf("%s/%s", fullPathToSecretsBackupFolder, fileName), listOfSecretsMarshalled, 0600)
|
||||
if err != nil {
|
||||
return fmt.Errorf("WriteBackupSecrets: Unable to write backup secrets to file [err=%s]", err)
|
||||
}
|
||||
@ -481,3 +506,35 @@ func DeleteBackupSecrets() error {
|
||||
|
||||
return os.RemoveAll(fullPathToSecretsBackupFolder)
|
||||
}
|
||||
|
||||
func GetEnvFromWorkspaceFile() string {
|
||||
workspaceFile, err := GetWorkSpaceFromFile()
|
||||
if err != nil {
|
||||
log.Debugf("getEnvFromWorkspaceFile: [err=%s]", err)
|
||||
return ""
|
||||
}
|
||||
|
||||
if env := GetEnvironmentBasedOnGitBranch(workspaceFile); env != "" {
|
||||
return env
|
||||
}
|
||||
|
||||
return workspaceFile.DefaultEnvironment
|
||||
}
|
||||
|
||||
func GetEnvironmentBasedOnGitBranch(workspaceFile models.WorkspaceConfigFile) string {
|
||||
branch, err := getCurrentBranch()
|
||||
if err != nil {
|
||||
log.Debugf("getEnvelopmentBasedOnGitBranch: [err=%s]", err)
|
||||
}
|
||||
|
||||
envBasedOnGitBranch, ok := workspaceFile.GitBranchToEnvironmentMapping[branch]
|
||||
|
||||
log.Debugf("GetEnvelopmentBasedOnGitBranch: [envBasedOnGitBranch=%s] [ok=%t]", envBasedOnGitBranch, ok)
|
||||
|
||||
if err == nil && ok {
|
||||
return envBasedOnGitBranch
|
||||
} else {
|
||||
log.Debugf("getEnvelopmentBasedOnGitBranch: [err=%s]", err)
|
||||
return ""
|
||||
}
|
||||
}
|
||||
|
@ -1,6 +1,9 @@
|
||||
package util
|
||||
|
||||
import (
|
||||
"io"
|
||||
"os"
|
||||
"path"
|
||||
"testing"
|
||||
|
||||
"github.com/Infisical/infisical-merge/packages/models"
|
||||
@ -158,3 +161,98 @@ func Test_SubstituteSecrets_When_No_SubstituteNeeded(t *testing.T) {
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func Test_Read_Env_From_File(t *testing.T) {
|
||||
type testCase struct {
|
||||
TestFile string
|
||||
ExpectedEnv string
|
||||
}
|
||||
|
||||
var cases = []testCase{
|
||||
{
|
||||
TestFile: "testdata/infisical-default-env.json",
|
||||
ExpectedEnv: "myDefaultEnv",
|
||||
},
|
||||
{
|
||||
TestFile: "testdata/infisical-branch-env.json",
|
||||
ExpectedEnv: "myMainEnv",
|
||||
},
|
||||
{
|
||||
TestFile: "testdata/infisical-no-matching-branch-env.json",
|
||||
ExpectedEnv: "myDefaultEnv",
|
||||
},
|
||||
}
|
||||
|
||||
// create a tmp directory for testing
|
||||
testDir, err := os.MkdirTemp(os.TempDir(), "infisical-test")
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to create temp directory: %s", err)
|
||||
}
|
||||
|
||||
// save the current working directory
|
||||
originalDir, err := os.Getwd()
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to get current working directory: %s", err)
|
||||
}
|
||||
|
||||
// backup the original git command
|
||||
originalGitCmd := getCurrentBranchCmd
|
||||
|
||||
// make sure to clean up after the test
|
||||
t.Cleanup(func() {
|
||||
os.Chdir(originalDir)
|
||||
os.RemoveAll(testDir)
|
||||
getCurrentBranchCmd = originalGitCmd
|
||||
})
|
||||
|
||||
// mock the git command to return "main" as the current branch
|
||||
getCurrentBranchCmd = execCmd{cmd: "echo", args: []string{"main"}}
|
||||
|
||||
for _, c := range cases {
|
||||
// make sure we start in the original directory
|
||||
err = os.Chdir(originalDir)
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to change working directory: %s", err)
|
||||
}
|
||||
|
||||
// remove old test file if it exists
|
||||
err = os.Remove(path.Join(testDir, INFISICAL_WORKSPACE_CONFIG_FILE_NAME))
|
||||
if err != nil && !os.IsNotExist(err) {
|
||||
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to remove old test file: %s", err)
|
||||
}
|
||||
|
||||
// deploy the test file
|
||||
copyTestFile(t, c.TestFile, path.Join(testDir, INFISICAL_WORKSPACE_CONFIG_FILE_NAME))
|
||||
|
||||
// change the working directory to the tmp directory
|
||||
err = os.Chdir(testDir)
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_DefaultEnv_From_File: Failed to change working directory: %s", err)
|
||||
}
|
||||
|
||||
// get env from file
|
||||
env := GetEnvFromWorkspaceFile()
|
||||
if env != c.ExpectedEnv {
|
||||
t.Errorf("Test_Read_DefaultEnv_From_File: Expected env to be %s but got %s", c.ExpectedEnv, env)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func copyTestFile(t *testing.T, src, dst string) {
|
||||
srcFile, err := os.Open(src)
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_Env_From_File_By_Branch: Failed to open source file: %s", err)
|
||||
}
|
||||
defer srcFile.Close()
|
||||
|
||||
dstFile, err := os.Create(dst)
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_Env_From_File_By_Branch: Failed to create destination file: %s", err)
|
||||
}
|
||||
defer dstFile.Close()
|
||||
|
||||
_, err = io.Copy(dstFile, srcFile)
|
||||
if err != nil {
|
||||
t.Errorf("Test_Read_Env_From_File_By_Branch: Failed to copy file: %s", err)
|
||||
}
|
||||
}
|
||||
|
7
cli/packages/util/testdata/infisical-branch-env.json
vendored
Normal file
@ -0,0 +1,7 @@
|
||||
{
|
||||
"workspaceId": "12345678",
|
||||
"defaultEnvironment": "myDefaultEnv",
|
||||
"gitBranchToEnvironmentMapping": {
|
||||
"main": "myMainEnv"
|
||||
}
|
||||
}
|
5
cli/packages/util/testdata/infisical-default-env.json
vendored
Normal file
@ -0,0 +1,5 @@
|
||||
{
|
||||
"workspaceId": "12345678",
|
||||
"defaultEnvironment": "myDefaultEnv",
|
||||
"gitBranchToEnvironmentMapping": null
|
||||
}
|
7
cli/packages/util/testdata/infisical-no-matching-branch-env.json
vendored
Normal file
@ -0,0 +1,7 @@
|
||||
{
|
||||
"workspaceId": "12345678",
|
||||
"defaultEnvironment": "myDefaultEnv",
|
||||
"gitBranchToEnvironmentMapping": {
|
||||
"notmain": "myMainEnv"
|
||||
}
|
||||
}
|
@ -1,4 +1,4 @@
|
||||
version: '3'
|
||||
version: "3"
|
||||
|
||||
services:
|
||||
nginx:
|
||||
@ -22,7 +22,6 @@ services:
|
||||
depends_on:
|
||||
- mongo
|
||||
image: infisical/backend
|
||||
command: npm run start
|
||||
env_file: .env
|
||||
environment:
|
||||
- NODE_ENV=production
|
||||
|
@ -11,31 +11,84 @@ infisical export [options]
|
||||
|
||||
Export environment variables from the platform into a file format.
|
||||
|
||||
## Options
|
||||
## Subcommands & flags
|
||||
|
||||
| Option | Description | Default value |
|
||||
| ------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------- |
|
||||
| `--env` | Used to set the environment that secrets are pulled from. Accepted values: `dev`, `staging`, `test`, `prod` | `dev` |
|
||||
| `--projectId` | Only required if injecting via the [service token method](../token). If you are not using service token, the project id will be automatically retrieved from the `.infisical.json` located at the root of your local project. | `None` |
|
||||
| `--expand` | Parse shell parameter expansions in your secrets (e.g., `${DOMAIN}`) | `true` |
|
||||
| `--format` | Format of the output file. Accepted values: `dotenv`, `dotenv-export`, `csv` and `json` | `dotenv` |
|
||||
<Accordion title="infisical export" defaultOpen="true">
|
||||
Use this command to export environment variables from the platform into raw file formats
|
||||
|
||||
## Examples
|
||||
```bash
$ infisical export

# Export variables to a .env file
infisical export > .env

# Export variables to a .env file (with export keyword)
infisical export --format=dotenv-export > .env

# Export variables to a CSV file
infisical export --format=csv > secrets.csv

# Export variables to a JSON file
infisical export --format=json > secrets.json

# Export variables to a YAML file
infisical export --format=yaml > secrets.yaml
```
|
||||
### flags
|
||||
<Accordion title="--env">
|
||||
Used to set the environment that secrets are pulled from.
|
||||
|
||||
```bash
|
||||
# Example
|
||||
infisical export --env=prod
|
||||
```
|
||||
|
||||
Note: this flag only accepts environment slug names not the fully qualified name. To view the slug name of an environment, visit the project settings page.
|
||||
|
||||
default value: `dev`
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="--projectId">
|
||||
By default the project id is retrieved from the `.infisical.json` located at the root of your local project.
|
||||
This flag allows you to override this behavior by explicitly defining the project to fetch your secrets from.
|
||||
|
||||
```bash
|
||||
# Example
|
||||
|
||||
infisical export --projectId=XXXXXXXXXXXXXX
|
||||
```
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="--expand">
|
||||
Parse shell parameter expansions in your secrets (e.g., `${DOMAIN}`)
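A minimal usage sketch (the output file name is illustrative):

```bash
# Example: disable expansion of references such as ${DOMAIN}
infisical export --expand=false > .env
```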
|
||||
|
||||
Default value: `true`
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="--format">
|
||||
Format of the output file. Accepted values: `dotenv`, `dotenv-export`, `csv` and `json`
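A minimal usage sketch (the output file name is illustrative):

```bash
# Example
infisical export --format=json > secrets.json
```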
|
||||
|
||||
Default value: `dotenv`
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="--secret-overriding">
|
||||
Prioritizes personal secrets with the same name over shared secrets
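A minimal usage sketch (the output file name is illustrative):

```bash
# Example: let shared secrets win over personal overrides
infisical export --secret-overriding=false > .env
```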
|
||||
|
||||
Default value: `true`
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="--tags">
|
||||
When working with tags, you can use this flag to filter and retrieve only secrets that are associated with a specific tag(s).
|
||||
|
||||
```bash
|
||||
# Example
|
||||
infisical export --tags=tag1,tag2,tag3 > .env
|
||||
```
|
||||
|
||||
Note: you must reference the tag by its slug name not its fully qualified name. Go to project settings to view all tag slugs.
|
||||
|
||||
By default, all secrets are fetched
|
||||
</Accordion>
|
||||
|
||||
</Accordion>
|
||||
|
@ -79,4 +79,17 @@ Inject secrets from Infisical into your application process.
|
||||
Default value: `true`
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="--tags">
|
||||
When working with tags, you can use this flag to filter and retrieve only secrets that are associated with a specific tag(s).
|
||||
|
||||
```bash
|
||||
# Example
|
||||
infisical run --tags=tag1,tag2,tag3 -- npm run dev
|
||||
```
|
||||
|
||||
Note: you must reference the tag by its slug name not its fully qualified name. Go to project settings to view all tag slugs.
|
||||
|
||||
By default, all secrets are fetched
|
||||
</Accordion>
|
||||
|
||||
</Accordion>
|
||||
|
@ -18,4 +18,8 @@ If you are still experiencing trouble, please seek support.
|
||||
<Accordion title="Can I fetch secrets with Infisical if I am offline?">
|
||||
Yes. If you have previously retrieved secrets for a specific project and environment (such as dev, staging, or prod), the `run`/`secret` command will utilize the saved secrets, even when offline, on subsequent fetch attempts.
|
||||
|
||||
</Accordion>
|
||||
|
||||
<Accordion title="Can I upload the .infisical.json file that was generated?">
|
||||
Yes. This is simply a configuration file and contains no sensitive data.
|
||||
</Accordion>
|
@ -20,7 +20,7 @@ The Infisical CLI provides a way to inject environment variables from the platfo
|
||||
### Updates
|
||||
|
||||
```bash
|
||||
brew upgrade infisical
|
||||
brew update && brew upgrade infisical
|
||||
```
|
||||
|
||||
</Tab>
|
||||
@ -45,19 +45,19 @@ The Infisical CLI provides a way to inject environment variables from the platfo
|
||||
<Tab title="Alpine">
|
||||
Install prerequisite
|
||||
```bash
|
||||
sudo apk add --no-cache bash sudo
|
||||
apk add --no-cache bash sudo
|
||||
```
|
||||
|
||||
Add Infisical repository
|
||||
```bash
|
||||
curl -1sLf \
|
||||
'https://dl.cloudsmith.io/public/infisical/infisical-cli/setup.alpine.sh' \
|
||||
| sudo -E bash
|
||||
| bash
|
||||
```
|
||||
|
||||
Then install CLI
|
||||
```bash
|
||||
sudo apk update && sudo apk add infisical
|
||||
apk update && sudo apk add infisical
|
||||
```
|
||||
|
||||
</Tab>
|
||||
|
44
docs/cli/project-config.mdx
Normal file
@ -0,0 +1,44 @@
|
||||
---
|
||||
title: "Project config file"
|
||||
description: "Project config file & customization options"
|
||||
---
|
||||
|
||||
To link your local project on your machine with an Infisical project, we suggest using the `infisical init` CLI command. This will generate a `.infisical.json` file in the root directory of your project.
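For example, a minimal sketch of linking a project (the project path below is a placeholder):

```bash
# Run from the root of your local project and follow the interactive prompts
cd /path/to/your-project
infisical init

# The generated config now lives at the project root
cat .infisical.json
```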
|
||||
|
||||
The `.infisical.json` file specifies various parameters, such as the Infisical project to retrieve secrets from, along with other configuration options. Furthermore, you can define additional properties in the file to further tailor your local development experience.
|
||||
|
||||
## Set default environment
|
||||
If you need to change environments while using the CLI, you can do so by including the `--env` flag in your command.
|
||||
However, this can be inconvenient if you typically work in just one environment.
|
||||
To simplify the process, you can establish a default environment, which will be used for every command unless you specify otherwise.
|
||||
|
||||
```json .infisical.json
|
||||
{
|
||||
"workspaceId": "63ee5410a45f7a1ed39ba118",
|
||||
"defaultEnvironment": "test",
|
||||
"gitBranchToEnvironmentMapping": null
|
||||
}
|
||||
```
|
||||
|
||||
### How it works
|
||||
If both `defaultEnvironment` and `gitBranchToEnvironmentMapping` are configured, `gitBranchToEnvironmentMapping` will take precedence over `defaultEnvironment`.
|
||||
However, if `gitBranchToEnvironmentMapping` is not set and `defaultEnvironment` is, then the `defaultEnvironment` will be used to execute your Infisical CLI commands.
|
||||
If you wish to override the `defaultEnvironment`, you can do so by using the `--env` flag explicitly.
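As an illustration, assuming the config above where `defaultEnvironment` is `test` (the `npm run dev` command is just an example application):

```bash
# Uses the defaultEnvironment from .infisical.json (here: "test")
infisical run -- npm run dev

# Explicitly overrides the default environment via the --env flag
infisical run --env=staging -- npm run dev
```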
|
||||
|
||||
## Set Infisical environment based on GitHub branch
|
||||
When fetching your secrets from Infisical, you can switch between environments by using the `--env` flag. However, in certain cases, you may prefer the environment to be automatically mapped based on the current GitHub branch you are working on.
|
||||
To achieve this, simply add the `gitBranchToEnvironmentMapping` property to your configuration file, as shown below.
|
||||
|
||||
```json .infisical.json
|
||||
{
|
||||
"workspaceId": "63ee5410a45f7a1ed39ba118",
|
||||
"gitBranchToEnvironmentMapping": {
|
||||
"branchName": "dev",
|
||||
"anotherBranchName": "staging"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### How it works
|
||||
After configuring this property, every time you use the CLI with the specified configuration file, it will automatically verify if there is a corresponding environment mapping for the current GitHub branch you are on.
|
||||
If it exists, the CLI will use that environment to retrieve secrets. You can override this behavior by explicitly using the `--env` flag while interacting with the CLI.
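A short sketch of this flow, assuming the mapping shown above (branch names and the application command are illustrative):

```bash
# With the "branchName" branch checked out, secrets are pulled from the "dev" environment
git checkout branchName
infisical run -- npm run dev

# The --env flag still takes precedence over the branch mapping
infisical run --env=prod -- npm run dev
```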
|
@ -1,12 +1,12 @@
|
||||
---
|
||||
title: "Activity Logs"
|
||||
title: "Audit Logs"
|
||||
description: "See which events are triggered within your Infisical project."
|
||||
---
|
||||
|
||||
Activity logs record all actions going through Infisical including who performed which CRUD operations on environment variables and from what IP address. They help answer questions like:
|
||||
Audit logs record all actions going through Infisical including who performed which CRUD operations on environment variables and from what IP address. They help answer questions like:
|
||||
|
||||
- Who added or updated environment variables recently?
|
||||
- Did Bob read environment variables last week (if at all)?
|
||||
- What IP address was used for that action?
|
||||
|
||||

|
||||

|
||||
|
@ -5,8 +5,6 @@ description: "How to sync your secrets among various 3rd-party services with Inf
|
||||
|
||||
Integrations allow environment variables to be synced across your entire infrastructure from local development to CI/CD and production.
|
||||
|
||||
We're still relatively early with integrations. 6+ integrations are already available but expect more coming very soon.
|
||||
|
||||
<Card title="View integrations" icon="link" href="/integrations/overview">
|
||||
View all available integrations and their guides
|
||||
</Card>
|
||||
|
18
docs/getting-started/dashboard/mfa.mdx
Normal file
@ -0,0 +1,18 @@
|
||||
---
|
||||
title: "MFA"
|
||||
description: "Secure your Infisical account with MFA"
|
||||
---
|
||||
|
||||
MFA requires users to provide multiple forms of identification to access their account. Currently, this means logging in with your password and a 6-digit code sent to your email.
|
||||
|
||||
## Email 2FA
|
||||
|
||||
Check the box in Personal Settings > Two-factor Authentication to enable email-based 2FA.
|
||||
|
||||

|
||||
|
||||
<Note>
|
||||
Infisical currently supports email-based 2FA. We're actively working on
|
||||
building support for other forms of identification via SMS and Authenticator
|
||||
App.
|
||||
</Note>
|
@ -12,7 +12,7 @@ This is a non-exhaustive list of features that Infisical offers:
|
||||
- Sync secrets to platforms via integrations to platforms like GitHub, Vercel, and Netlify.
|
||||
- Rollback secrets to any point in time.
|
||||
- Rollback each secret to any version.
|
||||
- Track actions through activity logs.
|
||||
- Track actions through audit logs.
|
||||
|
||||
## CLI
|
||||
|
||||
|
BIN docs/images/email-socketlabs-credentials.png (new file, 315 KiB)
BIN docs/images/email-socketlabs-dashboard.png (new file, 468 KiB)
BIN docs/images/email-socketlabs-domains.png (new file, 332 KiB)
BIN docs/images/integrations-circleci-auth.png (new file, 162 KiB)
BIN docs/images/integrations-circleci-create.png (new file, 180 KiB)
Some files were not shown because too many files have changed in this diff.