Compare commits

..

115 Commits

SHA1 Message Date
73c7b917ab update secret rotation intro 2023-11-02 17:17:20 -04:00
a8470d2133 Merge pull request #1150 from akhilmhdh/fix/mrg-bug-fixes
fix: resolved error in org settings and integrations page alert hidde…
2023-11-02 12:44:50 -04:00
ca8fff320d Merge branch 'main' into fix/mrg-bug-fixes 2023-11-02 12:44:26 -04:00
d4a5eb12e8 Patch checkly integration 2023-11-02 11:54:01 +02:00
86d82737f4 Merge pull request #1145 from atimapreandrew/checkly-sync-on-group-level
Checkly sync on group level
2023-11-02 11:10:40 +02:00
abbeb67b95 fix: resolved error in org settings and integrations page alert hidden on no integrations 2023-11-02 14:39:13 +05:30
c0c96d6407 Update checkly integration docs 2023-11-02 10:00:43 +02:00
58ff6a43bc Update Checkly groups integration 2023-11-02 09:37:36 +02:00
079a09a3d1 Remove create new org 2023-11-01 14:27:46 -07:00
a07bd5ad40 add log to secretRotationQueue process 2023-11-01 16:49:31 -04:00
9cc99e41b8 Merge pull request #1117 from akhilmhdh/feat/secret-rotation
Secret rotation
2023-11-01 16:39:24 -04:00
f256493cb3 Merge remote-tracking branch 'origin' into checkly-sync-on-group-level 2023-11-01 20:37:15 +02:00
40238788e5 feat(secret-rotation): changed queue logging to pino 2023-11-01 22:24:58 +05:30
75eeda4278 feat(secret-rotation): changed to mysql2 client and refactored queue util functions 2023-11-01 22:22:46 +05:30
c1ea441e3a AJV set strict to false 2023-11-01 22:21:31 +05:30
8b522a3fb5 feat(secret-rotation): updated docs for secret rotation 2023-11-01 22:21:31 +05:30
c36352f05f feat(secret-rotation): updated helper text for options 2023-11-01 22:21:31 +05:30
2de898fdbd feat: backward compatiable enc key 2023-11-01 22:21:31 +05:30
bc68a00265 feat(secret-rotation): updated lottie and added side effect on successfully 2023-11-01 22:21:31 +05:30
1382688e58 add secretRotation to feature set 2023-11-01 22:21:31 +05:30
9248f36edb feat(secret-rotation): added db ssl option in test function 2023-11-01 22:21:31 +05:30
c9c40521b2 feat(secret-rotation): added db ssl support 2023-11-01 22:21:31 +05:30
97e4338335 feat(secret-rotation): implemented frontend ui for secret rotation 2023-11-01 22:21:31 +05:30
82e924baff feat(secret-rotation): implemented api and queue for secret rotation 2023-11-01 22:21:31 +05:30
2350219cc9 Merge pull request #1148 from Infisical/revert-transactions
Remove transactions from delete organization, workspace, user
2023-11-01 17:51:50 +02:00
28d7c72390 Merge remote-tracking branch 'origin' into revert-transactions 2023-11-01 16:41:32 +02:00
e7321e8060 Merge pull request #1146 from Infisical/pino
Replace winston with pino logging
2023-11-01 20:07:33 +05:30
28a2aebe67 chore: removed npx from pino-pretty 2023-11-01 20:06:02 +05:30
20d4f16d33 Move pino-pretty to dev-dep, dev script 2023-11-01 13:20:10 +02:00
75992e5566 Merge remote-tracking branch 'origin' into pino 2023-11-01 11:16:34 +02:00
7622a3f518 Remove transactions from delete organization, workspace, user 2023-11-01 11:06:36 +02:00
3b0bd362c9 Refactor requestErrorHandler, adjust request errors to appropriate pino log level 2023-11-01 10:10:39 +02:00
ad4513f926 Replace winston with pino 2023-10-31 15:03:36 +02:00
279958d54c Checkly group level sync support 2023-10-31 07:18:59 +01:00
1be46b5e57 Merge pull request #1085 from rtpa25/feat/create-multiple-orgs-under-same-account
feat: adds ability to create multiple orgs under the same account
2023-10-31 11:21:56 +05:30
98d9dd256b remove image name from k8 docs 2023-10-30 16:59:25 -04:00
e5eee14409 Merge pull request #1121 from Infisical/snyk-fix-9c5c22e2d4bdb58631063170328a0670
[Snyk] Security upgrade crypto-js from 4.1.1 to 4.2.0
2023-10-30 14:18:10 -04:00
f3c76c79ee Checkly group level sync support 2023-10-30 18:23:23 +01:00
5bdb6ad6a1 Merge pull request #1141 from akhilmhdh/fix/backward-enc-key
feat: backward compatiable enc key in webhook
2023-10-30 13:19:18 -04:00
c6846f8bf1 feat: updated backward compatiable enc key 2023-10-30 22:41:15 +05:30
46f03f33b0 fix approval plan typo 2023-10-30 12:48:06 -04:00
6280d7eb34 feat: backward compatiable enc key in webhook 2023-10-30 21:30:37 +05:30
29286d2125 Merge pull request #1138 from Shraeyas:develop
Fix bug with copying secret to clipboard with an override
2023-10-29 22:56:14 -04:00
c9f01ce086 Fix bug with copying secret to clipboard with an override 2023-10-29 21:25:36 +05:30
bc43e109eb update zod version for frontend 2023-10-27 18:06:59 -04:00
238c43a360 Merge pull request #1131 from akhilmhdh/fix/build-failing-ts
fix: standalone build failure due to ts error
2023-10-27 17:44:10 -04:00
040a50d599 Merge pull request #1132 from Tchoupinax/main
feat(helm-chart): repair usage of resources key
2023-10-27 17:42:48 -04:00
8a1a3e9ab9 chore(helm-chart): increase the version to 0.4.2 2023-10-27 20:45:35 +02:00
2585d50b29 feat(helm-chart): repair usage of resources key 2023-10-27 20:42:35 +02:00
4792e752c2 update dependency of rate limiter 2023-10-27 10:48:25 -04:00
1d161f6c97 fix: standalone build failure due to ts error 2023-10-27 20:05:51 +05:30
0d94b6deed Merge pull request #1130 from Infisical/revert-1129-revert-1128-mongodb-dep
Revert "Revert "Remove mongodb direct dependency from backend""
2023-10-27 10:26:03 -04:00
75428bb750 Revert "Revert "Remove mongodb direct dependency from backend"" 2023-10-27 10:24:49 -04:00
d90680cc91 Merge pull request #1129 from Infisical/revert-1128-mongodb-dep
Revert "Remove mongodb direct dependency from backend"
2023-10-27 10:21:04 -04:00
031c05b82d Revert "Remove mongodb direct dependency from backend" 2023-10-27 10:20:51 -04:00
ffc6dcdeb4 Merge pull request #1128 from Infisical/mongodb-dep
Remove mongodb direct dependency from backend
2023-10-27 15:19:01 +01:00
dfc74262ee Remove mongodb direct dependency from backend 2023-10-27 15:16:22 +01:00
59e46ef1d0 Merge pull request #1125 from akhilmhdh/fix/deep-main-page
fix: resolved nav header secret path issues
2023-10-27 10:01:57 -04:00
36e4cd71d3 Merge pull request #1127 from Infisical/update-node-saml
Update subdependencies, node-saml
2023-10-27 14:54:49 +01:00
d60b3d1598 Update subdependencies, node-saml 2023-10-27 14:52:22 +01:00
15504346cd fix: resolved nav header secret path issues 2023-10-27 12:16:37 +05:30
508ed7f7d6 Merge pull request #1124 from akhilmhdh/fix/folder-create-overview
fix:resolved overview page add secret not working when folder not exist in one level deep
2023-10-26 13:11:23 -04:00
c097e43a4e fix:resolved overview page add secret not working when folder not existing 2023-10-26 15:14:05 +05:30
40a9a15709 fix: backend/package.json & backend/package-lock.json to reduce vulnerabilities
The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-JS-CRYPTOJS-6028119
2023-10-25 16:41:52 +00:00
c0592ad904 remove cypress folder from root 2023-10-24 10:41:16 -04:00
32970e4990 Merge pull request #1107 from Infisical/cypress
adding cypress tests
2023-10-24 10:38:47 -04:00
619bbf2027 fix: fixes broken nonePage.tsx 2023-10-24 06:09:12 +05:30
1476d06b7e feat: adds cancel button and uses zod over yup 2023-10-24 06:02:20 +05:30
fb59b02ab4 Merge branch 'Infisical:main' into feat/create-multiple-orgs-under-same-account 2023-10-24 05:42:31 +05:30
fc3db93f8b Merge pull request #1102 from G3root/hasura-cloud
feat: add hasura cloud integration
2023-10-23 20:37:52 +01:00
120f1cb5dd Remove print statements, clean hasura cloud integration frontend 2023-10-23 20:06:35 +01:00
bb9b060fc0 Update syncSecretsHasuraCloud 2023-10-23 19:52:55 +01:00
26605638fa Merge pull request #1110 from techemmy/docs/add-REAMDE-for-contributing-to-the-docs
docs: add README file for instructions on how to get the doc started …
2023-10-23 14:52:17 +01:00
76758732af Merge pull request #1112 from Infisical/auth-jwt-standardization
API Key V2
2023-10-23 12:30:00 +01:00
827d5b25c2 Cleanup comments API Key V2 2023-10-23 12:18:44 +01:00
b32b19bcc1 Finish API Key V2 2023-10-23 11:58:16 +01:00
69b9881cbc docs: add README file for instructions on how to get the doc started in local development 2023-10-22 16:40:33 +01:00
1084323d6d Merge pull request #1014 from G3root/e2e-warning
feat: display warning message in integrations page when e2e is enabled
2023-10-22 14:42:15 +01:00
c98c45157a Merge branch 'main' into e2e-warning 2023-10-22 14:39:09 +01:00
6009dda2d2 Merge pull request #1105 from G3root/fix-batch-delete-integration
fix: batch deleting secrets not getting synced for integrations
2023-10-21 14:42:23 +05:30
d4e8162c41 fix: sync deleted secret 2023-10-21 01:39:08 +05:30
f6ad641858 chore: add logs 2023-10-21 01:38:41 +05:30
32acc370a4 feat: add delete method 2023-10-21 01:36:23 +05:30
ba9b1b45ae update docker docs for self host 2023-10-20 13:36:09 +01:00
e05b26c727 Add docs for Hasura Cloud 2023-10-20 11:25:06 +01:00
e22557b4bb Merge pull request #1088 from adelowo/support_path_when_generating_sample_env
[ENG-179] Add suport for --path for the secrets generate-example-env command
2023-10-20 11:10:12 +01:00
cbbb12c74e Merge pull request #1099 from Infisical/jwt-refactor
Update JWT secret scheme, replace many secrets with one secret
2023-10-20 11:02:58 +01:00
60beda604f Merge branch 'jwt-refactor' of https://github.com/Infisical/infisical into jwt-refactor 2023-10-20 10:55:40 +01:00
ae50987f91 Default AUTH_SECRET to JWT_AUTH_SECRET for backwards compatibility 2023-10-20 10:55:29 +01:00
32977e06f8 add warning text for .env.example 2023-10-20 10:38:42 +01:00
4d78f4a824 feat: add create page 2023-10-20 13:22:40 +05:30
47bf483c2e feat: add logo 2023-10-20 13:20:43 +05:30
40e5ecfd7d feat: add sync 2023-10-20 13:20:00 +05:30
0fb0744f09 feat: add get apps 2023-10-20 13:18:26 +05:30
058712e8ec Update JWT secret scheme, replace many secrets with one secret 2023-10-19 15:53:36 +01:00
e13b3f72b1 feat: add authorize page 2023-10-18 23:08:13 +05:30
a6e02238ad feat: add hasura cloud 2023-10-18 22:40:34 +05:30
ebe4f70b51 docs: add hasura cloud integration 2023-10-18 22:37:31 +05:30
c3c7316ec0 feat: add to redirect provider 2023-10-18 22:15:48 +05:30
2cd791a433 feat: add integration page 2023-10-18 21:51:35 +05:30
912818eec8 Merge branch 'main' into feat/create-multiple-orgs-under-same-account 2023-10-16 18:55:48 -07:00
e0dfb2548f update flag help text 2023-10-15 02:55:29 +01:00
01997a5187 support --path when generating sample env files 2023-10-15 02:50:41 +01:00
840eef7bce feat: improves the flow of account creation 2023-10-13 11:05:24 +05:30
70b9d435d1 feat: adds ability to create multiple orgs under the same account 2023-10-13 10:42:54 +05:30
9546916aad fix: add props 2023-09-30 17:12:52 +05:30
59c861c695 fix: rename variants 2023-09-30 17:07:52 +05:30
2eff06cf06 fix: alert component styles 2023-09-22 23:20:45 +05:30
a024eecf2c chore: remove utils 2023-09-22 23:10:46 +05:30
a2ad9e10b4 chore: enable prop types 2023-09-22 22:37:36 +05:30
7fa4e09874 feat: use alert component 2023-09-20 23:09:03 +05:30
20c4e956aa feat: add warnings 2023-09-19 17:25:37 +05:30
4a227d05ce feat: add className utility 2023-09-19 17:25:03 +05:30
6f57ef03d1 feat: add alert component 2023-09-19 17:24:33 +05:30
257b4b0490 chore: disable prop-types rule 2023-09-19 17:08:54 +05:30
189 changed files with 8577 additions and 4031 deletions

View File

@ -1,24 +1,12 @@
# Keys
# Required key for platform encryption/decryption ops
# THIS IS A SAMPLE ENCRYPTION KEY AND SHOULD NOT BE USED FOR PRODUCTION
# THIS IS A SAMPLE ENCRYPTION KEY AND SHOULD NEVER BE USED FOR PRODUCTION
ENCRYPTION_KEY=6c1fe4e407b8911c104518103505b218
# JWT
# Required secrets to sign JWT tokens
JWT_SIGNUP_SECRET=3679e04ca949f914c03332aaaeba805a
JWT_REFRESH_SECRET=5f2f3c8f0159068dc2bbb3a652a716ff
JWT_AUTH_SECRET=4be6ba5602e0fa0ac6ac05c3cd4d247f
JWT_SERVICE_SECRET=f32f716d70a42c5703f4656015e76200
JWT_SERVICE_TOKEN_SECRET=f32f716d70a42c5703f4656015e76200
JWT_PROVIDER_AUTH_SECRET=f32f716d70a42c5703f4656015e76201
# JWT lifetime
# Optional lifetimes for JWT tokens expressed in seconds or a string
# describing a time span (e.g. 60, "2 days", "10h", "7d")
JWT_AUTH_LIFETIME=
JWT_REFRESH_LIFETIME=
JWT_SIGNUP_LIFETIME=
JWT_PROVIDER_AUTH_LIFETIME=
# THIS IS A SAMPLE AUTH_SECRET KEY AND SHOULD NEVER BE USED FOR PRODUCTION
AUTH_SECRET=5lrMXKKWCVocS/uerPsl7V+TX/aaUaI7iDkgl3tSmLE=
# MongoDB
# Backend will connect to the MongoDB instance at connection string MONGO_URL which can either be a ref
@ -68,5 +56,12 @@ SENTRY_DSN=
POSTHOG_HOST=
POSTHOG_PROJECT_API_KEY=
CLIENT_ID_GOOGLE=
CLIENT_SECRET_GOOGLE=
# SSO-specific variables
CLIENT_ID_GOOGLE_LOGIN=
CLIENT_SECRET_GOOGLE_LOGIN=
CLIENT_ID_GITHUB_LOGIN=
CLIENT_SECRET_GITHUB_LOGIN=
CLIENT_ID_GITLAB_LOGIN=
CLIENT_SECRET_GITLAB_LOGIN=
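
The sample file above consolidates the per-purpose JWT secrets into a single AUTH_SECRET. As a minimal sketch (not part of this change set), values in the same formats as the samples can be generated with Node's crypto module; the key lengths are inferred from the sample values, not stated anywhere in the diff:

import { randomBytes } from "crypto";

// ENCRYPTION_KEY in the sample is 32 hex characters (16 random bytes, hex-encoded).
const encryptionKey = randomBytes(16).toString("hex");

// AUTH_SECRET in the sample is 32 random bytes, base64-encoded.
const authSecret = randomBytes(32).toString("base64");

console.log(`ENCRYPTION_KEY=${encryptionKey}`);
console.log(`AUTH_SECRET=${authSecret}`);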

backend/package-lock.json (generated, 3262 changed lines)

File diff suppressed because it is too large.

View File

@ -6,11 +6,12 @@
"@godaddy/terminus": "^4.12.0",
"@node-saml/passport-saml": "^4.0.4",
"@octokit/rest": "^19.0.5",
"@sentry/node": "^7.49.0",
"@sentry/node": "^7.77.0",
"@sentry/tracing": "^7.48.0",
"@types/crypto-js": "^4.1.1",
"@types/libsodium-wrappers": "^0.7.10",
"@ucast/mongo2js": "^1.3.4",
"ajv": "^8.12.0",
"argon2": "^0.30.3",
"aws-sdk": "^2.1364.0",
"axios": "^1.3.5",
@ -19,7 +20,7 @@
"bigint-conversion": "^2.4.0",
"cookie-parser": "^1.4.6",
"cors": "^2.8.5",
"crypto-js": "^4.1.1",
"crypto-js": "^4.2.0",
"dotenv": "^16.0.1",
"express": "^4.18.1",
"express-async-errors": "^3.1.1",
@ -29,13 +30,14 @@
"helmet": "^5.1.1",
"infisical-node": "^1.2.1",
"ioredis": "^5.3.2",
"jmespath": "^0.16.0",
"js-yaml": "^4.1.0",
"jsonwebtoken": "^9.0.0",
"jsrp": "^0.2.4",
"libsodium-wrappers": "^0.7.10",
"lodash": "^4.17.21",
"mongodb": "^5.7.0",
"mongoose": "^7.4.1",
"mysql2": "^3.6.2",
"nanoid": "^3.3.6",
"node-cache": "^5.1.2",
"nodemailer": "^6.8.0",
@ -43,6 +45,9 @@
"passport-github": "^1.1.0",
"passport-gitlab2": "^5.0.0",
"passport-google-oauth20": "^2.0.0",
"pino": "^8.16.1",
"pino-http": "^8.5.1",
"pg": "^8.11.3",
"posthog-node": "^2.6.0",
"probot": "^12.3.1",
"query-string": "^7.1.3",
@ -53,16 +58,19 @@
"tweetnacl-util": "^0.15.1",
"typescript": "^4.9.3",
"utility-types": "^3.10.0",
"winston": "^3.8.2",
"winston-loki": "^6.0.6",
"zod": "^3.22.3"
},
"overrides": {
"rate-limit-mongo": {
"mongodb": "5.8.0"
}
},
"name": "infisical-api",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node build/index.js",
"dev": "nodemon",
"dev": "nodemon index.js | pino-pretty --colorize",
"swagger-autogen": "node ./swagger/index.ts",
"build": "rimraf ./build && tsc && cp -R ./src/templates ./build && cp -R ./src/data ./build",
"lint": "eslint . --ext .ts",
@ -94,12 +102,15 @@
"@types/cors": "^2.8.12",
"@types/express": "^4.17.14",
"@types/jest": "^29.5.0",
"@types/jmespath": "^0.15.1",
"@types/jsonwebtoken": "^8.5.9",
"@types/lodash": "^4.14.191",
"@types/node": "^18.11.3",
"@types/nodemailer": "^6.4.6",
"@types/passport": "^1.0.12",
"@types/pg": "^8.10.7",
"@types/picomatch": "^2.3.0",
"@types/pino": "^7.0.5",
"@types/supertest": "^2.0.12",
"@types/swagger-jsdoc": "^6.0.1",
"@types/swagger-ui-express": "^4.1.3",
@ -113,6 +124,7 @@
"jest-junit": "^15.0.0",
"nodemon": "^2.0.19",
"npm": "^8.19.3",
"pino-pretty": "^10.2.3",
"smee-client": "^1.2.3",
"supertest": "^6.3.3",
"swagger-autogen": "^2.23.5",

View File

@ -17,17 +17,13 @@ export const getRootEncryptionKey = async () => {
}
export const getInviteOnlySignup = async () => (await client.getSecret("INVITE_ONLY_SIGNUP")).secretValue === "true"
export const getSaltRounds = async () => parseInt((await client.getSecret("SALT_ROUNDS")).secretValue) || 10;
export const getAuthSecret = async () => (await client.getSecret("JWT_AUTH_SECRET")).secretValue ?? (await client.getSecret("AUTH_SECRET")).secretValue;
export const getJwtAuthLifetime = async () => (await client.getSecret("JWT_AUTH_LIFETIME")).secretValue || "10d";
export const getJwtAuthSecret = async () => (await client.getSecret("JWT_AUTH_SECRET")).secretValue;
export const getJwtMfaLifetime = async () => (await client.getSecret("JWT_MFA_LIFETIME")).secretValue || "5m";
export const getJwtMfaSecret = async () => (await client.getSecret("JWT_MFA_SECRET")).secretValue;
export const getJwtRefreshLifetime = async () => (await client.getSecret("JWT_REFRESH_LIFETIME")).secretValue || "90d";
export const getJwtRefreshSecret = async () => (await client.getSecret("JWT_REFRESH_SECRET")).secretValue;
export const getJwtServiceSecret = async () => (await client.getSecret("JWT_SERVICE_SECRET")).secretValue;
export const getJwtServiceSecret = async () => (await client.getSecret("JWT_SERVICE_SECRET")).secretValue; // TODO: deprecate (related to ST V1)
export const getJwtSignupLifetime = async () => (await client.getSecret("JWT_SIGNUP_LIFETIME")).secretValue || "15m";
export const getJwtProviderAuthSecret = async () => (await client.getSecret("JWT_PROVIDER_AUTH_SECRET")).secretValue;
export const getJwtProviderAuthLifetime = async () => (await client.getSecret("JWT_PROVIDER_AUTH_LIFETIME")).secretValue || "15m";
export const getJwtSignupSecret = async () => (await client.getSecret("JWT_SIGNUP_SECRET")).secretValue;
export const getJwtServiceTokenSecret = async () => (await client.getSecret("JWT_SERVICE_TOKEN_SECRET")).secretValue;
export const getMongoURL = async () => (await client.getSecret("MONGO_URL")).secretValue;
export const getNodeEnv = async () => (await client.getSecret("NODE_ENV")).secretValue || "production";

View File

@ -6,15 +6,18 @@ const jsrp = require("jsrp");
import { LoginSRPDetail, TokenVersion, User } from "../../models";
import { clearTokens, createToken, issueAuthTokens } from "../../helpers/auth";
import { checkUserDevice } from "../../helpers/user";
import { ACTION_LOGIN, ACTION_LOGOUT } from "../../variables";
import {
ACTION_LOGIN,
ACTION_LOGOUT,
AuthTokenType
} from "../../variables";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
import { EELogService } from "../../ee/services";
import { getUserAgentType } from "../../utils/posthog";
import {
getAuthSecret,
getHttpsEnabled,
getJwtAuthLifetime,
getJwtAuthSecret,
getJwtRefreshSecret
getJwtAuthLifetime
} from "../../config";
import { ActorType } from "../../ee/models";
import { validateRequest } from "../../helpers/validation";
@ -238,6 +241,7 @@ export const checkAuth = async (req: Request, res: Response) => {
* @returns
*/
export const getNewToken = async (req: Request, res: Response) => {
const refreshToken = req.cookies.jid;
if (!refreshToken)
@ -245,7 +249,9 @@ export const getNewToken = async (req: Request, res: Response) => {
message: "Failed to find refresh token in request cookies"
});
const decodedToken = <jwt.UserIDJwtPayload>jwt.verify(refreshToken, await getJwtRefreshSecret());
const decodedToken = <jwt.UserIDJwtPayload>jwt.verify(refreshToken, await getAuthSecret());
if (decodedToken.authTokenType !== AuthTokenType.REFRESH_TOKEN) throw UnauthorizedRequestError();
const user = await User.findOne({
_id: decodedToken.userId
@ -268,12 +274,13 @@ export const getNewToken = async (req: Request, res: Response) => {
const token = createToken({
payload: {
authTokenType: AuthTokenType.ACCESS_TOKEN,
userId: decodedToken.userId,
tokenVersionId: tokenVersion._id.toString(),
accessVersion: tokenVersion.refreshVersion
},
expiresIn: await getJwtAuthLifetime(),
secret: await getJwtAuthSecret()
secret: await getAuthSecret()
});
return res.status(200).send({
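
The controllers above import an AuthTokenType enum from ../../variables, stamp it into new token payloads, and check it after jwt.verify. The enum definition itself is not part of this compare; the following is a sketch inferred from usage, and the string values are assumptions:

// Sketch only: member names are taken from usage in this compare, string values are assumed.
export enum AuthTokenType {
  ACCESS_TOKEN = "accessToken",
  REFRESH_TOKEN = "refreshToken",
  SIGNUP_TOKEN = "signupToken",
  MFA_TOKEN = "mfaToken",
  API_KEY = "apiKey"
}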

View File

@ -10,6 +10,7 @@ import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_UTF8,
INTEGRATION_BITBUCKET_API_URL,
INTEGRATION_CHECKLY_API_URL,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_NORTHFLANK_API_URL,
INTEGRATION_QOVERY_API_URL,
@ -28,7 +29,6 @@ import {
} from "../../ee/services/ProjectRoleService";
import { ForbiddenError } from "@casl/ability";
import { getIntegrationAuthAccessHelper } from "../../helpers";
import { ObjectId } from "mongodb";
/***
* Return integration authorization with id [integrationAuthId]
@ -222,7 +222,7 @@ export const getIntegrationAuthApps = async (req: Request, res: Response) => {
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken, accessId } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -260,7 +260,7 @@ export const getIntegrationAuthTeams = async (req: Request, res: Response) => {
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -296,7 +296,7 @@ export const getIntegrationAuthVercelBranches = async (req: Request, res: Respon
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -345,6 +345,59 @@ export const getIntegrationAuthVercelBranches = async (req: Request, res: Respon
});
};
/**
* Return list of Checkly groups for a specific user
* @param req
* @param res
*/
export const getIntegrationAuthChecklyGroups = async (req: Request, res: Response) => {
const {
params: { integrationAuthId },
query: { accountId }
} = await validateRequest(reqValidator.GetIntegrationAuthChecklyGroupsV1, req);
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
req.user._id,
integrationAuth.workspace.toString()
);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Read,
ProjectPermissionSub.Integrations
);
interface ChecklyGroup {
id: number;
name: string;
}
if (accountId && accountId !== "") {
const { data }: { data: ChecklyGroup[] } = (
await standardRequest.get(`${INTEGRATION_CHECKLY_API_URL}/v1/check-groups`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"X-Checkly-Account": accountId
}
})
);
return res.status(200).send({
groups: data.map((g: ChecklyGroup) => ({
name: g.name,
groupId: g.id,
}))
});
}
return res.status(200).send({
groups: []
});
}
/**
* Return list of Qovery Orgs for a specific user
* @param req
@ -357,7 +410,7 @@ export const getIntegrationAuthQoveryOrgs = async (req: Request, res: Response)
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -409,7 +462,7 @@ export const getIntegrationAuthQoveryProjects = async (req: Request, res: Respon
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -470,7 +523,7 @@ export const getIntegrationAuthQoveryEnvironments = async (req: Request, res: Re
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -531,7 +584,7 @@ export const getIntegrationAuthQoveryApps = async (req: Request, res: Response)
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -592,7 +645,7 @@ export const getIntegrationAuthQoveryContainers = async (req: Request, res: Resp
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -653,7 +706,7 @@ export const getIntegrationAuthQoveryJobs = async (req: Request, res: Response)
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -715,7 +768,7 @@ export const getIntegrationAuthRailwayEnvironments = async (req: Request, res: R
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -808,7 +861,7 @@ export const getIntegrationAuthRailwayServices = async (req: Request, res: Respo
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -932,7 +985,7 @@ export const getIntegrationAuthBitBucketWorkspaces = async (req: Request, res: R
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -988,7 +1041,7 @@ export const getIntegrationAuthNorthflankSecretGroups = async (req: Request, res
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -1076,7 +1129,7 @@ export const getIntegrationAuthTeamCityBuildConfigs = async (req: Request, res:
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
@ -1145,7 +1198,7 @@ export const deleteIntegrationAuth = async (req: Request, res: Response) => {
// TODO(akhilmhdh): remove class -> static function path and makes these into reusable independent functions
const { integrationAuth, accessToken } = await getIntegrationAuthAccessHelper({
integrationAuthId: new ObjectId(integrationAuthId)
integrationAuthId: new Types.ObjectId(integrationAuthId)
});
const { permission } = await getUserProjectPermissions(
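
For reference, the new getIntegrationAuthChecklyGroups handler above reduces the Checkly check groups to a name/groupId pair, or returns an empty list when no accountId query parameter is supplied. An illustrative response body (values are hypothetical):

// Illustrative only; group names and IDs are hypothetical.
const exampleResponse = {
  groups: [
    { name: "Production checks", groupId: 12345 },
    { name: "Staging checks", groupId: 67890 }
  ]
};

// With accountId missing or empty, the handler responds with { groups: [] }.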

View File

@ -8,11 +8,11 @@ import { updateSubscriptionOrgQuantity } from "../../helpers/organization";
import { sendMail } from "../../helpers/nodemailer";
import { TokenService } from "../../services";
import { EELicenseService } from "../../ee/services";
import { ACCEPTED, INVITED, MEMBER, TOKEN_EMAIL_ORG_INVITATION } from "../../variables";
import { ACCEPTED, AuthTokenType, INVITED, MEMBER, TOKEN_EMAIL_ORG_INVITATION } from "../../variables";
import * as reqValidator from "../../validation/membershipOrg";
import {
getAuthSecret,
getJwtSignupLifetime,
getJwtSignupSecret,
getSiteURL,
getSmtpConfigured
} from "../../config";
@ -272,10 +272,11 @@ export const verifyUserToOrganization = async (req: Request, res: Response) => {
// generate temporary signup token
const token = createToken({
payload: {
authTokenType: AuthTokenType.SIGNUP_TOKEN,
userId: user._id.toString()
},
expiresIn: await getJwtSignupLifetime(),
secret: await getJwtSignupSecret()
secret: await getAuthSecret()
});
return res.status(200).send({

View File

@ -5,12 +5,12 @@ import * as bigintConversion from "bigint-conversion";
import { BackupPrivateKey, LoginSRPDetail, User } from "../../models";
import { clearTokens, createToken, sendMail } from "../../helpers";
import { TokenService } from "../../services";
import { TOKEN_EMAIL_PASSWORD_RESET } from "../../variables";
import { AuthTokenType, TOKEN_EMAIL_PASSWORD_RESET } from "../../variables";
import { BadRequestError } from "../../utils/errors";
import {
getAuthSecret,
getHttpsEnabled,
getJwtSignupLifetime,
getJwtSignupSecret,
getSiteURL
} from "../../config";
import { ActorType } from "../../ee/models";
@ -88,10 +88,11 @@ export const emailPasswordResetVerify = async (req: Request, res: Response) => {
// generate temporary password-reset token
const token = createToken({
payload: {
authTokenType: AuthTokenType.SIGNUP_TOKEN,
userId: user._id.toString()
},
expiresIn: await getJwtSignupLifetime(),
secret: await getJwtSignupSecret()
secret: await getAuthSecret()
});
return res.status(200).send({

View File

@ -4,14 +4,15 @@ import { checkEmailVerification, sendEmailVerification } from "../../helpers/sig
import { createToken } from "../../helpers/auth";
import { BadRequestError } from "../../utils/errors";
import {
getAuthSecret,
getInviteOnlySignup,
getJwtSignupLifetime,
getJwtSignupSecret,
getSmtpConfigured
} from "../../config";
import { validateUserEmail } from "../../validation";
import { validateRequest } from "../../helpers/validation";
import * as reqValidator from "../../validation/auth";
import { AuthTokenType } from "../../variables";
/**
* Signup step 1: Initialize account for user under email [email] and send a verification code
@ -95,10 +96,11 @@ export const verifyEmailSignup = async (req: Request, res: Response) => {
// generate temporary signup token
const token = createToken({
payload: {
authTokenType: AuthTokenType.SIGNUP_TOKEN,
userId: user._id.toString()
},
expiresIn: await getJwtSignupLifetime(),
secret: await getJwtSignupSecret()
secret: await getAuthSecret()
});
return res.status(200).send({

View File

@ -1,12 +1,16 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { client, getRootEncryptionKey } from "../../config";
import { client, getEncryptionKey, getRootEncryptionKey } from "../../config";
import { Webhook } from "../../models";
import { getWebhookPayload, triggerWebhookRequest } from "../../services/WebhookService";
import { BadRequestError, ResourceNotFoundError } from "../../utils/errors";
import { EEAuditLogService } from "../../ee/services";
import { EventType } from "../../ee/models";
import { ALGORITHM_AES_256_GCM, ENCODING_SCHEME_BASE64 } from "../../variables";
import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8
} from "../../variables";
import { validateRequest } from "../../helpers/validation";
import * as reqValidator from "../../validation/webhooks";
import {
@ -15,6 +19,7 @@ import {
getUserProjectPermissions
} from "../../ee/services/ProjectRoleService";
import { ForbiddenError } from "@casl/ability";
import { encryptSymmetric128BitHexKeyUTF8 } from "../../utils/crypto";
export const createWebhook = async (req: Request, res: Response) => {
const {
@ -31,17 +36,31 @@ export const createWebhook = async (req: Request, res: Response) => {
workspace: workspaceId,
environment,
secretPath,
url: webhookUrl,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_BASE64
url: webhookUrl
});
if (webhookSecretKey) {
const encryptionKey = await getEncryptionKey();
const rootEncryptionKey = await getRootEncryptionKey();
const { ciphertext, iv, tag } = client.encryptSymmetric(webhookSecretKey, rootEncryptionKey);
webhook.iv = iv;
webhook.tag = tag;
webhook.encryptedSecretKey = ciphertext;
if (rootEncryptionKey) {
const { ciphertext, iv, tag } = client.encryptSymmetric(webhookSecretKey, rootEncryptionKey);
webhook.iv = iv;
webhook.tag = tag;
webhook.encryptedSecretKey = ciphertext;
webhook.algorithm = ALGORITHM_AES_256_GCM;
webhook.keyEncoding = ENCODING_SCHEME_BASE64;
} else if (encryptionKey) {
const { ciphertext, iv, tag } = encryptSymmetric128BitHexKeyUTF8({
plaintext: webhookSecretKey,
key: encryptionKey
});
webhook.iv = iv;
webhook.tag = tag;
webhook.encryptedSecretKey = ciphertext;
webhook.algorithm = ALGORITHM_AES_256_GCM;
webhook.keyEncoding = ENCODING_SCHEME_UTF8;
}
}
await webhook.save();

View File

@ -10,9 +10,9 @@ import { sendMail } from "../../helpers/nodemailer";
import { TokenService } from "../../services";
import { EELogService } from "../../ee/services";
import { BadRequestError, InternalServerError } from "../../utils/errors";
import { ACTION_LOGIN, TOKEN_EMAIL_MFA } from "../../variables";
import { ACTION_LOGIN, AuthTokenType, TOKEN_EMAIL_MFA } from "../../variables";
import { getUserAgentType } from "../../utils/posthog"; // TODO: move this
import { getHttpsEnabled, getJwtMfaLifetime, getJwtMfaSecret } from "../../config";
import { getAuthSecret, getHttpsEnabled, getJwtMfaLifetime } from "../../config";
import { validateRequest } from "../../helpers/validation";
import * as reqValidator from "../../validation/auth";
@ -109,10 +109,11 @@ export const login2 = async (req: Request, res: Response) => {
// generate temporary MFA token
const token = createToken({
payload: {
authTokenType: AuthTokenType.MFA_TOKEN,
userId: user._id.toString()
},
expiresIn: await getJwtMfaLifetime(),
secret: await getJwtMfaSecret()
secret: await getAuthSecret()
});
const code = await TokenService.createToken({

View File

@ -11,7 +11,6 @@ import {
ValidationError as RouteValidationError,
UnauthorizedRequestError
} from "../../utils/errors";
import { AnyBulkWriteOperation } from "mongodb";
import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_UTF8,
@ -19,7 +18,7 @@ import {
SECRET_SHARED
} from "../../variables";
import { TelemetryService } from "../../services";
import { ISecret, Secret, User } from "../../models";
import { Secret, User } from "../../models";
import { AccountNotFoundError } from "../../utils/errors";
/**
@ -145,22 +144,22 @@ export const deleteSecrets = async (req: Request, res: Response) => {
const secretsUserCanDeleteSet: Set<string> = new Set(
secretIdsUserCanDelete.map((objectId) => objectId._id.toString())
);
const deleteOperationsToPerform: AnyBulkWriteOperation<ISecret>[] = [];
let numSecretsDeleted = 0;
secretIdsToDelete.forEach((secretIdToDelete) => {
if (secretsUserCanDeleteSet.has(secretIdToDelete)) {
const deleteOperation = {
deleteOne: { filter: { _id: new Types.ObjectId(secretIdToDelete) } }
};
deleteOperationsToPerform.push(deleteOperation);
numSecretsDeleted++;
} else {
throw RouteValidationError({
message: "You cannot delete secrets that you do not have access to"
});
}
});
// Filter out IDs that user can delete and then map them to delete operations
const deleteOperationsToPerform = secretIdsToDelete
.filter(secretIdToDelete => {
if (!secretsUserCanDeleteSet.has(secretIdToDelete)) {
throw RouteValidationError({
message: "You cannot delete secrets that you do not have access to"
});
}
return true;
})
.map(secretIdToDelete => ({
deleteOne: { filter: { _id: new Types.ObjectId(secretIdToDelete) } }
}));
const numSecretsDeleted = deleteOperationsToPerform.length;
await Secret.bulkWrite(deleteOperationsToPerform);
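
The refactor above builds the Mongoose bulkWrite payload with a filter/map pass instead of a forEach with side effects. For two deletable IDs, the resulting array has this shape (the IDs here are placeholders, not from this change set):

import { Types } from "mongoose";

// Placeholder IDs for illustration only.
const deleteOperationsToPerform = [
  { deleteOne: { filter: { _id: new Types.ObjectId("507f1f77bcf86cd799439011") } } },
  { deleteOne: { filter: { _id: new Types.ObjectId("507f191e810c19729de860ea") } } }
];

// numSecretsDeleted is then deleteOperationsToPerform.length, i.e. 2 here.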

View File

@ -10,9 +10,9 @@ import { sendMail } from "../../helpers/nodemailer";
import { TokenService } from "../../services";
import { EELogService } from "../../ee/services";
import { BadRequestError, InternalServerError } from "../../utils/errors";
import { ACTION_LOGIN, TOKEN_EMAIL_MFA } from "../../variables";
import { ACTION_LOGIN, AuthTokenType, TOKEN_EMAIL_MFA } from "../../variables";
import { getUserAgentType } from "../../utils/posthog"; // TODO: move this
import { getHttpsEnabled, getJwtMfaLifetime, getJwtMfaSecret } from "../../config";
import { getAuthSecret, getHttpsEnabled, getJwtMfaLifetime } from "../../config";
import { AuthMethod } from "../../models/user";
import { validateRequest } from "../../helpers/validation";
import * as reqValidator from "../../validation/auth";
@ -134,10 +134,11 @@ export const login2 = async (req: Request, res: Response) => {
// generate temporary MFA token
const token = createToken({
payload: {
authTokenType: AuthTokenType.MFA_TOKEN,
userId: user._id.toString()
},
expiresIn: await getJwtMfaLifetime(),
secret: await getJwtMfaSecret()
secret: await getAuthSecret()
});
const code = await TokenService.createToken({

View File

@ -1,9 +1,11 @@
import * as usersController from "./usersController";
import * as secretsController from "./secretsController";
import * as workspacesController from "./workspacesController";
import * as authController from "./authController";
import * as signupController from "./signupController";
export {
usersController,
authController,
secretsController,
signupController,

View File

@ -140,7 +140,7 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
query: { secretPath, environment, workspaceId }
} = validatedData;
const {
query: { folderId, include_imports: includeImports }
query: { include_imports: includeImports }
} = validatedData;
// if the service token has single scope, it will get all secrets for that scope by default
@ -156,13 +156,6 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
workspaceId = serviceTokenDetails.workspace.toString();
}
if (folderId && folderId !== "root") {
const folder = await Folder.findOne({ workspace: workspaceId, environment });
if (!folder) throw BadRequestError({ message: "Folder not found" });
secretPath = getFolderWithPathFromId(folder.nodes, folderId).folderPath;
}
if (!environment || !workspaceId)
throw BadRequestError({ message: "Missing environment or workspace id" });
@ -177,7 +170,6 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
const secrets = await SecretService.getSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
folderId,
secretPath,
authData: req.authData
});
@ -467,20 +459,13 @@ export const deleteSecretByNameRaw = async (req: Request, res: Response) => {
export const getSecrets = async (req: Request, res: Response) => {
const validatedData = await validateRequest(reqValidator.GetSecretsV3, req);
const {
query: { environment, workspaceId, include_imports: includeImports, folderId }
query: { environment, workspaceId, include_imports: includeImports }
} = validatedData;
let {
query: { secretPath }
} = validatedData;
if (folderId && folderId !== "root") {
const folder = await Folder.findOne({ workspace: workspaceId, environment });
if (!folder) return res.send({ secrets: [] });
secretPath = getFolderWithPathFromId(folder.nodes, folderId).folderPath;
}
const { authVerifier: permissionCheckFn } = await checkSecretsPermission({
authData: req.authData,
workspaceId,
@ -492,7 +477,6 @@ export const getSecrets = async (req: Request, res: Response) => {
const secrets = await SecretService.getSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
folderId,
secretPath,
authData: req.authData
});
@ -875,6 +859,14 @@ export const createSecretByNameBatch = async (req: Request, res: Response) => {
authData: req.authData
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
secretPath
})
});
return res.status(200).send({
secrets: createdSecrets
});
@ -919,6 +911,14 @@ export const updateSecretByNameBatch = async (req: Request, res: Response) => {
authData: req.authData
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
secretPath
})
});
return res.status(200).send({
secrets: updatedSecrets
});
@ -963,6 +963,14 @@ export const deleteSecretByNameBatch = async (req: Request, res: Response) => {
authData: req.authData
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
secretPath
})
});
return res.status(200).send({
secrets: deletedSecrets
});

View File

@ -5,10 +5,10 @@ import { MembershipOrg, User } from "../../models";
import { completeAccount } from "../../helpers/user";
import { initializeDefaultOrg } from "../../helpers/signup";
import { issueAuthTokens, validateProviderAuthToken } from "../../helpers/auth";
import { ACCEPTED, INVITED } from "../../variables";
import { ACCEPTED, AuthTokenType, INVITED } from "../../variables";
import { standardRequest } from "../../config/request";
import { getHttpsEnabled, getJwtSignupSecret, getLoopsApiKey } from "../../config";
import { BadRequestError } from "../../utils/errors";
import { getAuthSecret, getHttpsEnabled, getLoopsApiKey } from "../../config";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
import { TelemetryService } from "../../services";
import { AuthMethod } from "../../models";
import { validateRequest } from "../../helpers/validation";
@ -78,12 +78,11 @@ export const completeAccountSignup = async (req: Request, res: Response) => {
}
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(AUTH_TOKEN_VALUE, await getJwtSignupSecret())
jwt.verify(AUTH_TOKEN_VALUE, await getAuthSecret())
);
if (decodedToken.userId !== user.id) {
throw BadRequestError();
}
if (decodedToken.authTokenType !== AuthTokenType.SIGNUP_TOKEN) throw UnauthorizedRequestError();
if (decodedToken.userId !== user.id) throw UnauthorizedRequestError();
}
// complete setting up user's account

View File

@ -0,0 +1,18 @@
import { Request, Response } from "express";
import { APIKeyDataV2 } from "../../models";
/**
* Return API keys belonging to current user.
* @param req
* @param res
* @returns
*/
export const getMyAPIKeys = async (req: Request, res: Response) => {
const apiKeyData = await APIKeyDataV2.find({
user: req.user._id
});
return res.status(200).send({
apiKeyData
});
}

View File

@ -10,6 +10,8 @@ import * as cloudProductsController from "./cloudProductsController";
import * as roleController from "./roleController";
import * as secretApprovalPolicyController from "./secretApprovalPolicyController";
import * as secretApprovalRequestController from "./secretApprovalRequestsController";
import * as secretRotationProviderController from "./secretRotationProviderController";
import * as secretRotationController from "./secretRotationController";
export {
secretController,
@ -23,5 +25,7 @@ export {
cloudProductsController,
roleController,
secretApprovalPolicyController,
secretApprovalRequestController
secretApprovalRequestController,
secretRotationProviderController,
secretRotationController
};

View File

@ -212,12 +212,13 @@ export const getUserPermissions = async (req: Request, res: Response) => {
const {
params: { orgId }
} = await validateRequest(GetUserPermission, req);
const { permission } = await getUserOrgPermissions(req.user._id, orgId);
const { permission, membership } = await getUserOrgPermissions(req.user._id, orgId);
res.status(200).json({
data: {
permissions: packRules(permission.rules)
permissions: packRules(permission.rules),
membership
}
});
};
@ -226,11 +227,12 @@ export const getUserWorkspacePermissions = async (req: Request, res: Response) =
const {
params: { workspaceId }
} = await validateRequest(GetUserProjectPermission, req);
const { permission } = await getUserProjectPermissions(req.user._id, workspaceId);
const { permission, membership } = await getUserProjectPermissions(req.user._id, workspaceId);
res.status(200).json({
data: {
permissions: packRules(permission.rules)
permissions: packRules(permission.rules),
membership
}
});
};

View File

@ -0,0 +1,91 @@
import { Request, Response } from "express";
import { validateRequest } from "../../../helpers/validation";
import * as reqValidator from "../../validation/secretRotation";
import * as secretRotationService from "../../secretRotation/service";
import {
getUserProjectPermissions,
ProjectPermissionActions,
ProjectPermissionSub
} from "../../services/ProjectRoleService";
import { ForbiddenError } from "@casl/ability";
export const createSecretRotation = async (req: Request, res: Response) => {
const {
body: {
provider,
customProvider,
interval,
outputs,
secretPath,
environment,
workspaceId,
inputs
}
} = await validateRequest(reqValidator.createSecretRotationV1, req);
const { permission } = await getUserProjectPermissions(req.user._id, workspaceId);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Create,
ProjectPermissionSub.SecretRotation
);
const secretRotation = await secretRotationService.createSecretRotation({
workspaceId,
inputs,
environment,
secretPath,
outputs,
interval,
customProvider,
provider
});
return res.send({ secretRotation });
};
export const restartSecretRotations = async (req: Request, res: Response) => {
const {
body: { id }
} = await validateRequest(reqValidator.restartSecretRotationV1, req);
const doc = await secretRotationService.getSecretRotationById({ id });
const { permission } = await getUserProjectPermissions(req.user._id, doc.workspace.toString());
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Edit,
ProjectPermissionSub.SecretRotation
);
const secretRotation = await secretRotationService.restartSecretRotation({ id });
return res.send({ secretRotation });
};
export const deleteSecretRotations = async (req: Request, res: Response) => {
const {
params: { id }
} = await validateRequest(reqValidator.removeSecretRotationV1, req);
const doc = await secretRotationService.getSecretRotationById({ id });
const { permission } = await getUserProjectPermissions(req.user._id, doc.workspace.toString());
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Delete,
ProjectPermissionSub.SecretRotation
);
const secretRotations = await secretRotationService.deleteSecretRotation({ id });
return res.send({ secretRotations });
};
export const getSecretRotations = async (req: Request, res: Response) => {
const {
query: { workspaceId }
} = await validateRequest(reqValidator.getSecretRotationV1, req);
const { permission } = await getUserProjectPermissions(req.user._id, workspaceId);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Read,
ProjectPermissionSub.SecretRotation
);
const secretRotations = await secretRotationService.getSecretRotationOfWorkspace(workspaceId);
return res.send({ secretRotations });
};

View File

@ -0,0 +1,28 @@
import { Request, Response } from "express";
import { validateRequest } from "../../../helpers/validation";
import * as reqValidator from "../../validation/secretRotationProvider";
import * as secretRotationProviderService from "../../secretRotation/service";
import {
getUserProjectPermissions,
ProjectPermissionActions,
ProjectPermissionSub
} from "../../services/ProjectRoleService";
import { ForbiddenError } from "@casl/ability";
export const getProviderTemplates = async (req: Request, res: Response) => {
const {
params: { workspaceId }
} = await validateRequest(reqValidator.getSecretRotationProvidersV1, req);
const { permission } = await getUserProjectPermissions(req.user._id, workspaceId);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Read,
ProjectPermissionSub.SecretRotation
);
const rotationProviderList = await secretRotationProviderService.getProviderTemplate({
workspaceId
});
return res.send(rotationProviderList);
};

View File

@ -0,0 +1,101 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { APIKeyDataV2 } from "../../../models/apiKeyDataV2";
import { validateRequest } from "../../../helpers/validation";
import { BadRequestError } from "../../../utils/errors";
import * as reqValidator from "../../../validation";
import { createToken } from "../../../helpers";
import { AuthTokenType } from "../../../variables";
import { getAuthSecret } from "../../../config";
/**
* Create API key data v2
* @param req
* @param res
*/
export const createAPIKeyData = async (req: Request, res: Response) => {
const {
body: {
name
}
} = await validateRequest(reqValidator.CreateAPIKeyV3, req);
const apiKeyData = await new APIKeyDataV2({
name,
user: req.user._id,
usageCount: 0,
}).save();
const apiKey = createToken({
payload: {
authTokenType: AuthTokenType.API_KEY,
apiKeyDataId: apiKeyData._id.toString(),
userId: req.user._id.toString()
},
secret: await getAuthSecret()
});
return res.status(200).send({
apiKeyData,
apiKey
});
}
/**
* Update API key data v2 with id [apiKeyDataId]
* @param req
* @param res
*/
export const updateAPIKeyData = async (req: Request, res: Response) => {
const {
params: { apiKeyDataId },
body: {
name,
}
} = await validateRequest(reqValidator.UpdateAPIKeyV3, req);
const apiKeyData = await APIKeyDataV2.findOneAndUpdate(
{
_id: new Types.ObjectId(apiKeyDataId),
user: req.user._id
},
{
name
},
{
new: true
}
);
if (!apiKeyData) throw BadRequestError({
message: "Failed to update API key"
});
return res.status(200).send({
apiKeyData
});
}
/**
* Delete API key data v2 with id [apiKeyDataId]
* @param req
* @param res
*/
export const deleteAPIKeyData = async (req: Request, res: Response) => {
const {
params: { apiKeyDataId }
} = await validateRequest(reqValidator.DeleteAPIKeyV3, req);
const apiKeyData = await APIKeyDataV2.findOneAndDelete({
_id: new Types.ObjectId(apiKeyDataId),
user: req.user._id
});
if (!apiKeyData) throw BadRequestError({
message: "Failed to delete API key"
});
return res.status(200).send({
apiKeyData
});
}
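
The create endpoint above returns the signed apiKey exactly once, alongside the stored apiKeyData document. The middleware that validates such a key is not included in this compare; the sketch below is an assumption pieced together from the token payload above and the verification pattern used elsewhere in this compare (the helper name, import paths, and the usage-count update are all hypothetical):

import jwt from "jsonwebtoken";
import { Types } from "mongoose";
import { APIKeyDataV2 } from "../../../models/apiKeyDataV2";
import { getAuthSecret } from "../../../config";
import { AuthTokenType } from "../../../variables";
import { UnauthorizedRequestError } from "../../../utils/errors";

// Hypothetical validation helper; not part of this change set.
export const validateAPIKeyV2 = async (apiKey: string) => {
  const decoded = <jwt.JwtPayload>jwt.verify(apiKey, await getAuthSecret());
  if (decoded.authTokenType !== AuthTokenType.API_KEY) throw UnauthorizedRequestError();

  // Incrementing usageCount here is an assumption based on the field initialized above.
  const apiKeyData = await APIKeyDataV2.findOneAndUpdate(
    { _id: new Types.ObjectId(decoded.apiKeyDataId), user: new Types.ObjectId(decoded.userId) },
    { $inc: { usageCount: 1 } },
    { new: true }
  );
  if (!apiKeyData) throw UnauthorizedRequestError();

  return { userId: decoded.userId, apiKeyData };
};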

View File

@ -1,5 +1,7 @@
import * as serviceTokenDataController from "./serviceTokenDataController";
import * as apiKeyDataController from "./apiKeyDataController";
export {
serviceTokenDataController
serviceTokenDataController,
apiKeyDataController
}

View File

@ -30,7 +30,7 @@ import { EEAuditLogService, EELicenseService } from "../../services";
import { getJwtServiceTokenSecret } from "../../../config";
/**
* Return project key for service token
* Return project key for service token V3
* @param req
* @param res
*/
@ -57,7 +57,7 @@ export const getServiceTokenDataKey = async (req: Request, res: Response) => {
}
/**
* Create service token data
* Create service token data V3
* @param req
* @param res
* @returns
@ -165,7 +165,7 @@ export const createServiceTokenData = async (req: Request, res: Response) => {
}
/**
* Update service token data with id [serviceTokenDataId]
* Update service token V3 data with id [serviceTokenDataId]
* @param req
* @param res
* @returns

View File

@ -1,7 +1,8 @@
export enum ActorType {
USER = "user",
SERVICE = "service",
SERVICE_V3 = "service-v3"
USER = "user",
SERVICE = "service",
SERVICE_V3 = "service-v3",
Machine = "machine"
}
export enum UserAgentType {

View File

@ -1,11 +1,5 @@
import {
ActorType,
EventType
} from "./enums";
import {
IServiceTokenV3Scope,
IServiceTokenV3TrustedIp
} from "../../../models/serviceTokenDataV3";
import { ActorType, EventType } from "./enums";
import { IServiceTokenV3Scope, IServiceTokenV3TrustedIp } from "../../../models/serviceTokenDataV3";
interface UserActorMetadata {
userId: string;
@ -28,14 +22,15 @@ export interface ServiceActor {
}
export interface ServiceActorV3 {
type: ActorType.SERVICE_V3;
metadata: ServiceActorMetadata;
type: ActorType.SERVICE_V3;
metadata: ServiceActorMetadata;
}
export type Actor =
| UserActor
| ServiceActor
| ServiceActorV3;
export interface MachineActor {
type: ActorType.Machine;
}
export type Actor = UserActor | ServiceActor | ServiceActorV3 | MachineActor;
interface GetSecretsEvent {
type: EventType.GET_SECRETS;
@ -226,36 +221,36 @@ interface DeleteServiceTokenEvent {
}
interface CreateServiceTokenV3Event {
type: EventType.CREATE_SERVICE_TOKEN_V3;
metadata: {
name: string;
isActive: boolean;
scopes: Array<IServiceTokenV3Scope>;
trustedIps: Array<IServiceTokenV3TrustedIp>;
expiresAt?: Date;
}
type: EventType.CREATE_SERVICE_TOKEN_V3;
metadata: {
name: string;
isActive: boolean;
scopes: Array<IServiceTokenV3Scope>;
trustedIps: Array<IServiceTokenV3TrustedIp>;
expiresAt?: Date;
};
}
interface UpdateServiceTokenV3Event {
type: EventType.UPDATE_SERVICE_TOKEN_V3;
metadata: {
name?: string;
isActive?: boolean;
scopes?: Array<IServiceTokenV3Scope>;
trustedIps?: Array<IServiceTokenV3TrustedIp>;
expiresAt?: Date;
}
type: EventType.UPDATE_SERVICE_TOKEN_V3;
metadata: {
name?: string;
isActive?: boolean;
scopes?: Array<IServiceTokenV3Scope>;
trustedIps?: Array<IServiceTokenV3TrustedIp>;
expiresAt?: Date;
};
}
interface DeleteServiceTokenV3Event {
type: EventType.DELETE_SERVICE_TOKEN_V3;
metadata: {
name: string;
isActive: boolean;
scopes: Array<IServiceTokenV3Scope>;
expiresAt?: Date;
trustedIps: Array<IServiceTokenV3TrustedIp>;
}
type: EventType.DELETE_SERVICE_TOKEN_V3;
metadata: {
name: string;
isActive: boolean;
scopes: Array<IServiceTokenV3Scope>;
expiresAt?: Date;
trustedIps: Array<IServiceTokenV3TrustedIp>;
};
}
interface CreateEnvironmentEvent {
@ -427,15 +422,15 @@ interface UpdateUserRole {
}
interface UpdateUserDeniedPermissions {
type: EventType.UPDATE_USER_WORKSPACE_DENIED_PERMISSIONS,
metadata: {
userId: string;
email: string;
deniedPermissions: {
environmentSlug: string;
ability: string;
}[]
}
type: EventType.UPDATE_USER_WORKSPACE_DENIED_PERMISSIONS;
metadata: {
userId: string;
email: string;
deniedPermissions: {
environmentSlug: string;
ability: string;
}[];
};
}
interface SecretApprovalMerge {
type: EventType.SECRET_APPROVAL_MERGED;

View File

@ -10,6 +10,8 @@ import secretScanning from "./secretScanning";
import roles from "./role";
import secretApprovalPolicy from "./secretApprovalPolicy";
import secretApprovalRequest from "./secretApprovalRequest";
import secretRotationProvider from "./secretRotationProvider";
import secretRotation from "./secretRotation";
export {
secret,
@ -23,5 +25,7 @@ export {
secretScanning,
roles,
secretApprovalPolicy,
secretApprovalRequest
secretApprovalRequest,
secretRotationProvider,
secretRotation
};

View File

@ -0,0 +1,41 @@
import express from "express";
import { AuthMode } from "../../../variables";
import { requireAuth } from "../../../middleware";
import { secretRotationController } from "../../controllers/v1";
const router = express.Router();
router.post(
"/",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
secretRotationController.createSecretRotation
);
router.post(
"/restart",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
secretRotationController.restartSecretRotations
);
router.get(
"/",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
secretRotationController.getSecretRotations
);
router.delete(
"/:id",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
secretRotationController.deleteSecretRotations
);
export default router;

View File

@ -0,0 +1,17 @@
import express from "express";
import { AuthMode } from "../../../variables";
import { requireAuth } from "../../../middleware";
import { secretRotationProviderController } from "../../controllers/v1";
const router = express.Router();
router.get(
"/:workspaceId",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
secretRotationProviderController.getProviderTemplates
);
export default router;

View File

@ -0,0 +1,31 @@
import express from "express";
const router = express.Router();
import { requireAuth } from "../../../middleware";
import { AuthMode } from "../../../variables";
import { apiKeyDataController } from "../../controllers/v3";
router.post(
"/",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
apiKeyDataController.createAPIKeyData
);
router.patch(
"/:apiKeyDataId",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
apiKeyDataController.updateAPIKeyData
);
router.delete(
"/:apiKeyDataId",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
apiKeyDataController.deleteAPIKeyData
);
export default router;

View File

@ -1,5 +1,7 @@
import serviceTokenData from "./serviceTokenData";
import apiKeyData from "./apiKeyData";
export {
serviceTokenData
serviceTokenData,
apiKeyData
}

View File

View File

@ -0,0 +1,91 @@
import { Schema, model } from "mongoose";
import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8
} from "../../variables";
import { ISecretRotation } from "./types";
const secretRotationSchema = new Schema(
{
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace"
},
provider: {
type: String,
required: true
},
customProvider: {
type: Schema.Types.ObjectId,
ref: "SecretRotationProvider"
},
environment: {
type: String,
required: true
},
secretPath: {
type: String,
required: true
},
interval: {
type: Number,
required: true
},
lastRotatedAt: {
type: String
},
status: {
type: String,
enum: ["success", "failed"]
},
statusMessage: {
type: String
},
// encrypted provider inputs and the credentials generated by rotation
encryptedData: {
type: String,
select: false
},
encryptedDataIV: {
type: String,
select: false
},
encryptedDataTag: {
type: String,
select: false
},
algorithm: {
// the encryption algorithm used
type: String,
enum: [ALGORITHM_AES_256_GCM],
required: true,
select: false,
default: ALGORITHM_AES_256_GCM
},
keyEncoding: {
type: String,
enum: [ENCODING_SCHEME_UTF8, ENCODING_SCHEME_BASE64],
required: true,
select: false,
default: ENCODING_SCHEME_UTF8
},
outputs: [
{
key: {
type: String,
required: true
},
secret: {
type: Schema.Types.ObjectId,
ref: "Secret"
}
}
]
},
{
timestamps: true
}
);
export const SecretRotation = model<ISecretRotation>("SecretRotation", secretRotationSchema);
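For orientation, the service layer later in this diff constructs documents against this schema roughly as follows; a minimal sketch with generated placeholder ids, not the actual service code:
import { Types } from "mongoose";
import { SecretRotation } from "./models"; // path as used by the service layer below
// Hypothetical document; ids are generated here purely for illustration.
const example = async () => {
  const rotation = new SecretRotation({
    workspace: new Types.ObjectId(), // placeholder workspace id
    provider: "sendgrid",
    environment: "prod",
    secretPath: "/",
    interval: 30, // days between rotations
    outputs: [{ key: "api_key", secret: new Types.ObjectId() }] // placeholder secret id
  });
  // encryptedData / encryptedDataIV / encryptedDataTag, algorithm and keyEncoding
  // are filled in after the provider inputs are encrypted (see createSecretRotation below).
  await rotation.save();
};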

View File

@ -0,0 +1,288 @@
import Queue, { Job } from "bull";
import { client, getEncryptionKey, getRootEncryptionKey } from "../../../config";
import { BotService, EventService, TelemetryService } from "../../../services";
import { SecretRotation } from "../models";
import { rotationTemplates } from "../templates";
import {
ISecretRotationData,
ISecretRotationEncData,
ISecretRotationProviderTemplate,
TProviderFunctionTypes
} from "../types";
import {
decryptSymmetric128BitHexKeyUTF8,
encryptSymmetric128BitHexKeyUTF8
} from "../../../utils/crypto";
import { ISecret, Secret } from "../../../models";
import { ENCODING_SCHEME_BASE64, ENCODING_SCHEME_UTF8, SECRET_SHARED } from "../../../variables";
import { EESecretService } from "../../services";
import { SecretVersion } from "../../models";
import { eventPushSecrets } from "../../../events";
import { logger } from "../../../utils/logging";
import {
secretRotationPreSetFn,
secretRotationRemoveFn,
secretRotationSetFn,
secretRotationTestFn
} from "./queue.utils";
const secretRotationQueue = new Queue("secret-rotation-service", process.env.REDIS_URL as string);
secretRotationQueue.process(async (job: Job) => {
logger.info(`secretRotationQueue.process: [rotationDocument=${job.data.rotationDocId}]`);
const rotationStratDocId = job.data.rotationDocId;
const secretRotation = await SecretRotation.findById(rotationStratDocId)
.select("+encryptedData +encryptedDataTag +encryptedDataIV +keyEncoding")
.populate<{
outputs: [
{
key: string;
secret: ISecret;
}
];
}>("outputs.secret");
const infisicalRotationProvider = rotationTemplates.find(
({ name }) => name === secretRotation?.provider
);
try {
if (!infisicalRotationProvider || !secretRotation)
throw new Error("Failed to find rotation strategy");
if (secretRotation.outputs.some(({ secret }) => !secret))
throw new Error("Secrets not found in dashboard");
const workspaceId = secretRotation.workspace;
// deep copy
const provider = JSON.parse(
JSON.stringify(infisicalRotationProvider)
) as ISecretRotationProviderTemplate;
// decrypt user provided inputs for secret rotation
const encryptionKey = await getEncryptionKey();
const rootEncryptionKey = await getRootEncryptionKey();
let decryptedData = "";
if (rootEncryptionKey && secretRotation.keyEncoding === ENCODING_SCHEME_BASE64) {
// case: encoding scheme is base64
decryptedData = client.decryptSymmetric(
secretRotation.encryptedData,
rootEncryptionKey,
secretRotation.encryptedDataIV,
secretRotation.encryptedDataTag
);
} else if (encryptionKey && secretRotation.keyEncoding === ENCODING_SCHEME_UTF8) {
// case: encoding scheme is utf8
decryptedData = decryptSymmetric128BitHexKeyUTF8({
ciphertext: secretRotation.encryptedData,
iv: secretRotation.encryptedDataIV,
tag: secretRotation.encryptedDataTag,
key: encryptionKey
});
}
const variables = JSON.parse(decryptedData) as ISecretRotationEncData;
// rotation set cycle
const newCredential: ISecretRotationData = {
inputs: variables.inputs,
outputs: {},
internal: {}
};
// special glue code for database
if (provider.template.functions.set.type === TProviderFunctionTypes.DB) {
const lastCred = variables.creds.at(-1);
if (lastCred && variables.creds.length === 1) {
newCredential.internal.username =
lastCred.internal.username === variables.inputs.username1
? variables.inputs.username2
: variables.inputs.username1;
} else {
newCredential.internal.username = lastCred
? lastCred.internal.username
: variables.inputs.username1;
}
}
if (provider.template.functions.set?.pre) {
secretRotationPreSetFn(provider.template.functions.set.pre, newCredential);
}
await secretRotationSetFn(provider.template.functions.set, newCredential);
await secretRotationTestFn(provider.template.functions.test, newCredential);
if (variables.creds.length === 2) {
const deleteCycleCred = variables.creds.pop();
if (deleteCycleCred && provider.template.functions.remove) {
const deleteCycleVar = { inputs: variables.inputs, ...deleteCycleCred };
await secretRotationRemoveFn(provider.template.functions.remove, deleteCycleVar);
}
}
variables.creds.unshift({ outputs: newCredential.outputs, internal: newCredential.internal });
const { ciphertext, iv, tag } = client.encryptSymmetric(
JSON.stringify(variables),
rootEncryptionKey
);
// save the rotation state
await SecretRotation.findByIdAndUpdate(rotationStratDocId, {
encryptedData: ciphertext,
encryptedDataIV: iv,
encryptedDataTag: tag,
status: "success",
statusMessage: "Rotated successfully",
lastRotatedAt: new Date().toUTCString()
});
const key = await BotService.getWorkspaceKeyWithBot({
workspaceId: secretRotation.workspace
});
const encryptedSecrets = secretRotation.outputs.map(({ key: outputKey, secret }) => ({
secret,
value: encryptSymmetric128BitHexKeyUTF8({
plaintext:
typeof newCredential.outputs[outputKey] === "object"
? JSON.stringify(newCredential.outputs[outputKey])
: String(newCredential.outputs[outputKey]),
key
})
}));
// now save the secrets via a bulk update
// can't use the updateSecret function because of its required-parameter constraints
// REFACTOR(akhilmhdh): the secret module should be a lot more flexible, with the ability to update in bulk or individually by blindIndex, by id, etc.
await Secret.bulkWrite(
encryptedSecrets.map(({ secret, value }) => ({
updateOne: {
filter: {
workspace: workspaceId,
environment: secretRotation.environment,
_id: secret._id,
type: SECRET_SHARED
},
update: {
$inc: {
version: 1
},
secretValueCiphertext: value.ciphertext,
secretValueIV: value.iv,
secretValueTag: value.tag
}
}
}))
);
await EESecretService.addSecretVersions({
secretVersions: encryptedSecrets.map(({ secret, value }) => {
const {
_id,
version,
workspace,
type,
folder,
secretBlindIndex,
secretKeyIV,
secretKeyTag,
secretKeyCiphertext,
skipMultilineEncoding,
environment,
algorithm,
keyEncoding
} = secret;
return new SecretVersion({
secret: _id,
version: version + 1,
workspace: workspace,
type,
folder,
environment,
isDeleted: false,
secretBlindIndex: secretBlindIndex,
secretKeyCiphertext: secretKeyCiphertext,
secretKeyIV: secretKeyIV,
secretKeyTag: secretKeyTag,
secretValueCiphertext: value.ciphertext,
secretValueIV: value.iv,
secretValueTag: value.tag,
algorithm,
keyEncoding,
skipMultilineEncoding
});
})
});
// akhilmhdh: @tony we need to address this since it depends on authData, which is not available here
// await EEAuditLogService.createAuditLog(
// {actor:ActorType.Machine},
// {
// type: EventType.UPDATE_SECRETS,
// metadata: {
// environment,
// secretPath,
// secrets: secretsToBeUpdated.map(({ _id, version, secretBlindIndex }) => ({
// secretId: _id.toString(),
// secretKey: secretBlindIndexToKey[secretBlindIndex || ""],
// secretVersion: version + 1
// }))
// }
// },
// {
// workspaceId
// }
// );
const folderId = encryptedSecrets?.[0]?.secret?.folder;
// (EE) take a secret snapshot
await EESecretService.takeSecretSnapshot({
workspaceId,
environment: secretRotation.environment,
folderId
});
await EventService.handleEvent({
event: eventPushSecrets({
workspaceId: secretRotation.workspace,
environment: secretRotation.environment,
secretPath: secretRotation.secretPath
})
});
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient) {
postHogClient.capture({
event: "secrets rotated",
properties: {
numberOfSecrets: encryptedSecrets.length,
environment: secretRotation.environment,
workspaceId,
folderId
}
});
}
} catch (err) {
logger.error(err);
await SecretRotation.findByIdAndUpdate(rotationStratDocId, {
status: "failed",
statusMessage: (err as Error).message,
lastRotatedAt: new Date().toUTCString()
});
}
return Promise.resolve();
});
const daysToMillisecond = (days: number) => days * 24 * 60 * 60 * 1000;
export const startSecretRotationQueue = async (rotationDocId: string, interval: number) => {
// when migrating to BullMQ, use the "immediately" option to trigger the repeatable job right away
secretRotationQueue.add({ rotationDocId }, { jobId: rotationDocId, removeOnComplete: true });
return secretRotationQueue.add(
{ rotationDocId },
{ repeat: { every: daysToMillisecond(interval) }, jobId: rotationDocId }
);
};
export const removeSecretRotationQueue = async (rotationDocId: string, interval: number) => {
return secretRotationQueue.removeRepeatable({ every: daysToMillisecond(interval), jobId: rotationDocId });
};
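The "special glue code for database" step above alternates between the two pre-created users (username1/username2) so the credential not currently handed out is the one being rotated. A standalone sketch of that selection logic, mirroring the processor's creds/inputs shape (illustrative only):
// Illustrative only: mirrors the username-alternation logic in the processor above.
type Cred = { internal: Record<string, unknown>; outputs: Record<string, unknown> };
const pickNextUsername = (creds: Cred[], inputs: { username1: string; username2: string }) => {
  const lastCred = creds.at(-1);
  if (lastCred && creds.length === 1) {
    // only one credential exists: rotate the *other* user
    return lastCred.internal.username === inputs.username1 ? inputs.username2 : inputs.username1;
  }
  // zero or two credentials: reuse the oldest credential's user, or start with username1
  return lastCred ? (lastCred.internal.username as string) : inputs.username1;
};
// First run: no creds yet -> username1 is rotated.
// Second run: one cred (for username1) -> username2 is rotated.
// Third run onwards: two creds -> the oldest credential's user is rotated again once the old credential is removed.
const inputs = { username1: "infisical-pg-user1", username2: "infisical-pg-user2" };
console.log(pickNextUsername([], inputs)); // "infisical-pg-user1"
console.log(pickNextUsername([{ internal: { username: "infisical-pg-user1" }, outputs: {} }], inputs)); // "infisical-pg-user2"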

View File

@ -0,0 +1,179 @@
import axios from "axios";
import jmespath from "jmespath";
import { customAlphabet } from "nanoid";
import { Client as PgClient } from "pg";
import mysql from "mysql2";
import {
ISecretRotationData,
TAssignOp,
TDbProviderClients,
TDbProviderFunction,
TDirectAssignOp,
THttpProviderFunction,
TProviderFunction,
TProviderFunctionTypes
} from "../types";
const REGEX = /\${([^}]+)}/g;
const SLUG_ALPHABETS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
const nanoId = customAlphabet(SLUG_ALPHABETS, 10);
export const interpolate = (data: any, getValue: (key: string) => unknown) => {
if (!data) return;
if (typeof data === "number") return data;
if (typeof data === "string") {
return data.replace(REGEX, (_a, b) => getValue(b) as string);
}
if (typeof data === "object" && Array.isArray(data)) {
data.forEach((el, index) => {
data[index] = interpolate(el, getValue);
});
}
if (typeof data === "object") {
if ((data as { ref: string })?.ref) return getValue((data as { ref: string }).ref);
const temp = data as Record<string, unknown>; // for converting ts object to record type
Object.keys(temp).forEach((key) => {
temp[key as keyof typeof temp] = interpolate(data[key as keyof typeof temp], getValue);
});
}
return data;
};
const getInterpolationValue = (variables: ISecretRotationData) => (key: string) => {
if (key.includes("|")) {
const [keyword, ...arg] = key.split("|").map((el) => el.trim());
switch (keyword) {
case "random": {
return nanoId(parseInt(arg[0], 10));
}
default: {
throw Error(`Interpolation key not found - ${key}`);
}
}
}
const [type, keyName] = key.split(".").map((el) => el.trim());
return variables[type as keyof ISecretRotationData][keyName];
};
export const secretRotationHttpFn = async (
func: THttpProviderFunction,
variables: ISecretRotationData
) => {
// string interpolation
const headers = interpolate(func.header, getInterpolationValue(variables));
const url = interpolate(func.url, getInterpolationValue(variables));
const body = interpolate(func.body, getInterpolationValue(variables));
// axios automatically throws if the response status is outside the 2xx range
return axios({ method: func.method, url, headers, data: body });
};
export const secretRotationDbFn = async (
func: TDbProviderFunction,
variables: ISecretRotationData
) => {
const { type, client, pre, ...dbConnection } = func;
const { username, password, host, database, port, query, ca } = interpolate(
dbConnection,
getInterpolationValue(variables)
);
const ssl = ca ? { rejectUnauthorized: false, ca } : undefined;
if (host === "localhost" || host === "127.0.0.1") throw new Error("Invalid db host");
if (client === TDbProviderClients.Pg) {
const pgClient = new PgClient({ user: username, password, host, database, port, ssl });
await pgClient.connect();
const res = await pgClient.query(query);
await pgClient.end();
return res.rows[0];
} else if (client === TDbProviderClients.Sql) {
const sqlClient = mysql.createPool({
user: username,
password,
host,
database,
port,
connectionLimit: 1,
ssl
});
const res = await new Promise((resolve, reject) => {
sqlClient.query(query, (err, data) => {
if (err) return reject(err);
resolve(data);
});
});
await new Promise((resolve, reject) => {
sqlClient.end(function (err) {
if (err) return reject(err);
return resolve({});
});
});
return (res as any)?.[0];
}
};
export const secretRotationPreSetFn = (
op: Record<string, TDirectAssignOp>,
variables: ISecretRotationData
) => {
const getValFn = getInterpolationValue(variables);
Object.entries(op || {}).forEach(([key, assignFn]) => {
const [type, keyName] = key.split(".") as [keyof ISecretRotationData, string];
variables[type][keyName] = interpolate(assignFn.value, getValFn);
});
};
export const secretRotationSetFn = async (
func: TProviderFunction,
variables: ISecretRotationData
) => {
const getValFn = getInterpolationValue(variables);
// http setter
if (func.type === TProviderFunctionTypes.HTTP) {
const res = await secretRotationHttpFn(func, variables);
Object.entries(func.setter || {}).forEach(([key, assignFn]) => {
const [type, keyName] = key.split(".") as [keyof ISecretRotationData, string];
if (assignFn.assign === TAssignOp.JmesPath) {
variables[type][keyName] = jmespath.search(res.data, assignFn.path);
} else if (assignFn.value) {
variables[type][keyName] = interpolate(assignFn.value, getValFn);
}
});
// db setter
} else if (func.type === TProviderFunctionTypes.DB) {
const data = await secretRotationDbFn(func, variables);
Object.entries(func.setter || {}).forEach(([key, assignFn]) => {
const [type, keyName] = key.split(".") as [keyof ISecretRotationData, string];
if (assignFn.assign === TAssignOp.JmesPath) {
if (typeof data === "object") {
variables[type][keyName] = jmespath.search(data, assignFn.path);
}
} else if (assignFn.value) {
variables[type][keyName] = interpolate(assignFn.value, getValFn);
}
});
}
};
export const secretRotationTestFn = async (
func: TProviderFunction,
variables: ISecretRotationData
) => {
if (func.type === TProviderFunctionTypes.HTTP) {
await secretRotationHttpFn(func, variables);
} else if (func.type === TProviderFunctionTypes.DB) {
await secretRotationDbFn(func, variables);
}
};
export const secretRotationRemoveFn = async (
func: TProviderFunction,
variables: ISecretRotationData
) => {
if (!func) return;
if (func.type === TProviderFunctionTypes.HTTP) {
// string interpolation
return await secretRotationHttpFn(func, variables);
}
};
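To make the templating concrete: plain "${type.key}" placeholders resolve against the rotation data, objects with a ref key pull whole values, and "${random | N}" generates an N-character nanoid. A small usage sketch, assuming it runs inside this module since getInterpolationValue is not exported:
// Illustrative usage of the helpers above (values are placeholders).
const variables: ISecretRotationData = {
  inputs: { host: "db.example.com", admin_username: "admin" },
  outputs: {},
  internal: { username: "infisical-pg-user1" }
};
const getVal = getInterpolationValue(variables);
interpolate("jdbc://${inputs.admin_username}@${inputs.host}", getVal);
// -> "jdbc://admin@db.example.com"
interpolate("${random | 32}", getVal);
// -> a 32-character nanoid, used for generated passwords
interpolate({ user: { ref: "internal.username" } }, getVal);
// -> { user: "infisical-pg-user1" }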

View File

@ -0,0 +1,130 @@
import { ISecretRotationEncData, TCreateSecretRotation, TGetProviderTemplates } from "./types";
import { rotationTemplates } from "./templates";
import { SecretRotation } from "./models";
import { client, getEncryptionKey, getRootEncryptionKey } from "../../config";
import { BadRequestError } from "../../utils/errors";
import Ajv from "ajv";
import { removeSecretRotationQueue, startSecretRotationQueue } from "./queue/queue";
import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8
} from "../../variables";
import { encryptSymmetric128BitHexKeyUTF8 } from "../../utils/crypto";
const ajv = new Ajv({ strict: false });
export const getProviderTemplate = async ({ workspaceId }: TGetProviderTemplates) => {
return {
custom: [],
providers: rotationTemplates
};
};
export const createSecretRotation = async ({
workspaceId,
secretPath,
environment,
provider,
interval,
inputs,
outputs
}: TCreateSecretRotation) => {
const rotationTemplate = rotationTemplates.find(({ name }) => name === provider);
if (!rotationTemplate) throw BadRequestError({ message: "Provider not found" });
const formattedInputs: Record<string, unknown> = {};
Object.entries(inputs).forEach(([key, value]) => {
const type = rotationTemplate.template.inputs.properties[key].type;
if (type === "string") {
formattedInputs[key] = value;
return;
}
if (type === "integer") {
formattedInputs[key] = parseInt(value as string, 10);
return;
}
formattedInputs[key] = JSON.parse(value as string);
});
// ensure the provided inputs follow the provider's input schema
const valid = ajv.validate(rotationTemplate.template.inputs, formattedInputs);
if (!valid) {
throw BadRequestError({ message: ajv.errors?.[0].message });
}
const encData: Partial<ISecretRotationEncData> = {
inputs: formattedInputs,
creds: []
};
const secretRotation = new SecretRotation({
workspace: workspaceId,
provider,
environment,
secretPath,
interval,
outputs: Object.entries(outputs).map(([key, secret]) => ({ key, secret }))
});
const encryptionKey = await getEncryptionKey();
const rootEncryptionKey = await getRootEncryptionKey();
if (rootEncryptionKey) {
const { ciphertext, iv, tag } = client.encryptSymmetric(
JSON.stringify(encData),
rootEncryptionKey
);
secretRotation.encryptedDataIV = iv;
secretRotation.encryptedDataTag = tag;
secretRotation.encryptedData = ciphertext;
secretRotation.algorithm = ALGORITHM_AES_256_GCM;
secretRotation.keyEncoding = ENCODING_SCHEME_BASE64;
} else if (encryptionKey) {
const { ciphertext, iv, tag } = encryptSymmetric128BitHexKeyUTF8({
plaintext: JSON.stringify(encData),
key: encryptionKey
});
secretRotation.encryptedDataIV = iv;
secretRotation.encryptedDataTag = tag;
secretRotation.encryptedData = ciphertext;
secretRotation.algorithm = ALGORITHM_AES_256_GCM;
secretRotation.keyEncoding = ENCODING_SCHEME_UTF8;
}
await secretRotation.save();
await startSecretRotationQueue(secretRotation._id.toString(), interval);
return secretRotation;
};
export const deleteSecretRotation = async ({ id }: { id: string }) => {
const doc = await SecretRotation.findByIdAndRemove(id);
if (!doc) throw BadRequestError({ message: "Rotation not found" });
await removeSecretRotationQueue(doc._id.toString(), doc.interval);
return doc;
};
export const restartSecretRotation = async ({ id }: { id: string }) => {
const secretRotation = await SecretRotation.findById(id);
if (!secretRotation) throw BadRequestError({ message: "Rotation not found" });
await removeSecretRotationQueue(secretRotation._id.toString(), secretRotation.interval);
await startSecretRotationQueue(secretRotation._id.toString(), secretRotation.interval);
return secretRotation;
};
export const getSecretRotationById = async ({ id }: { id: string }) => {
const doc = await SecretRotation.findById(id);
if (!doc) throw BadRequestError({ message: "Rotation not found" });
return doc;
};
export const getSecretRotationOfWorkspace = async (workspaceId: string) => {
const secretRotations = await SecretRotation.find({
workspace: workspaceId
}).populate("outputs.secret");
return secretRotations;
};
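A minimal sketch of wiring these service functions together from a controller; the import path and all ids are placeholders, and non-string template inputs are passed as JSON strings because of the formatting step in createSecretRotation:
// Hypothetical controller-level usage; import path and all ids are placeholders.
// import { createSecretRotation, restartSecretRotation, deleteSecretRotation } from "<service-path>";
const example = async () => {
  const rotation = await createSecretRotation({
    workspaceId: "<workspace-id>",
    secretPath: "/",
    environment: "dev",
    provider: "sendgrid",
    interval: 30, // days
    inputs: {
      admin_api_key: "<sendgrid-admin-key>",
      // non-string template inputs are passed as JSON strings and parsed above
      api_key_scopes: JSON.stringify(["mail.send"])
    },
    outputs: { api_key: "<secret-id>" }
  });
  // Re-run the rotation immediately, or tear it down along with its queue schedule.
  await restartSecretRotation({ id: rotation._id.toString() });
  await deleteSecretRotation({ id: rotation._id.toString() });
};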

View File

@ -0,0 +1,28 @@
import { ISecretRotationProviderTemplate } from "../types";
import { MYSQL_TEMPLATE } from "./mysql";
import { POSTGRES_TEMPLATE } from "./postgres";
import { SENDGRID_TEMPLATE } from "./sendgrid";
export const rotationTemplates: ISecretRotationProviderTemplate[] = [
{
name: "sendgrid",
title: "Twilio Sendgrid",
image: "sendgrid.png",
description: "Rotate Twilio Sendgrid API keys",
template: SENDGRID_TEMPLATE
},
{
name: "postgres",
title: "PostgreSQL",
image: "postgres.png",
description: "Rotate PostgreSQL/CockroachDB user credentials",
template: POSTGRES_TEMPLATE
},
{
name: "mysql",
title: "MySQL",
image: "mysql.png",
description: "Rotate MySQL@7/MariaDB user credentials",
template: MYSQL_TEMPLATE
}
];

View File

@ -0,0 +1,83 @@
import { TAssignOp, TDbProviderClients, TProviderFunctionTypes } from "../types";
export const MYSQL_TEMPLATE = {
inputs: {
type: "object" as const,
properties: {
admin_username: { type: "string" as const },
admin_password: { type: "string" as const },
host: { type: "string" as const },
database: { type: "string" as const },
port: { type: "integer" as const, default: "3306" },
username1: {
type: "string",
default: "infisical-sql-user1",
desc: "This user must be created in your database"
},
username2: {
type: "string",
default: "infisical-sql-user2",
desc: "This user must be created in your database"
},
ca: { type: "string", desc: "SSL certificate for database authentication (string)" }
},
required: [
"admin_username",
"admin_password",
"host",
"database",
"username1",
"username2",
"port"
],
additionalProperties: false
},
outputs: {
db_username: { type: "string" },
db_password: { type: "string" }
},
internal: {
rotated_password: { type: "string" },
username: { type: "string" }
},
functions: {
set: {
type: TProviderFunctionTypes.DB as const,
client: TDbProviderClients.Sql,
username: "${inputs.admin_username}",
password: "${inputs.admin_password}",
host: "${inputs.host}",
database: "${inputs.database}",
port: "${inputs.port}",
ca: "${inputs.ca}",
query: "ALTER USER ${internal.username} IDENTIFIED BY '${internal.rotated_password}'",
setter: {
"outputs.db_username": {
assign: TAssignOp.Direct as const,
value: "${internal.username}"
},
"outputs.db_password": {
assign: TAssignOp.Direct as const,
value: "${internal.rotated_password}"
}
},
pre: {
"internal.rotated_password": {
assign: TAssignOp.Direct as const,
value: "${random | 32}"
}
}
},
test: {
type: TProviderFunctionTypes.DB as const,
client: TDbProviderClients.Sql,
username: "${internal.username}",
password: "${internal.rotated_password}",
host: "${inputs.host}",
database: "${inputs.database}",
port: "${inputs.port}",
ca: "${inputs.ca}",
query: "SELECT NOW()"
}
}
};
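With the default inputs above, the set function's query interpolates (via the helpers in queue.utils) into an ALTER USER statement against whichever of the two pre-created users is being rotated; a rough illustration of the resolved strings:
// Illustrative resolved values on a first rotation, assuming internal.username
// resolved to the default username1 and internal.rotated_password to a
// generated 32-character nanoid (values here are made up):
const resolvedSetQuery =
  "ALTER USER infisical-sql-user1 IDENTIFIED BY 'kR8wZq2tYx5vB1nM7cL4dS9fG6hJ3pQa'";
// The "test" function then reconnects as that user and runs:
const resolvedTestQuery = "SELECT NOW()";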

View File

@ -0,0 +1,83 @@
import { TAssignOp, TDbProviderClients, TProviderFunctionTypes } from "../types";
export const POSTGRES_TEMPLATE = {
inputs: {
type: "object" as const,
properties: {
admin_username: { type: "string" as const },
admin_password: { type: "string" as const },
host: { type: "string" as const },
database: { type: "string" as const },
port: { type: "integer" as const, default: "5432" },
username1: {
type: "string",
default: "infisical-pg-user1",
desc: "This user must be created in your database"
},
username2: {
type: "string",
default: "infisical-pg-user2",
desc: "This user must be created in your database"
},
ca: { type: "string", desc: "SSL certificate for database authentication (string)" }
},
required: [
"admin_username",
"admin_password",
"host",
"database",
"username1",
"username2",
"port"
],
additionalProperties: false
},
outputs: {
db_username: { type: "string" },
db_password: { type: "string" }
},
internal: {
rotated_password: { type: "string" },
username: { type: "string" }
},
functions: {
set: {
type: TProviderFunctionTypes.DB as const,
client: TDbProviderClients.Pg,
username: "${inputs.admin_username}",
password: "${inputs.admin_password}",
host: "${inputs.host}",
database: "${inputs.database}",
port: "${inputs.port}",
ca: "${inputs.ca}",
query: "ALTER USER ${internal.username} WITH PASSWORD '${internal.rotated_password}'",
setter: {
"outputs.db_username": {
assign: TAssignOp.Direct as const,
value: "${internal.username}"
},
"outputs.db_password": {
assign: TAssignOp.Direct as const,
value: "${internal.rotated_password}"
}
},
pre: {
"internal.rotated_password": {
assign: TAssignOp.Direct as const,
value: "${random | 32}"
}
}
},
test: {
type: TProviderFunctionTypes.DB as const,
client: TDbProviderClients.Pg,
username: "${internal.username}",
password: "${internal.rotated_password}",
host: "${inputs.host}",
database: "${inputs.database}",
port: "${inputs.port}",
ca: "${inputs.ca}",
query: "SELECT NOW()"
}
}
};

View File

@ -0,0 +1,63 @@
import { TAssignOp, TProviderFunctionTypes } from "../types";
export const SENDGRID_TEMPLATE = {
inputs: {
type: "object" as const,
properties: {
admin_api_key: { type: "string" as const, desc: "SendGrid admin API key used to create new keys" },
api_key_scopes: {
type: "array",
items: { type: "string" as const },
desc: "Scopes for created tokens by rotation(Array)"
}
},
required: ["admin_api_key", "api_key_scopes"],
additionalProperties: false
},
outputs: {
api_key: { type: "string" }
},
internal: {
api_key_id: { type: "string" }
},
functions: {
set: {
type: TProviderFunctionTypes.HTTP as const,
url: "https://api.sendgrid.com/v3/api_keys",
method: "POST",
header: {
Authorization: "Bearer ${inputs.admin_api_key}"
},
body: {
name: "infisical-${random | 16}",
scopes: { ref: "inputs.api_key_scopes" }
},
setter: {
"outputs.api_key": {
assign: TAssignOp.JmesPath as const,
path: "api_key"
},
"internal.api_key_id": {
assign: TAssignOp.JmesPath as const,
path: "api_key_id"
}
}
},
remove: {
type: TProviderFunctionTypes.HTTP as const,
url: "https://api.sendgrid.com/v3/api_keys/${internal.api_key_id}",
header: {
Authorization: "Bearer ${inputs.admin_api_key}"
},
method: "DELETE"
},
test: {
type: TProviderFunctionTypes.HTTP as const,
url: "https://api.sendgrid.com/v3/api_keys/${internal.api_key_id}",
header: {
Authorization: "Bearer ${inputs.admin_api_key}"
},
method: "GET"
}
}
};
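The setter entries above extract values from the HTTP response via JMESPath in secretRotationSetFn; a small sketch of that step, with an illustrative (not authoritative) response body:
import jmespath from "jmespath";
// Illustrative response body; the real SendGrid response may include more fields.
const responseBody = {
  api_key: "SG.xxxxxxxx.yyyyyyyy",
  api_key_id: "abc123",
  name: "infisical-4f3d2a1b9c8e7f60",
  scopes: ["mail.send"]
};
// What secretRotationSetFn does for each setter entry with assign: "jmesopath":
const outputsApiKey = jmespath.search(responseBody, "api_key"); // -> "SG.xxxxxxxx.yyyyyyyy"
const internalApiKeyId = jmespath.search(responseBody, "api_key_id"); // -> "abc123"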

View File

@ -0,0 +1,131 @@
import { Document, Types } from "mongoose";
export interface ISecretRotation extends Document {
_id: Types.ObjectId;
name: string;
interval: number;
provider: string;
customProvider: Types.ObjectId;
workspace: Types.ObjectId;
environment: string;
secretPath: string;
outputs: Array<{
key: string;
secret: Types.ObjectId;
}>;
status?: "success" | "failed";
lastRotatedAt?: string;
statusMessage?: string;
encryptedData: string;
encryptedDataIV: string;
encryptedDataTag: string;
algorithm: string;
keyEncoding: string;
}
export type ISecretRotationEncData = {
inputs: Record<string, unknown>;
creds: Array<{
outputs: Record<string, unknown>;
internal: Record<string, unknown>;
}>;
};
export type ISecretRotationData = {
inputs: Record<string, unknown>;
outputs: Record<string, unknown>;
internal: Record<string, unknown>;
};
export type ISecretRotationProviderTemplate = {
name: string;
title: string;
image?: string;
description?: string;
template: TProviderTemplate;
};
export enum TProviderFunctionTypes {
HTTP = "http",
DB = "database"
}
export enum TDbProviderClients {
// postgres, cockroachdb, amazon redshift
Pg = "pg",
// mysql and mariadb
Sql = "sql"
}
export enum TAssignOp {
Direct = "direct",
JmesPath = "jmesopath"
}
export type TJmesPathAssignOp = {
assign: TAssignOp.JmesPath;
path: string;
};
export type TDirectAssignOp = {
assign: TAssignOp.Direct;
value: string;
};
export type TAssignFunction = TJmesPathAssignOp | TDirectAssignOp;
export type THttpProviderFunction = {
type: TProviderFunctionTypes.HTTP;
url: string;
method: string;
header?: Record<string, string>;
query?: Record<string, string>;
body?: Record<string, unknown>;
setter?: Record<string, TAssignFunction>;
pre?: Record<string, TDirectAssignOp>;
};
export type TDbProviderFunction = {
type: TProviderFunctionTypes.DB;
client: TDbProviderClients;
username: string;
password: string;
host: string;
database: string;
port: string;
query: string;
setter?: Record<string, TAssignFunction>;
pre?: Record<string, TDirectAssignOp>;
};
export type TProviderFunction = THttpProviderFunction | TDbProviderFunction;
export type TProviderTemplate = {
inputs: {
type: "object";
properties: Record<string, { type: string; [x: string]: unknown; desc?: string }>;
required?: string[];
};
outputs: Record<string, unknown>;
functions: {
set: TProviderFunction;
remove?: TProviderFunction;
test: TProviderFunction;
};
};
// function type args
export type TGetProviderTemplates = {
workspaceId: string;
};
export type TCreateSecretRotation = {
provider: string;
customProvider?: string;
workspaceId: string;
secretPath: string;
environment: string;
interval: number;
inputs: Record<string, unknown>;
outputs: Record<string, string>;
};

View File

@ -38,6 +38,7 @@ interface FeatureSet {
trial_end: number | null;
has_used_trial: boolean;
secretApproval: boolean;
secretRotation: boolean;
}
/**
@ -74,7 +75,8 @@ class EELicenseService {
status: null,
trial_end: null,
has_used_trial: true,
secretApproval: false,
secretRotation: true,
}
public localFeatureSet: NodeCache;

View File

@ -50,7 +50,8 @@ export enum ProjectPermissionSub {
Workspace = "workspace",
Secrets = "secrets",
SecretRollback = "secret-rollback",
SecretApproval = "secret-approval"
SecretApproval = "secret-approval",
SecretRotation = "secret-rotation"
}
type SubjectFields = {
@ -74,6 +75,7 @@ export type ProjectPermissionSet =
| [ProjectPermissionActions, ProjectPermissionSub.Settings]
| [ProjectPermissionActions, ProjectPermissionSub.ServiceTokens]
| [ProjectPermissionActions, ProjectPermissionSub.SecretApproval]
| [ProjectPermissionActions, ProjectPermissionSub.SecretRotation]
| [ProjectPermissionActions.Delete, ProjectPermissionSub.Workspace]
| [ProjectPermissionActions.Edit, ProjectPermissionSub.Workspace]
| [ProjectPermissionActions.Read, ProjectPermissionSub.SecretRollback]
@ -92,6 +94,11 @@ const buildAdminPermission = () => {
can(ProjectPermissionActions.Edit, ProjectPermissionSub.SecretApproval);
can(ProjectPermissionActions.Delete, ProjectPermissionSub.SecretApproval);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretRotation);
can(ProjectPermissionActions.Create, ProjectPermissionSub.SecretRotation);
can(ProjectPermissionActions.Edit, ProjectPermissionSub.SecretRotation);
can(ProjectPermissionActions.Delete, ProjectPermissionSub.SecretRotation);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretRollback);
can(ProjectPermissionActions.Create, ProjectPermissionSub.SecretRollback);
@ -162,6 +169,7 @@ const buildMemberPermission = () => {
can(ProjectPermissionActions.Delete, ProjectPermissionSub.Secrets);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretApproval);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretRotation);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretRollback);
can(ProjectPermissionActions.Create, ProjectPermissionSub.SecretRollback);
@ -214,6 +222,7 @@ const buildViewerPermission = () => {
can(ProjectPermissionActions.Read, ProjectPermissionSub.Secrets);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretApproval);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretRollback);
can(ProjectPermissionActions.Read, ProjectPermissionSub.SecretRotation);
can(ProjectPermissionActions.Read, ProjectPermissionSub.Member);
can(ProjectPermissionActions.Read, ProjectPermissionSub.Role);
can(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);

View File

@ -0,0 +1,32 @@
import { z } from "zod";
export const createSecretRotationV1 = z.object({
body: z.object({
workspaceId: z.string().trim(),
secretPath: z.string().trim(),
environment: z.string().trim(),
interval: z.number().min(1),
provider: z.string().trim(),
customProvider: z.string().trim().optional(),
inputs: z.record(z.unknown()),
outputs: z.record(z.string())
})
});
export const restartSecretRotationV1 = z.object({
body: z.object({
id: z.string().trim()
})
});
export const getSecretRotationV1 = z.object({
query: z.object({
workspaceId: z.string().trim()
})
});
export const removeSecretRotationV1 = z.object({
params: z.object({
id: z.string().trim()
})
});
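For illustration, a request shaped like the following passes createSecretRotationV1; the import path and all values are placeholders:
import { createSecretRotationV1 } from "./secretRotation"; // path assumed for illustration
// Placeholder values for illustration.
const parsed = createSecretRotationV1.parse({
  body: {
    workspaceId: "6523f1c2e4b0a1d2c3b4a5f6",
    secretPath: "/backend",
    environment: "staging",
    interval: 15,
    provider: "mysql",
    inputs: {
      admin_username: "root",
      admin_password: "<password>",
      host: "db.internal",
      database: "app",
      port: "3306",
      username1: "infisical-sql-user1",
      username2: "infisical-sql-user2"
    },
    outputs: { db_username: "<secret-id>", db_password: "<secret-id>" }
  }
});
// parsed.body.interval === 15; strings are trimmed; inputs accepts arbitrary keys.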

View File

@ -0,0 +1,7 @@
import { z } from "zod";
export const getSecretRotationProvidersV1 = z.object({
params: z.object({
workspaceId: z.string()
})
});

View File

@ -4,6 +4,7 @@ import jwt from "jsonwebtoken";
import bcrypt from "bcrypt";
import {
APIKeyData,
APIKeyDataV2,
ITokenVersion,
IUser,
ServiceTokenData,
@ -19,15 +20,14 @@ import {
UnauthorizedRequestError,
} from "../utils/errors";
import {
getAuthSecret,
getJwtAuthLifetime,
getJwtAuthSecret,
getJwtProviderAuthSecret,
getJwtRefreshLifetime,
getJwtRefreshSecret,
getJwtServiceTokenSecret
} from "../config";
import {
AuthMode
AuthMode,
AuthTokenType
} from "../variables";
import {
ServiceTokenAuthData,
@ -51,8 +51,6 @@ export const validateAuthMode = ({
acceptedAuthModes: AuthMode[]
}) => {
// TODO: update this to accept service token v3
const apiKey = headers["x-api-key"];
const authHeader = headers["authorization"];
@ -108,6 +106,7 @@ export const validateAuthMode = ({
/**
* Return user payload corresponding to JWT token [authTokenValue]
* that is either for browser / CLI or API Key
* @param {Object} obj
* @param {String} obj.authTokenValue - JWT token value
* @returns {User} user - user corresponding to JWT token
@ -120,9 +119,45 @@ export const getAuthUserPayload = async ({
authTokenValue: string;
}): Promise<UserAuthData> => {
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(authTokenValue, await getJwtAuthSecret())
jwt.verify(authTokenValue, await getAuthSecret())
);
if (
decodedToken.authTokenType !== AuthTokenType.ACCESS_TOKEN &&
decodedToken.authTokenType !== AuthTokenType.API_KEY
) {
throw UnauthorizedRequestError();
}
if (decodedToken.authTokenType === AuthTokenType.ACCESS_TOKEN) {
const tokenVersion = await TokenVersion.findOneAndUpdate({
_id: new Types.ObjectId(decodedToken.tokenVersionId),
user: decodedToken.userId
}, {
lastUsed: new Date(),
});
if (!tokenVersion) throw UnauthorizedRequestError();
if (decodedToken.accessVersion !== tokenVersion.accessVersion) throw UnauthorizedRequestError();
} else if (decodedToken.authTokenType === AuthTokenType.API_KEY) {
const apiKeyData = await APIKeyDataV2.findOneAndUpdate(
{
_id: new Types.ObjectId(decodedToken.apiKeyDataId),
user: new Types.ObjectId(decodedToken.userId)
},
{
lastUsed: new Date(),
$inc: { usageCount: 1 }
},
{
new: true
}
);
if (!apiKeyData) throw UnauthorizedRequestError();
}
const user = await User.findOne({
_id: new Types.ObjectId(decodedToken.userId),
}).select("+publicKey +accessVersion");
@ -131,21 +166,6 @@ export const getAuthUserPayload = async ({
if (!user?.publicKey) throw UnauthorizedRequestError({ message: "Failed to authenticate user with partially set up account" });
const tokenVersion = await TokenVersion.findOneAndUpdate({
_id: new Types.ObjectId(decodedToken.tokenVersionId),
user: user._id,
}, {
lastUsed: new Date(),
});
if (!tokenVersion) throw UnauthorizedRequestError({
message: "Failed to validate access token",
});
if (decodedToken.accessVersion !== tokenVersion.accessVersion) throw UnauthorizedRequestError({
message: "Failed to validate access token",
});
return {
actor: {
type: ActorType.USER,
@ -159,11 +179,6 @@ export const getAuthUserPayload = async ({
userAgent: req.headers["user-agent"] ?? "",
userAgentType: getUserAgentType(req.headers["user-agent"])
}
// return ({
// user,
// tokenVersionId: tokenVersion._id, // what to do with this? // move this out
// });
}
/**
@ -404,22 +419,24 @@ export const issueAuthTokens = async ({
// issue tokens
const token = createToken({
payload: {
authTokenType: AuthTokenType.ACCESS_TOKEN,
userId,
tokenVersionId: tokenVersion._id.toString(),
accessVersion: tokenVersion.accessVersion,
},
expiresIn: await getJwtAuthLifetime(),
secret: await getJwtAuthSecret(),
secret: await getAuthSecret(),
});
const refreshToken = createToken({
payload: {
authTokenType: AuthTokenType.REFRESH_TOKEN,
userId,
tokenVersionId: tokenVersion._id.toString(),
refreshVersion: tokenVersion.refreshVersion,
},
expiresIn: await getJwtRefreshLifetime(),
secret: await getJwtRefreshSecret(),
secret: await getAuthSecret(),
});
return {
@ -451,7 +468,7 @@ export const clearTokens = async (tokenVersionId: Types.ObjectId): Promise<void>
* bearer/auth, refresh, and temporary signup tokens
* @param {Object} obj
* @param {Object} obj.payload - payload of (JWT) token
* @param {String} obj.secret - (JWT) secret such as [JWT_AUTH_SECRET]
* @param {String} obj.secret - (JWT) secret such as [AUTH_SECRET]
* @param {String} obj.expiresIn - string describing time span such as '10h' or '7d'
*/
export const createToken = ({
@ -479,13 +496,16 @@ export const validateProviderAuthToken = async ({
email: string;
providerAuthToken?: string;
}) => {
if (!providerAuthToken) {
throw new Error("Invalid authentication request.");
}
const decodedToken = <jwt.ProviderAuthJwtPayload>(
jwt.verify(providerAuthToken, await getJwtProviderAuthSecret())
jwt.verify(providerAuthToken, await getAuthSecret())
);
if (decodedToken.authTokenType !== AuthTokenType.PROVIDER_TOKEN) throw UnauthorizedRequestError();
if (decodedToken.email !== email) {
throw new Error("Invalid authentication credentials.")

View File

@ -1,5 +1,5 @@
import mongoose from "mongoose";
import { getLogger } from "../utils/logger";
import { logger } from "../utils/logging";
/**
* Initialize database connection
@ -18,10 +18,10 @@ export const initDatabaseHelper = async ({
// allow empty strings to pass the required validator
mongoose.Schema.Types.String.checkRequired(v => typeof v === "string");
(await getLogger("database")).info("Database connection established");
logger.info("Database connection established");
} catch (err) {
(await getLogger("database")).error(`Unable to establish Database connection due to the error.\n${err}`);
logger.error(err, "Unable to establish database connection");
}
return mongoose.connection;

View File

@ -1,4 +1,4 @@
import mongoose, { Types, mongo } from "mongoose";
import { Types } from "mongoose";
import {
Bot,
BotKey,
@ -55,7 +55,7 @@ import {
import {
createBotOrg
} from "./botOrg";
import { InternalServerError, ResourceNotFoundError } from "../utils/errors";
import { ResourceNotFoundError } from "../utils/errors";
/**
* Create an organization with name [name]
@ -111,311 +111,215 @@ export const createOrganization = async ({
* @returns
*/
export const deleteOrganization = async ({
organizationId,
existingSession
organizationId
}: {
organizationId: Types.ObjectId;
existingSession?: mongo.ClientSession;
}) => {
let session;
if (existingSession) {
session = existingSession;
} else {
session = await mongoose.startSession();
session.startTransaction();
}
const organization = await Organization.findByIdAndDelete(
organizationId
);
try {
const organization = await Organization.findByIdAndDelete(
organizationId,
{
session
}
if (!organization) throw ResourceNotFoundError();
await MembershipOrg.deleteMany({
organization: organization._id
});
await BotOrg.deleteMany({
organization: organization._id
});
await SSOConfig.deleteMany({
organization: organization._id
});
await Role.deleteMany({
organization: organization._id
});
await IncidentContactOrg.deleteMany({
organization: organization._id
});
await GitRisks.deleteMany({
organization: organization._id
});
await GitAppInstallationSession.deleteMany({
organization: organization._id
});
await GitAppOrganizationInstallation.deleteMany({
organization: organization._id
});
const workspaceIds = await Workspace.distinct("_id", {
organization: organization._id
});
await Workspace.deleteMany({
organization: organization._id
});
await Membership.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Key.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Bot.deleteMany({
workspace: {
$in: workspaceIds
}
});
await BotKey.deleteMany({
workspace: {
$in: workspaceIds
}
});
await SecretBlindIndexData.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Secret.deleteMany({
workspace: {
$in: workspaceIds
}
});
await SecretVersion.deleteMany({
workspace: {
$in: workspaceIds
}
});
await SecretSnapshot.deleteMany({
workspace: {
$in: workspaceIds
}
});
await SecretImport.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Folder.deleteMany({
workspace: {
$in: workspaceIds
}
});
await FolderVersion.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Webhook.deleteMany({
workspace: {
$in: workspaceIds
}
});
await TrustedIP.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Tag.deleteMany({
workspace: {
$in: workspaceIds
}
});
await IntegrationAuth.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Integration.deleteMany({
workspace: {
$in: workspaceIds
}
});
await ServiceToken.deleteMany({
workspace: {
$in: workspaceIds
}
});
await ServiceTokenData.deleteMany({
workspace: {
$in: workspaceIds
}
});
await ServiceTokenDataV3.deleteMany({
workspace: {
$in: workspaceIds
}
});
await ServiceTokenDataV3Key.deleteMany({
workspace: {
$in: workspaceIds
}
});
await AuditLog.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Log.deleteMany({
workspace: {
$in: workspaceIds
}
});
await Action.deleteMany({
workspace: {
$in: workspaceIds
}
});
await SecretApprovalPolicy.deleteMany({
workspace: {
$in: workspaceIds
}
});
await SecretApprovalRequest.deleteMany({
workspace: {
$in: workspaceIds
}
});
if (organization.customerId) {
// delete from stripe here
await licenseServerKeyRequest.delete(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${organization.customerId}`
);
if (!organization) throw ResourceNotFoundError();
await MembershipOrg.deleteMany({
organization: organization._id
}, {
session
});
await BotOrg.deleteMany({
organization: organization._id
}, {
session
});
await SSOConfig.deleteMany({
organization: organization._id
}, {
session
});
await Role.deleteMany({
organization: organization._id
}, {
session
});
await IncidentContactOrg.deleteMany({
organization: organization._id
}, {
session
});
await GitRisks.deleteMany({
organization: organization._id
}, {
session
});
await GitAppInstallationSession.deleteMany({
organization: organization._id
}, {
session
});
await GitAppOrganizationInstallation.deleteMany({
organization: organization._id
}, {
session
});
const workspaceIds = await Workspace.distinct("_id", {
organization: organization._id
});
await Workspace.deleteMany({
organization: organization._id
}, {
session
});
await Membership.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Key.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Bot.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await BotKey.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await SecretBlindIndexData.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Secret.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await SecretVersion.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await SecretSnapshot.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await SecretImport.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Folder.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await FolderVersion.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Webhook.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await TrustedIP.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Tag.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await IntegrationAuth.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Integration.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await ServiceToken.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await ServiceTokenData.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await ServiceTokenDataV3.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await ServiceTokenDataV3Key.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await AuditLog.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Log.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await Action.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await SecretApprovalPolicy.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
await SecretApprovalRequest.deleteMany({
workspace: {
$in: workspaceIds
}
}, {
session
});
if (organization.customerId) {
// delete from stripe here
await licenseServerKeyRequest.delete(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${organization.customerId}`
);
}
return organization;
} catch (err) {
if (!existingSession) {
await session.abortTransaction();
}
throw InternalServerError({
message: "Failed to delete organization"
});
} finally {
if (!existingSession) {
await session.commitTransaction();
session.endSession();
}
}
return organization;
}
/**

View File

@ -553,14 +553,22 @@ export const getSecretsHelper = async ({
workspaceId,
environment,
authData,
folderId,
secretPath = "/"
}: GetSecretsParams) => {
let secrets: ISecret[] = [];
// if using service token filter towards the folderId by secretpath
if (!folderId) {
folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
const folders = await Folder.findOne({
workspace: workspaceId,
environment
});
let folderId = "root";
if (!folders && folderId !== "root") return [];
// get folder from folder tree
if (folders) {
const folder = getFolderByPath(folders.nodes, secretPath);
if (!folder) return [];
folderId = folder?.id;
}
// get personal secrets first
@ -1761,6 +1769,22 @@ export const deleteSecretBatchHelper = async ({
secretIds: deletedSecrets.map((secret) => secret._id)
});
const action = await EELogService.createAction({
name: ACTION_DELETE_SECRETS,
...getAuthDataPayloadIdObj(authData),
workspaceId,
secretIds: deletedSecrets.map((secret) => secret._id)
});
action &&
(await EELogService.createLog({
...getAuthDataPayloadIdObj(authData),
workspaceId,
actions: [action],
channel: authData.userAgentType,
ipAddress: authData.ipAddress
}));
await EEAuditLogService.createAuditLog(
authData,
{

View File

@ -1,4 +1,4 @@
import mongoose, { Types, mongo } from "mongoose";
import { Types } from "mongoose";
import {
APIKeyData,
BackupPrivateKey,
@ -222,141 +222,92 @@ const checkDeleteUserConditions = async ({
* @returns {User} user - deleted user
*/
export const deleteUser = async ({
userId,
existingSession
userId
}: {
userId: Types.ObjectId;
existingSession?: mongo.ClientSession;
}) => {
const user = await User.findByIdAndDelete(userId);
let session;
if (!user) throw ResourceNotFoundError();
await checkDeleteUserConditions({
userId: user._id
});
if (existingSession) {
session = existingSession;
} else {
session = await mongoose.startSession();
session.startTransaction();
await UserAction.deleteMany({
user: user._id
});
await BackupPrivateKey.deleteMany({
user: user._id
});
await APIKeyData.deleteMany({
user: user._id
});
await Action.deleteMany({
user: user._id
});
await Log.deleteMany({
user: user._id
});
await TokenVersion.deleteMany({
user: user._id
});
await Key.deleteMany({
receiver: user._id
});
const membershipOrgs = await MembershipOrg.find({
user: userId
});
// delete organizations where user is only member
for await (const membershipOrg of membershipOrgs) {
const memberCount = await MembershipOrg.countDocuments({
organization: membershipOrg.organization
});
if (memberCount === 1) {
// organization only has 1 member (the current user)
await deleteOrganization({
organizationId: membershipOrg.organization
});
}
}
try {
const user = await User.findByIdAndDelete(userId, {
session
const memberships = await Membership.find({
user: userId
});
// delete workspaces where user is only member
for await (const membership of memberships) {
const memberCount = await Membership.countDocuments({
workspace: membership.workspace
});
if (!user) throw ResourceNotFoundError();
if (memberCount === 1) {
// workspace only has 1 member (the current user) -> delete workspace
await checkDeleteUserConditions({
userId: user._id
});
await UserAction.deleteMany({
user: user._id
}, {
session
});
await BackupPrivateKey.deleteMany({
user: user._id
}, {
session
});
await APIKeyData.deleteMany({
user: user._id
}, {
session
});
await Action.deleteMany({
user: user._id
}, {
session
});
await Log.deleteMany({
user: user._id
}, {
session
});
await TokenVersion.deleteMany({
user: user._id
});
await Key.deleteMany({
receiver: user._id
}, {
session
});
const membershipOrgs = await MembershipOrg.find({
user: userId
}, null, {
session
});
// delete organizations where user is only member
for await (const membershipOrg of membershipOrgs) {
const memberCount = await MembershipOrg.countDocuments({
organization: membershipOrg.organization
await deleteWorkspace({
workspaceId: membership.workspace
});
if (memberCount === 1) {
// organization only has 1 member (the current user)
await deleteOrganization({
organizationId: membershipOrg.organization,
existingSession: session
});
}
}
const memberships = await Membership.find({
user: userId
}, null, {
session
});
// delete workspaces where user is only member
for await (const membership of memberships) {
const memberCount = await Membership.countDocuments({
workspace: membership.workspace
});
if (memberCount === 1) {
// workspace only has 1 member (the current user) -> delete workspace
await deleteWorkspace({
workspaceId: membership.workspace,
existingSession: session
});
}
}
await MembershipOrg.deleteMany({
user: userId
}, {
session
});
await Membership.deleteMany({
user: userId
}, {
session
});
return user;
} catch (err) {
if (!existingSession) {
await session.abortTransaction();
}
throw InternalServerError({
message: "Failed to delete account"
})
} finally {
if (!existingSession) {
await session.commitTransaction();
session.endSession();
}
}
await MembershipOrg.deleteMany({
user: userId
});
await Membership.deleteMany({
user: userId
});
return user;
}

View File

@ -1,4 +1,4 @@
import mongoose, { Types, mongo } from "mongoose";
import { Types } from "mongoose";
import {
Bot,
BotKey,
@ -33,8 +33,7 @@ import {
import { createBot } from "../helpers/bot";
import { EELicenseService } from "../ee/services";
import { SecretService } from "../services";
import {
InternalServerError,
import {
ResourceNotFoundError
} from "../utils/errors";
@ -102,189 +101,113 @@ export const createWorkspace = async ({
* @param {String} obj.id - id of workspace to delete
*/
export const deleteWorkspace = async ({
workspaceId,
existingSession
workspaceId
}: {
workspaceId: Types.ObjectId;
existingSession?: mongo.ClientSession;
}) => {
let session;
if (existingSession) {
session = existingSession;
} else {
session = await mongoose.startSession();
session.startTransaction();
}
const workspace = await Workspace.findByIdAndDelete(workspaceId);
try {
const workspace = await Workspace.findByIdAndDelete(workspaceId, { session });
if (!workspace) throw ResourceNotFoundError();
await Membership.deleteMany({
workspace: workspace._id
}, {
session
});
await Key.deleteMany({
workspace: workspace._id
}, {
session
});
await Bot.deleteMany({
workspace: workspace._id
}, {
session
});
if (!workspace) throw ResourceNotFoundError();
await Membership.deleteMany({
workspace: workspace._id
});
await Key.deleteMany({
workspace: workspace._id
});
await Bot.deleteMany({
workspace: workspace._id
});
await BotKey.deleteMany({
workspace: workspace._id
}, {
session
});
await BotKey.deleteMany({
workspace: workspace._id
});
await SecretBlindIndexData.deleteMany({
workspace: workspace._id
}, {
session
});
await SecretBlindIndexData.deleteMany({
workspace: workspace._id
});
await Secret.deleteMany({
workspace: workspace._id
}, {
session
});
await SecretVersion.deleteMany({
workspace: workspace._id
}, {
session
});
await Secret.deleteMany({
workspace: workspace._id
});
await SecretVersion.deleteMany({
workspace: workspace._id
});
await SecretSnapshot.deleteMany({
workspace: workspace._id
}, {
session
});
await SecretSnapshot.deleteMany({
workspace: workspace._id
});
await SecretImport.deleteMany({
workspace: workspace._id
}, {
session
});
await SecretImport.deleteMany({
workspace: workspace._id
});
await Folder.deleteMany({
workspace: workspace._id
}, {
session
});
await Folder.deleteMany({
workspace: workspace._id
});
await FolderVersion.deleteMany({
workspace: workspace._id
}, {
session
});
await FolderVersion.deleteMany({
workspace: workspace._id
});
await Webhook.deleteMany({
workspace: workspace._id
}, {
session
});
await Webhook.deleteMany({
workspace: workspace._id
});
await TrustedIP.deleteMany({
workspace: workspace._id
}, {
session
});
await TrustedIP.deleteMany({
workspace: workspace._id
});
await Tag.deleteMany({
workspace: workspace._id
}, {
session
});
await Tag.deleteMany({
workspace: workspace._id
});
await IntegrationAuth.deleteMany({
workspace: workspace._id
}, {
session
});
await IntegrationAuth.deleteMany({
workspace: workspace._id
});
await Integration.deleteMany({
workspace: workspace._id
}, {
session
});
await Integration.deleteMany({
workspace: workspace._id
});
await ServiceToken.deleteMany({
workspace: workspace._id
}, {
session
});
await ServiceToken.deleteMany({
workspace: workspace._id
});
await ServiceTokenData.deleteMany({
workspace: workspace._id
}, {
session
});
await ServiceTokenData.deleteMany({
workspace: workspace._id
});
await ServiceTokenDataV3.deleteMany({
workspace: workspace._id
}, {
session
});
await ServiceTokenDataV3.deleteMany({
workspace: workspace._id
});
await ServiceTokenDataV3Key.deleteMany({
workspace: workspace._id
}, {
session
});
await ServiceTokenDataV3Key.deleteMany({
workspace: workspace._id
});
await AuditLog.deleteMany({
workspace: workspace._id
}, {
session
});
await AuditLog.deleteMany({
workspace: workspace._id
});
await Log.deleteMany({
workspace: workspace._id
}, {
session
});
await Log.deleteMany({
workspace: workspace._id
});
await Action.deleteMany({
workspace: workspace._id
}, {
session
});
await Action.deleteMany({
workspace: workspace._id
});
await SecretApprovalPolicy.deleteMany({
workspace: workspace._id
}, {
session
});
await SecretApprovalPolicy.deleteMany({
workspace: workspace._id
});
await SecretApprovalRequest.deleteMany({
workspace: workspace._id
}, {
session
});
return workspace;
} catch (err) {
if (!existingSession) {
await session.abortTransaction();
}
throw InternalServerError({
message: "Failed to delete organization"
});
} finally {
if (!existingSession) {
await session.commitTransaction();
session.endSession();
}
}
await SecretApprovalRequest.deleteMany({
workspace: workspace._id
});
return workspace;
};

View File

@ -5,6 +5,8 @@ import express from "express";
require("express-async-errors");
import helmet from "helmet";
import cors from "cors";
import { logger } from "./utils/logging";
import httpLogger from "pino-http";
import { DatabaseService } from "./services";
import { EELicenseService, GithubSecretScanningService } from "./ee/services";
import { setUpHealthEndpoint } from "./services/health";
@ -25,10 +27,13 @@ import {
users as eeUsersRouter,
workspace as eeWorkspaceRouter,
roles as v1RoleRouter,
secretApprovalPolicy as v1SecretApprovalPolicy,
secretApprovalRequest as v1SecretApprovalRequest,
secretApprovalPolicy as v1SecretApprovalPolicyRouter,
secretApprovalRequest as v1SecretApprovalRequestRouter,
secretRotation as v1SecretRotation,
secretRotationProvider as v1SecretRotationProviderRouter,
secretScanning as v1SecretScanningRouter
} from "./ee/routes/v1";
import { apiKeyData as v3apiKeyDataRouter } from "./ee/routes/v3";
import { serviceTokenData as v3ServiceTokenDataRouter } from "./ee/routes/v3";
import {
auth as v1AuthRouter,
@ -68,10 +73,11 @@ import {
auth as v3AuthRouter,
secrets as v3SecretsRouter,
signup as v3SignupRouter,
users as v3UsersRouter,
workspaces as v3WorkspacesRouter
} from "./routes/v3";
import { healthCheck } from "./routes/status";
import { getLogger } from "./utils/logger";
// import { getLogger } from "./utils/logger";
import { RouteNotFoundError } from "./utils/errors";
import { requestErrorHandler } from "./middleware/requestErrorHandler";
import {
@ -81,7 +87,7 @@ import {
getSecretScanningPrivateKey,
getSecretScanningWebhookProxy,
getSecretScanningWebhookSecret,
getSiteURL
getSiteURL,
} from "./config";
import { setup } from "./utils/setup";
import { syncSecretsToThirdPartyServices } from "./queues/integrations/syncSecretsToThirdPartyServices";
@ -92,12 +98,20 @@ import path from "path";
let handler: null | any = null;
const main = async () => {
const port = await getPort();
await setup();
await EELicenseService.initGlobalFeatureSet();
const app = express();
app.enable("trust proxy");
app.use(httpLogger({
logger,
autoLogging: false
}));
app.use(express.json());
app.use(express.urlencoded({ extended: false }));
app.use(cookieParser());
@ -162,7 +176,7 @@ const main = async () => {
const nextApp = new NextServer({
dev: false,
dir: nextJsBuildPath,
port: await getPort(),
port,
conf,
hostname: "local",
customServer: false
@ -180,7 +194,10 @@ const main = async () => {
app.use("/api/v1/organizations", eeOrganizationsRouter);
app.use("/api/v1/sso", eeSSORouter);
app.use("/api/v1/cloud-products", eeCloudProductsRouter);
app.use("/api/v3/service-token", v3ServiceTokenDataRouter);
app.use("/api/v3/api-key", v3apiKeyDataRouter); // new
app.use("/api/v3/service-token", v3ServiceTokenDataRouter); // new
app.use("/api/v1/secret-rotation-providers", v1SecretRotationProviderRouter);
app.use("/api/v1/secret-rotations", v1SecretRotation);
// v1 routes
app.use("/api/v1/signup", v1SignupRouter);
@ -204,9 +221,9 @@ const main = async () => {
app.use("/api/v1/webhooks", v1WebhooksRouter);
app.use("/api/v1/secret-imports", v1SecretImpsRouter);
app.use("/api/v1/roles", v1RoleRouter);
app.use("/api/v1/secret-approvals", v1SecretApprovalPolicy);
app.use("/api/v1/secret-approvals", v1SecretApprovalPolicyRouter);
app.use("/api/v1/sso", v1SSORouter);
app.use("/api/v1/secret-approval-requests", v1SecretApprovalRequest);
app.use("/api/v1/secret-approval-requests", v1SecretApprovalRequestRouter);
// v2 routes (improvements)
app.use("/api/v2/signup", v2SignupRouter);
@ -226,6 +243,7 @@ const main = async () => {
app.use("/api/v3/secrets", v3SecretsRouter);
app.use("/api/v3/workspaces", v3WorkspacesRouter);
app.use("/api/v3/signup", v3SignupRouter);
app.use("/api/v3/users", v3UsersRouter);
// api docs
app.use("/api-docs", swaggerUi.serve, swaggerUi.setup(swaggerFile));
@ -251,8 +269,8 @@ const main = async () => {
app.use(requestErrorHandler);
const server = app.listen(await getPort(), async () => {
(await getLogger("backend-main")).info(`Server started listening at port ${await getPort()}`);
const server = app.listen(port, async () => {
logger.info(`Server started listening at port ${port}`);
});
// await createTestUserForDevelopment();
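The hunks above wire the shared pino instance into Express through pino-http with autoLogging disabled, so request-completion lines are not emitted automatically but every handler still gets a request-scoped req.log. A minimal sketch of just that wiring (the ./utils/logging import refers to the logger file added later in this diff; the route and port fallback are illustrative):

import express from "express";
import httpLogger from "pino-http";
import { logger } from "./utils/logging";

const app = express();

// Attach pino to every request; autoLogging: false suppresses the default
// per-request completion logs while req.log stays available in handlers.
app.use(httpLogger({ logger, autoLogging: false }));

app.get("/healthcheck", (req, res) => {
  req.log.info("healthcheck hit"); // child logger bound to this request
  res.json({ ok: true });
});

const port = Number(process.env.PORT) || 8080; // illustrative fallback
app.listen(port, () => logger.info(`Server started listening at port ${port}`));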

File diff suppressed because it is too large.

View File

@ -32,6 +32,8 @@ import {
INTEGRATION_GITLAB,
INTEGRATION_GITLAB_API_URL,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HASURA_CLOUD,
INTEGRATION_HASURA_CLOUD_API_URL,
INTEGRATION_HEROKU,
INTEGRATION_HEROKU_API_URL,
INTEGRATION_LARAVELFORGE,
@ -63,6 +65,10 @@ import { Octokit } from "@octokit/rest";
import _ from "lodash";
import sodium from "libsodium-wrappers";
import { standardRequest } from "../config/request";
import {
ZGetTenantEnv,
ZUpdateTenantEnv
} from "../validation/hasuraCloudIntegration";
const getSecretKeyValuePair = (
secrets: Record<string, { value: string | null; comment?: string } | null>
@ -95,7 +101,7 @@ const syncSecrets = async ({
secrets: Record<string, { value: string; comment?: string }>;
accessId: string | null;
accessToken: string;
appendices?: { prefix: string, suffix: string };
appendices?: { prefix: string; suffix: string };
}) => {
switch (integration.integration) {
case INTEGRATION_GCP_SECRET_MANAGER:
@ -306,6 +312,14 @@ const syncSecrets = async ({
accessToken
});
break;
case INTEGRATION_HASURA_CLOUD:
await syncSecretsHasuraCloud({
integration,
secrets,
accessToken
});
break;
}
};
@ -963,8 +977,9 @@ const syncSecretsVercel = async ({
: {}),
...(integration?.path
? {
gitBranch: integration?.path
} : {})
gitBranch: integration?.path
}
: {})
};
const vercelSecrets: VercelSecret[] = (
@ -992,7 +1007,7 @@ const syncSecretsVercel = async ({
return true;
});
const res: { [key: string]: VercelSecret } = {};
for await (const vercelSecret of vercelSecrets) {
@ -1352,7 +1367,7 @@ const syncSecretsGitHub = async ({
integration: IIntegration;
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
appendices?: { prefix: string, suffix: string };
appendices?: { prefix: string; suffix: string };
}) => {
interface GitHubRepoKey {
key_id: string;
@ -1395,14 +1410,23 @@ const syncSecretsGitHub = async ({
{}
);
encryptedSecrets = Object.keys(encryptedSecrets).reduce((result: {
[key: string]: GitHubSecret;
}, key) => {
if ((appendices?.prefix !== undefined ? key.startsWith(appendices?.prefix) : true) && (appendices?.suffix !== undefined ? key.endsWith(appendices?.suffix) : true)) {
result[key] = encryptedSecrets[key];
}
return result;
}, {});
encryptedSecrets = Object.keys(encryptedSecrets).reduce(
(
result: {
[key: string]: GitHubSecret;
},
key
) => {
if (
(appendices?.prefix !== undefined ? key.startsWith(appendices?.prefix) : true) &&
(appendices?.suffix !== undefined ? key.endsWith(appendices?.suffix) : true)
) {
result[key] = encryptedSecrets[key];
}
return result;
},
{}
);
Object.keys(encryptedSecrets).map(async (key) => {
if (!(key in secrets)) {
@ -2080,7 +2104,7 @@ const syncSecretsSupabase = async ({
};
/**
* Sync/push [secrets] to Checkly app
* Sync/push [secrets] to Checkly app/group
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
@ -2095,87 +2119,156 @@ const syncSecretsCheckly = async ({
integration: IIntegration;
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
appendices?: { prefix: string, suffix: string };
appendices?: { prefix: string; suffix: string };
}) => {
let getSecretsRes = (
await standardRequest.get(`${INTEGRATION_CHECKLY_API_URL}/v1/variables`, {
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
"X-Checkly-Account": integration.appId
}
})
).data.reduce(
(obj: any, secret: any) => ({
...obj,
[secret.key]: secret.value
}),
{}
);
getSecretsRes = Object.keys(getSecretsRes).reduce((result: {
[key: string]: string;
}, key) => {
if ((appendices?.prefix !== undefined ? key.startsWith(appendices?.prefix) : true) && (appendices?.suffix !== undefined ? key.endsWith(appendices?.suffix) : true)) {
result[key] = getSecretsRes[key];
}
return result;
}, {});
if (integration.targetServiceId) {
// sync secrets to checkly group envars
// add secrets
for await (const key of Object.keys(secrets)) {
if (!(key in getSecretsRes)) {
// case: secret does not exist in checkly
// -> add secret
await standardRequest.post(
`${INTEGRATION_CHECKLY_API_URL}/v1/variables`,
{
key,
value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"Content-Type": "application/json",
"X-Checkly-Account": integration.appId
}
}
);
} else {
// case: secret exists in checkly
// -> update/set secret
if (secrets[key].value !== getSecretsRes[key]) {
await standardRequest.put(
`${INTEGRATION_CHECKLY_API_URL}/v1/variables/${key}`,
{
value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/json",
Accept: "application/json",
"X-Checkly-Account": integration.appId
}
}
);
}
}
}
for await (const key of Object.keys(getSecretsRes)) {
if (!(key in secrets)) {
// delete secret
await standardRequest.delete(`${INTEGRATION_CHECKLY_API_URL}/v1/variables/${key}`, {
let getGroupSecretsRes = (
await standardRequest.get(`${INTEGRATION_CHECKLY_API_URL}/v1/check-groups/${integration.targetServiceId}`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"X-Checkly-Account": integration.appId
}
});
})
).data.environmentVariables.reduce(
(obj: any, secret: any) => ({
...obj,
[secret.key]: secret.value
}),
{}
);
getGroupSecretsRes = Object.keys(getGroupSecretsRes).reduce(
(
result: {
[key: string]: string;
},
key
) => {
if (
(appendices?.prefix !== undefined ? key.startsWith(appendices?.prefix) : true) &&
(appendices?.suffix !== undefined ? key.endsWith(appendices?.suffix) : true)
) {
result[key] = getGroupSecretsRes[key];
}
return result;
},
{}
);
const groupEnvironmentVariables = Object.keys(secrets).map(key => ({
key,
value: secrets[key].value
}));
await standardRequest.put(
`${INTEGRATION_CHECKLY_API_URL}/v1/check-groups/${integration.targetServiceId}`,
{
environmentVariables: groupEnvironmentVariables
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"X-Checkly-Account": integration.appId
}
}
);
} else {
// sync secrets to checkly global envars
let getSecretsRes = (
await standardRequest.get(`${INTEGRATION_CHECKLY_API_URL}/v1/variables`, {
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
"X-Checkly-Account": integration.appId
}
})
).data.reduce(
(obj: any, secret: any) => ({
...obj,
[secret.key]: secret.value
}),
{}
);
getSecretsRes = Object.keys(getSecretsRes).reduce(
(
result: {
[key: string]: string;
},
key
) => {
if (
(appendices?.prefix !== undefined ? key.startsWith(appendices?.prefix) : true) &&
(appendices?.suffix !== undefined ? key.endsWith(appendices?.suffix) : true)
) {
result[key] = getSecretsRes[key];
}
return result;
},
{}
);
// add secrets
for await (const key of Object.keys(secrets)) {
if (!(key in getSecretsRes)) {
// case: secret does not exist in checkly
// -> add secret
await standardRequest.post(
`${INTEGRATION_CHECKLY_API_URL}/v1/variables`,
{
key,
value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"Content-Type": "application/json",
"X-Checkly-Account": integration.appId
}
}
);
} else {
// case: secret exists in checkly
// -> update/set secret
if (secrets[key].value !== getSecretsRes[key]) {
await standardRequest.put(
`${INTEGRATION_CHECKLY_API_URL}/v1/variables/${key}`,
{
value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/json",
Accept: "application/json",
"X-Checkly-Account": integration.appId
}
}
);
}
}
}
for await (const key of Object.keys(getSecretsRes)) {
if (!(key in secrets)) {
// delete secret
await standardRequest.delete(`${INTEGRATION_CHECKLY_API_URL}/v1/variables/${key}`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"X-Checkly-Account": integration.appId
}
});
}
}
}
};
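When integration.targetServiceId is set, the branch above does not create or delete individual variables: it reads the group's current environmentVariables (filtered through the optional prefix/suffix appendices) and then replaces the whole list with a single PUT of the desired set. A condensed sketch of that group path, using plain axios in place of the project's standardRequest wrapper; the base URL is the assumed value of INTEGRATION_CHECKLY_API_URL and error handling is omitted:

import axios from "axios";

const CHECKLY_API_URL = "https://api.checklyhq.com"; // assumed value of INTEGRATION_CHECKLY_API_URL

// Sketch: replace a Checkly check-group's environment variables wholesale.
export const syncChecklyGroupEnvVars = async (
  accountId: string, // Checkly account id (integration.appId above)
  groupId: string, // check-group id (integration.targetServiceId above)
  accessToken: string,
  secrets: Record<string, { value: string }>
) => {
  const environmentVariables = Object.keys(secrets).map((key) => ({
    key,
    value: secrets[key].value
  }));

  // One PUT sets the full group-level variable list, so keys removed from
  // Infisical simply drop out of the payload instead of needing DELETE calls.
  await axios.put(
    `${CHECKLY_API_URL}/v1/check-groups/${groupId}`,
    { environmentVariables },
    {
      headers: {
        Authorization: `Bearer ${accessToken}`,
        Accept: "application/json",
        "X-Checkly-Account": accountId
      }
    }
  );
};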
@ -2195,18 +2288,20 @@ const syncSecretsQovery = async ({
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
}) => {
const getSecretsRes = (
await standardRequest.get(`${INTEGRATION_QOVERY_API_URL}/${integration.scope}/${integration.appId}/environmentVariable`, {
headers: {
Authorization: `Token ${accessToken}`,
"Accept-Encoding": "application/json"
await standardRequest.get(
`${INTEGRATION_QOVERY_API_URL}/${integration.scope}/${integration.appId}/environmentVariable`,
{
headers: {
Authorization: `Token ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
})
)
).data.results.reduce(
(obj: any, secret: any) => ({
...obj,
[secret.key]: {"id": secret.id, "value": secret.value}
[secret.key]: { id: secret.id, value: secret.value }
}),
{}
);
@ -3076,4 +3171,111 @@ const syncSecretsNorthflank = async ({
);
};
/** Sync/push [secrets] to Hasura Cloud
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for Hasura Cloud integration
*/
const syncSecretsHasuraCloud = async ({
integration,
secrets,
accessToken
}: {
integration: IIntegration;
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
}) => {
const res = await standardRequest.post(
INTEGRATION_HASURA_CLOUD_API_URL,
{
query:
"query MyQuery($tenantId: uuid!) { getTenantEnv(tenantId: $tenantId) { hash envVars } }",
variables: {
tenantId: integration.appId
}
},
{
headers: {
Authorization: `pat ${accessToken}`,
"Content-Type": "application/json"
}
}
);
const {
data: {
getTenantEnv: { hash, envVars }
}
} = ZGetTenantEnv.parse(res.data);
let currentHash = hash;
const secretsToUpdate = Object.keys(secrets).map((key) => {
return ({
key,
value: secrets[key].value
});
});
if (secretsToUpdate.length) {
// update secrets
const addRequest = await standardRequest.post(
INTEGRATION_HASURA_CLOUD_API_URL,
{
query:
"mutation MyQuery($currentHash: String!, $envs: [UpdateEnvObject!]!, $tenantId: uuid!) { updateTenantEnv(currentHash: $currentHash, envs: $envs, tenantId: $tenantId) { hash envVars} }",
variables: {
currentHash,
envs: secretsToUpdate,
tenantId: integration.appId
}
},
{
headers: {
Authorization: `pat ${accessToken}`,
"Content-Type": "application/json"
}
}
);
const addRequestResponse = ZUpdateTenantEnv.safeParse(addRequest.data);
if (addRequestResponse.success) {
currentHash = addRequestResponse.data.data.updateTenantEnv.hash;
}
}
const secretsToDelete = envVars.environment
? Object.keys(envVars.environment).filter((key) => !(key in secrets))
: [];
if (secretsToDelete.length) {
await standardRequest.post(
INTEGRATION_HASURA_CLOUD_API_URL,
{
query: `
mutation deleteTenantEnv($id: uuid!, $currentHash: String!, $env: [String!]!) {
deleteTenantEnv(tenantId: $id, currentHash: $currentHash, deleteEnvs: $env) {
hash
envVars
}
}
`,
variables: {
id: integration.appId,
currentHash,
env: secretsToDelete
}
},
{
headers: {
Authorization: `pat ${accessToken}`,
"Content-Type": "application/json"
}
}
);
}
};
export { syncSecrets };
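The new Hasura Cloud provider drives everything through one GraphQL endpoint: read the tenant's current env hash and variables, upsert the desired keys with updateTenantEnv using that hash as an optimistic-concurrency token, then delete any keys Infisical no longer tracks using the hash returned by the update. A standalone sketch of the same round trip, with plain axios standing in for standardRequest and the response shapes accessed directly instead of through the zod schemas above:

import axios from "axios";

const HASURA_CLOUD_API_URL = "https://data.pro.hasura.io/v1/graphql";

// Sketch of the Hasura Cloud tenant-env round trip used by syncSecretsHasuraCloud.
export const pushSecretsToHasuraCloud = async (
  tenantId: string, // integration.appId above
  accessToken: string, // Hasura Cloud personal access token
  secrets: Record<string, string>
) => {
  const headers = { Authorization: `pat ${accessToken}`, "Content-Type": "application/json" };

  // 1. Read the current hash + env vars; the hash is required by every mutation.
  const getRes = await axios.post(
    HASURA_CLOUD_API_URL,
    {
      query:
        "query GetEnv($tenantId: uuid!) { getTenantEnv(tenantId: $tenantId) { hash envVars } }",
      variables: { tenantId }
    },
    { headers }
  );
  let currentHash: string = getRes.data.data.getTenantEnv.hash;
  const envVars = getRes.data.data.getTenantEnv.envVars;

  // 2. Upsert the desired keys; the mutation hands back a fresh hash.
  const envs = Object.entries(secrets).map(([key, value]) => ({ key, value }));
  if (envs.length) {
    const updateRes = await axios.post(
      HASURA_CLOUD_API_URL,
      {
        query:
          "mutation UpdateEnv($currentHash: String!, $envs: [UpdateEnvObject!]!, $tenantId: uuid!) { updateTenantEnv(currentHash: $currentHash, envs: $envs, tenantId: $tenantId) { hash envVars } }",
        variables: { currentHash, envs, tenantId }
      },
      { headers }
    );
    currentHash = updateRes.data.data.updateTenantEnv.hash;
  }

  // 3. Delete keys present on the tenant but no longer in Infisical.
  const toDelete = envVars?.environment
    ? Object.keys(envVars.environment).filter((key) => !(key in secrets))
    : [];
  if (toDelete.length) {
    await axios.post(
      HASURA_CLOUD_API_URL,
      {
        query:
          "mutation DeleteEnv($id: uuid!, $currentHash: String!, $env: [String!]!) { deleteTenantEnv(tenantId: $id, currentHash: $currentHash, deleteEnvs: $env) { hash envVars } }",
        variables: { id: tenantId, currentHash, env: toDelete }
      },
      { headers }
    );
  }
};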

View File

@ -1,39 +1,27 @@
import { Types } from "mongoose";
import {
IServiceTokenData,
IServiceTokenDataV3,
IUser,
} from "../../models";
import {
ServiceActor,
ServiceActorV3,
UserActor,
UserAgentType
} from "../../ee/models";
import { IServiceTokenData, IServiceTokenDataV3, IUser } from "../../models";
import { ServiceActor, ServiceActorV3, UserActor, UserAgentType } from "../../ee/models";
interface BaseAuthData {
ipAddress: string;
userAgent: string;
userAgentType: UserAgentType;
tokenVersionId?: Types.ObjectId;
ipAddress: string;
userAgent: string;
userAgentType: UserAgentType;
tokenVersionId?: Types.ObjectId;
}
export interface UserAuthData extends BaseAuthData {
actor: UserActor;
authPayload: IUser;
actor: UserActor;
authPayload: IUser;
}
export interface ServiceTokenV3AuthData extends BaseAuthData {
actor: ServiceActorV3;
authPayload: IServiceTokenDataV3;
actor: ServiceActorV3;
authPayload: IServiceTokenDataV3;
}
export interface ServiceTokenAuthData extends BaseAuthData {
actor: ServiceActor;
authPayload: IServiceTokenData;
actor: ServiceActor;
authPayload: IServiceTokenData;
}
export type AuthData =
| UserAuthData
| ServiceTokenV3AuthData
| ServiceTokenAuthData;
export type AuthData = UserAuthData | ServiceTokenV3AuthData | ServiceTokenAuthData;

View File

@ -26,7 +26,6 @@ export interface CreateSecretParams {
export interface GetSecretsParams {
workspaceId: Types.ObjectId;
environment: string;
folderId?: string;
secretPath: string;
authData: AuthData;
}

View File

@ -2,49 +2,45 @@ import * as Sentry from "@sentry/node";
import { ErrorRequestHandler } from "express";
import { TokenExpiredError } from "jsonwebtoken";
import { InternalServerError, UnauthorizedRequestError } from "../utils/errors";
import { getLogger } from "../utils/logger";
import RequestError from "../utils/requestError";
import { logger } from "../utils/logging";
import RequestError, { mapToPinoLogLevel } from "../utils/requestError";
import { ForbiddenError } from "@casl/ability";
export const requestErrorHandler: ErrorRequestHandler = async (
error: RequestError | Error,
err: RequestError | Error,
req,
res,
next
) => {
if (res.headersSent) return next();
const logAndCaptureException = async (error: RequestError) => {
(await getLogger("backend-main")).log(
(<RequestError>error).levelName.toLowerCase(),
`${error.stack}\n${error.message}`
);
//* Set Sentry user identification if req.user is populated
if (req.user !== undefined && req.user !== null) {
Sentry.setUser({ email: (req.user as any).email });
}
Sentry.captureException(error);
};
if (error instanceof RequestError) {
if (error instanceof TokenExpiredError) {
error = UnauthorizedRequestError({ stack: error.stack, message: "Token expired" });
}
await logAndCaptureException((<RequestError>error));
} else {
if (error instanceof ForbiddenError) {
error = UnauthorizedRequestError({ context: { exception: error.message }, stack: error.stack })
} else {
error = InternalServerError({ context: { exception: error.message }, stack: error.stack });
}
await logAndCaptureException((<RequestError>error));
let error: RequestError;
switch (true) {
case err instanceof TokenExpiredError:
error = UnauthorizedRequestError({ stack: err.stack, message: "Token expired" });
break;
case err instanceof ForbiddenError:
error = UnauthorizedRequestError({ context: { exception: err.message }, stack: err.stack })
break;
case err instanceof RequestError:
error = err as RequestError;
break;
default:
error = InternalServerError({ context: { exception: err.message }, stack: err.stack });
break;
}
logger[mapToPinoLogLevel(error.level)](error);
if (req.user) {
Sentry.setUser({ email: (req.user as any).email });
}
Sentry.captureException(error);
delete (<any>error).stacktrace // remove stack trace from being sent to client
res.status((<RequestError>error).statusCode).json(error);
res.status((<RequestError>error).statusCode).json(error); // revise json part here
next();
};

View File

@ -2,7 +2,8 @@ import jwt from "jsonwebtoken";
import { NextFunction, Request, Response } from "express";
import { User } from "../models";
import { BadRequestError, UnauthorizedRequestError } from "../utils/errors";
import { getJwtMfaSecret } from "../config";
import { getAuthSecret } from "../config";
import { AuthTokenType } from "../variables";
declare module "jsonwebtoken" {
export interface UserIDJwtPayload extends jwt.JwtPayload {
@ -26,8 +27,10 @@ const requireMfaAuth = async (
if(AUTH_TOKEN_VALUE === null) return next(BadRequestError({message: "Missing Authorization Body in the request header"}))
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(AUTH_TOKEN_VALUE, await getJwtMfaSecret())
jwt.verify(AUTH_TOKEN_VALUE, await getAuthSecret())
);
if (decodedToken.authTokenType !== AuthTokenType.MFA_TOKEN) throw UnauthorizedRequestError();
const user = await User.findOne({
_id: decodedToken.userId,

View File

@ -2,7 +2,8 @@ import jwt from "jsonwebtoken";
import { NextFunction, Request, Response } from "express";
import { User } from "../models";
import { BadRequestError, UnauthorizedRequestError } from "../utils/errors";
import { getJwtSignupSecret } from "../config";
import { getAuthSecret } from "../config";
import { AuthTokenType } from "../variables";
declare module "jsonwebtoken" {
export interface UserIDJwtPayload extends jwt.JwtPayload {
@ -27,8 +28,10 @@ const requireSignupAuth = async (
if(AUTH_TOKEN_VALUE === null) return next(BadRequestError({message: "Missing Authorization Body in the request header"}))
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(AUTH_TOKEN_VALUE, await getJwtSignupSecret())
jwt.verify(AUTH_TOKEN_VALUE, await getAuthSecret())
);
if (decodedToken.authTokenType !== AuthTokenType.SIGNUP_TOKEN) throw UnauthorizedRequestError();
const user = await User.findOne({
_id: decodedToken.userId,

View File

@ -0,0 +1,38 @@
import { Document, Schema, Types, model } from "mongoose";
export interface IAPIKeyDataV2 extends Document {
_id: Types.ObjectId;
name: string;
user: Types.ObjectId;
lastUsed?: Date
usageCount: number;
expiresAt?: Date;
}
const apiKeyDataV2Schema = new Schema(
{
name: {
type: String,
required: true
},
user: {
type: Schema.Types.ObjectId,
ref: "User",
required: true
},
lastUsed: {
type: Date,
required: false
},
usageCount: {
type: Number,
default: 0,
required: true
}
},
{
timestamps: true
}
);
export const APIKeyDataV2 = model<IAPIKeyDataV2>("APIKeyDataV2", apiKeyDataV2Schema);

View File

@ -24,9 +24,10 @@ export * from "./user";
export * from "./userAction";
export * from "./workspace";
export * from "./serviceTokenData"; // TODO: deprecate
export * from "./apiKeyData";
export * from "./serviceTokenDataV3";
export * from "./serviceTokenDataV3Key";
export * from "./apiKeyData"; // TODO: deprecate
export * from "./apiKeyDataV2";
export * from "./loginSRPDetail";
export * from "./tokenVersion";
export * from "./webhooks";
export * from "./serviceTokenDataV3";
export * from "./serviceTokenDataV3Key";

View File

@ -14,6 +14,7 @@ import {
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HASURA_CLOUD,
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
@ -76,7 +77,8 @@ export interface IIntegration {
| "cloud-66"
| "northflank"
| "windmill"
| "gcp-secret-manager";
| "gcp-secret-manager"
| "hasura-cloud";
integrationAuth: Types.ObjectId;
metadata: Metadata;
}
@ -86,67 +88,67 @@ const integrationSchema = new Schema<IIntegration>(
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace",
required: true,
required: true
},
environment: {
type: String,
required: true,
required: true
},
isActive: {
type: Boolean,
required: true,
required: true
},
url: {
// for custom self-hosted integrations (e.g. self-hosted GitHub enterprise)
type: String,
default: null,
default: null
},
app: {
// name of app in provider
type: String,
default: null,
default: null
},
appId: {
// id of app in provider
type: String,
default: null,
default: null
},
targetEnvironment: {
// target environment
type: String,
default: null,
default: null
},
targetEnvironmentId: {
type: String,
default: null,
default: null
},
targetService: {
// railway-specific service
// qovery-specific project
type: String,
default: null,
default: null
},
targetServiceId: {
// railway-specific service
// qovery specific project
type: String,
default: null,
default: null
},
owner: {
// github-specific repo owner-login
type: String,
default: null,
default: null
},
path: {
// aws-parameter-store-specific path
// (also) vercel preview-branch
type: String,
default: null,
default: null
},
region: {
// aws-parameter-store-specific path
type: String,
default: null,
default: null
},
scope: {
// qovery-specific scope
@ -183,19 +185,20 @@ const integrationSchema = new Schema<IIntegration>(
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK,
INTEGRATION_GCP_SECRET_MANAGER
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_HASURA_CLOUD
],
required: true,
required: true
},
integrationAuth: {
type: Schema.Types.ObjectId,
ref: "IntegrationAuth",
required: true,
required: true
},
secretPath: {
type: String,
required: true,
default: "/",
default: "/"
},
metadata: {
type: Schema.Types.Mixed,
@ -203,8 +206,8 @@ const integrationSchema = new Schema<IIntegration>(
}
},
{
timestamps: true,
timestamps: true
}
);
export const Integration = model<IIntegration>("Integration", integrationSchema);
export const Integration = model<IIntegration>("Integration", integrationSchema);

View File

@ -1,205 +1,203 @@
import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_BITBUCKET,
INTEGRATION_CIRCLECI,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CLOUD_66,
INTEGRATION_CODEFRESH,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_FLYIO,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
INTEGRATION_NORTHFLANK,
INTEGRATION_RAILWAY,
INTEGRATION_RENDER,
INTEGRATION_SUPABASE,
INTEGRATION_TEAMCITY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL,
INTEGRATION_WINDMILL
} from "../../variables";
import { Document, Schema, Types, model } from "mongoose";
import { IntegrationAuthMetadata } from "./types";
export interface IIntegrationAuth extends Document {
_id: Types.ObjectId;
workspace: Types.ObjectId;
integration:
| "heroku"
| "vercel"
| "netlify"
| "github"
| "gitlab"
| "render"
| "railway"
| "flyio"
| "azure-key-vault"
| "laravel-forge"
| "circleci"
| "travisci"
| "supabase"
| "aws-parameter-store"
| "aws-secret-manager"
| "checkly"
| "qovery"
| "cloudflare-pages"
| "codefresh"
| "digital-ocean-app-platform"
| "bitbucket"
| "cloud-66"
| "terraform-cloud"
| "teamcity"
| "northflank"
| "windmill"
| "gcp-secret-manager";
teamId: string;
accountId: string;
url: string;
namespace: string;
refreshCiphertext?: string;
refreshIV?: string;
refreshTag?: string;
accessIdCiphertext?: string;
accessIdIV?: string;
accessIdTag?: string;
accessCiphertext?: string;
accessIV?: string;
accessTag?: string;
algorithm?: "aes-256-gcm";
keyEncoding?: "utf8" | "base64";
accessExpiresAt?: Date;
metadata?: IntegrationAuthMetadata;
}
const integrationAuthSchema = new Schema<IIntegrationAuth>(
{
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace",
required: true,
},
integration: {
type: String,
enum: [
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_RENDER,
INTEGRATION_RAILWAY,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_TEAMCITY,
INTEGRATION_SUPABASE,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK,
INTEGRATION_GCP_SECRET_MANAGER
],
required: true,
},
teamId: {
// vercel-specific integration param
type: String,
},
url: {
// for any self-hosted integrations (e.g. self-hosted hashicorp-vault)
type: String,
},
namespace: {
// hashicorp-vault-specific integration param
type: String,
},
accountId: {
// netlify-specific integration param
type: String,
},
refreshCiphertext: {
type: String,
select: false,
},
refreshIV: {
type: String,
select: false,
},
refreshTag: {
type: String,
select: false,
},
accessIdCiphertext: {
type: String,
select: false,
},
accessIdIV: {
type: String,
select: false,
},
accessIdTag: {
type: String,
select: false,
},
accessCiphertext: {
type: String,
select: false,
},
accessIV: {
type: String,
select: false,
},
accessTag: {
type: String,
select: false,
},
accessExpiresAt: {
type: Date,
select: false,
},
algorithm: { // the encryption algorithm used
type: String,
enum: [ALGORITHM_AES_256_GCM],
required: true,
},
keyEncoding: {
type: String,
enum: [
ENCODING_SCHEME_UTF8,
ENCODING_SCHEME_BASE64,
],
required: true,
},
metadata: {
type: Schema.Types.Mixed
}
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_BITBUCKET,
INTEGRATION_CIRCLECI,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CLOUD_66,
INTEGRATION_CODEFRESH,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_FLYIO,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_HASURA_CLOUD,
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
INTEGRATION_NORTHFLANK,
INTEGRATION_RAILWAY,
INTEGRATION_RENDER,
INTEGRATION_SUPABASE,
INTEGRATION_TEAMCITY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL,
INTEGRATION_WINDMILL
} from "../../variables";
import { Document, Schema, Types, model } from "mongoose";
import { IntegrationAuthMetadata } from "./types";
export interface IIntegrationAuth extends Document {
_id: Types.ObjectId;
workspace: Types.ObjectId;
integration:
| "heroku"
| "vercel"
| "netlify"
| "github"
| "gitlab"
| "render"
| "railway"
| "flyio"
| "azure-key-vault"
| "laravel-forge"
| "circleci"
| "travisci"
| "supabase"
| "aws-parameter-store"
| "aws-secret-manager"
| "checkly"
| "qovery"
| "cloudflare-pages"
| "codefresh"
| "digital-ocean-app-platform"
| "bitbucket"
| "cloud-66"
| "terraform-cloud"
| "teamcity"
| "northflank"
| "windmill"
| "gcp-secret-manager"
| "hasura-cloud";
teamId: string;
accountId: string;
url: string;
namespace: string;
refreshCiphertext?: string;
refreshIV?: string;
refreshTag?: string;
accessIdCiphertext?: string;
accessIdIV?: string;
accessIdTag?: string;
accessCiphertext?: string;
accessIV?: string;
accessTag?: string;
algorithm?: "aes-256-gcm";
keyEncoding?: "utf8" | "base64";
accessExpiresAt?: Date;
metadata?: IntegrationAuthMetadata;
}
const integrationAuthSchema = new Schema<IIntegrationAuth>(
{
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace",
required: true
},
{
timestamps: true,
integration: {
type: String,
enum: [
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
INTEGRATION_AWS_SECRET_MANAGER,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_RENDER,
INTEGRATION_RAILWAY,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_TEAMCITY,
INTEGRATION_SUPABASE,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_HASURA_CLOUD
],
required: true
},
teamId: {
// vercel-specific integration param
type: String
},
url: {
// for any self-hosted integrations (e.g. self-hosted hashicorp-vault)
type: String
},
namespace: {
// hashicorp-vault-specific integration param
type: String
},
accountId: {
// netlify-specific integration param
type: String
},
refreshCiphertext: {
type: String,
select: false
},
refreshIV: {
type: String,
select: false
},
refreshTag: {
type: String,
select: false
},
accessIdCiphertext: {
type: String,
select: false
},
accessIdIV: {
type: String,
select: false
},
accessIdTag: {
type: String,
select: false
},
accessCiphertext: {
type: String,
select: false
},
accessIV: {
type: String,
select: false
},
accessTag: {
type: String,
select: false
},
accessExpiresAt: {
type: Date,
select: false
},
algorithm: {
// the encryption algorithm used
type: String,
enum: [ALGORITHM_AES_256_GCM],
required: true
},
keyEncoding: {
type: String,
enum: [ENCODING_SCHEME_UTF8, ENCODING_SCHEME_BASE64],
required: true
},
metadata: {
type: Schema.Types.Mixed
}
);
export const IntegrationAuth = model<IIntegrationAuth>(
"IntegrationAuth",
integrationAuthSchema
);
},
{
timestamps: true
}
);
export const IntegrationAuth = model<IIntegrationAuth>("IntegrationAuth", integrationAuthSchema);

View File

@ -54,6 +54,7 @@ const serviceTokenDataV3Schema = new Schema(
},
isActive: {
type: Boolean,
default: true,
required: true
},
lastUsed: {

View File

@ -65,13 +65,11 @@ const WebhookSchema = new Schema<IWebhook>(
// the encryption algorithm used
type: String,
enum: [ALGORITHM_AES_256_GCM],
required: true,
select: false
},
keyEncoding: {
type: String,
enum: [ENCODING_SCHEME_UTF8, ENCODING_SCHEME_BASE64],
required: true,
select: false
}
},
@ -80,4 +78,4 @@ const WebhookSchema = new Schema<IWebhook>(
}
);
export const Webhook = model<IWebhook>("Webhook", WebhookSchema);
export const Webhook = model<IWebhook>("Webhook", WebhookSchema);

View File

@ -60,6 +60,14 @@ router.get(
integrationAuthController.getIntegrationAuthVercelBranches
);
router.get(
"/:integrationAuthId/checkly/groups",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
integrationAuthController.getIntegrationAuthChecklyGroups
);
router.get(
"/:integrationAuthId/qovery/orgs",
requireAuth({

View File

@ -6,7 +6,7 @@ import {
import { AuthMode } from "../../variables";
import { serviceTokenDataController } from "../../controllers/v2";
router.get(
router.get( // TODO: deprecate (moving to ST V3)
"/",
requireAuth({
acceptedAuthModes: [AuthMode.SERVICE_TOKEN]
@ -14,7 +14,7 @@ router.get(
serviceTokenDataController.getServiceTokenData
);
router.post(
router.post( // TODO: deprecate (moving to ST V3)
"/",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
@ -22,7 +22,7 @@ router.post(
serviceTokenDataController.createServiceTokenData
);
router.delete(
router.delete( // TODO: deprecate (moving to ST V3)
"/:serviceTokenDataId",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
@ -30,4 +30,4 @@ router.delete(
serviceTokenDataController.deleteServiceTokenData
);
export default router;
export default router;

View File

@ -36,7 +36,7 @@ router.get(
usersController.getMyOrganizations
);
router.get(
router.get( // TODO: deprecate (moving to API Key V2)
"/me/api-keys",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]

View File

@ -1,10 +1,12 @@
import auth from "./auth";
import users from "./users";
import secrets from "./secrets";
import workspaces from "./workspaces";
import signup from "./signup";
export {
auth,
users,
secrets,
signup,
workspaces

View File

@ -0,0 +1,15 @@
import express from "express";
const router = express.Router();
import { requireAuth } from "../../middleware";
import { AuthMode } from "../../variables";
import { usersController } from "../../controllers/v3";
router.get(
"/me/api-keys",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
usersController.getMyAPIKeys
);
export default router;

View File

@ -1,11 +1,12 @@
import Redis, { Redis as TRedis } from "ioredis";
import { logger } from "../utils/logging";
let redisClient: TRedis | null;
if (process.env.REDIS_URL) {
redisClient = new Redis(process.env.REDIS_URL as string);
} else {
console.warn("Redis URL not set, skipping Redis initialization.");
logger.warn("Redis URL not set, skipping Redis initialization.");
redisClient = null;
}

View File

@ -1,5 +1,5 @@
import { PostHog } from "posthog-node";
import { getLogger } from "../utils/logger";
import { logger } from "../utils/logging";
import { AuthData } from "../interfaces/middleware";
import {
getNodeEnv,
@ -22,13 +22,13 @@ class Telemetry {
* Logs telemetry enable/disable notice.
*/
static logTelemetryMessage = async () => {
if(!(await getTelemetryEnabled())){
(await getLogger("backend-main")).info([
"",
[
"To improve, Infisical collects telemetry data about general usage.",
"This helps us understand how the product is doing and guide our product development to create the best possible platform; it also helps us demonstrate growth as we support Infisical as open-source software.",
"To opt into telemetry, you can set `TELEMETRY_ENABLED=true` within the environment variables.",
].join("\n"))
].forEach(line => logger.info(line));
}
}

View File

@ -2,26 +2,42 @@ import axios from "axios";
import crypto from "crypto";
import { Types } from "mongoose";
import picomatch from "picomatch";
import { client, getRootEncryptionKey } from "../config";
import { client, getEncryptionKey, getRootEncryptionKey } from "../config";
import { IWebhook, Webhook } from "../models";
import { decryptSymmetric128BitHexKeyUTF8 } from "../utils/crypto";
import { ENCODING_SCHEME_BASE64, ENCODING_SCHEME_UTF8 } from "../variables";
export const triggerWebhookRequest = async (
{ url, encryptedSecretKey, iv, tag }: IWebhook,
{ url, encryptedSecretKey, iv, tag, keyEncoding }: IWebhook,
payload: Record<string, unknown>
) => {
const headers: Record<string, string> = {};
payload["timestamp"] = Date.now();
if (encryptedSecretKey) {
const encryptionKey = await getEncryptionKey();
const rootEncryptionKey = await getRootEncryptionKey();
const secretKey = client.decryptSymmetric(encryptedSecretKey, rootEncryptionKey, iv, tag);
const webhookSign = crypto
.createHmac("sha256", secretKey)
.update(JSON.stringify(payload))
.digest("hex");
headers["x-infisical-signature"] = `t=${payload["timestamp"]};${webhookSign}`;
let secretKey;
if (rootEncryptionKey && keyEncoding === ENCODING_SCHEME_BASE64) {
// case: encoding scheme is base64
secretKey = client.decryptSymmetric(encryptedSecretKey, rootEncryptionKey, iv, tag);
} else if (encryptionKey && keyEncoding === ENCODING_SCHEME_UTF8) {
// case: encoding scheme is utf8
secretKey = decryptSymmetric128BitHexKeyUTF8({
ciphertext: encryptedSecretKey,
iv: iv,
tag: tag,
key: encryptionKey
});
}
if (secretKey) {
const webhookSign = crypto
.createHmac("sha256", secretKey)
.update(JSON.stringify(payload))
.digest("hex");
headers["x-infisical-signature"] = `t=${payload["timestamp"]};${webhookSign}`;
}
}
const req = await axios.post(url, payload, { headers });
return req;
};
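The signature produced above is t=<timestamp>;<hmac>, where the HMAC is SHA-256 over the JSON payload (timestamp included) keyed with the webhook's decrypted secret. A hedged sketch of verification on the receiving service, assuming the receiver shares the same secret and re-serializes the payload exactly as it was signed:

import crypto from "crypto";

// Sketch: verify the x-infisical-signature header on the receiving end.
export const verifyInfisicalWebhook = (
  payload: Record<string, unknown>, // parsed JSON body, including the injected timestamp
  signatureHeader: string, // value of x-infisical-signature
  secretKey: string, // shared webhook secret
  toleranceMs = 5 * 60 * 1000 // reject stale deliveries (illustrative window)
): boolean => {
  const [timestampPart, signature] = signatureHeader.split(";");
  if (!signature) return false;
  const timestamp = Number(timestampPart.replace("t=", ""));
  if (!Number.isFinite(timestamp) || Math.abs(Date.now() - timestamp) > toleranceMs) return false;

  const expected = crypto
    .createHmac("sha256", secretKey)
    .update(JSON.stringify(payload))
    .digest("hex");

  // Constant-time comparison; lengths are checked first because
  // timingSafeEqual throws on buffers of different length.
  return (
    expected.length === signature.length &&
    crypto.timingSafeEqual(Buffer.from(expected), Buffer.from(signature))
  );
};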

View File

@ -1,10 +1,10 @@
import mongoose from "mongoose";
import { createTerminus } from "@godaddy/terminus";
import { getLogger } from "../utils/logger";
import { logger } from "../utils/logging";
export const setUpHealthEndpoint = <T>(server: T) => {
const onSignal = async () => {
(await getLogger("backend-main")).info("Server is starting clean-up");
logger.info("Server is starting clean-up");
return Promise.all([
new Promise((resolve) => {
if (mongoose.connection && mongoose.connection.readyState == 1) {

View File

@ -16,7 +16,7 @@ import {
getSmtpSecure,
getSmtpUsername,
} from "../config";
import { getLogger } from "../utils/logger";
import { logger } from "../utils/logging";
export const initSmtp = async () => {
const mailOpts: SMTPConnection.Options = {
@ -84,15 +84,14 @@ export const initSmtp = async () => {
.then(async () => {
Sentry.setUser(null);
Sentry.captureMessage("SMTP - Successfully connected");
(await getLogger("backend-main")).info(
"SMTP - Successfully connected"
);
logger.info("SMTP - Successfully connected");
})
.catch(async (err) => {
Sentry.setUser(null);
Sentry.captureException(
`SMTP - Failed to connect to ${await getSmtpHost()}:${await getSmtpPort()} \n\t${err}`
);
logger.error(err, `SMTP - Failed to connect to ${await getSmtpHost()}:${await getSmtpPort()}`);
});
return transporter;

View File

@ -13,6 +13,7 @@ import {
} from "../models";
import { createToken } from "../helpers/auth";
import {
getAuthSecret,
getClientIdGitHubLogin,
getClientIdGitLabLogin,
getClientIdGoogleLogin,
@ -20,13 +21,12 @@ import {
getClientSecretGitLabLogin,
getClientSecretGoogleLogin,
getJwtProviderAuthLifetime,
getJwtProviderAuthSecret,
getSiteURL,
getUrlGitLabLogin
} from "../config";
import { getSSOConfigHelper } from "../ee/helpers/organizations";
import { InternalServerError, OrganizationNotFoundError } from "./errors";
import { ACCEPTED, INTEGRATION_GITHUB_API_URL, INVITED, MEMBER } from "../variables";
import { ACCEPTED, AuthTokenType, INTEGRATION_GITHUB_API_URL, INVITED, MEMBER } from "../variables";
import { standardRequest } from "../config/request";
// eslint-disable-next-line @typescript-eslint/no-var-requires
@ -131,6 +131,7 @@ const initializePassport = async () => {
const isUserCompleted = !!user.publicKey;
const providerAuthToken = createToken({
payload: {
authTokenType: AuthTokenType.PROVIDER_TOKEN,
userId: user._id.toString(),
email: user.email,
firstName: user.firstName,
@ -143,7 +144,7 @@ const initializePassport = async () => {
} : {})
},
expiresIn: await getJwtProviderAuthLifetime(),
secret: await getJwtProviderAuthSecret(),
secret: await getAuthSecret(),
});
req.isUserCompleted = isUserCompleted;
@ -204,6 +205,7 @@ const initializePassport = async () => {
const isUserCompleted = !!user.publicKey;
const providerAuthToken = createToken({
payload: {
authTokenType: AuthTokenType.PROVIDER_TOKEN,
userId: user._id.toString(),
email: user.email,
firstName: user.firstName,
@ -216,7 +218,7 @@ const initializePassport = async () => {
} : {})
},
expiresIn: await getJwtProviderAuthLifetime(),
secret: await getJwtProviderAuthSecret(),
secret: await getAuthSecret(),
});
req.isUserCompleted = isUserCompleted;
@ -258,6 +260,7 @@ const initializePassport = async () => {
const isUserCompleted = !!user.publicKey;
const providerAuthToken = createToken({
payload: {
authTokenType: AuthTokenType.PROVIDER_TOKEN,
userId: user._id.toString(),
email: user.email,
firstName: user.firstName,
@ -270,7 +273,7 @@ const initializePassport = async () => {
} : {})
},
expiresIn: await getJwtProviderAuthLifetime(),
secret: await getJwtProviderAuthSecret(),
secret: await getAuthSecret(),
});
req.isUserCompleted = isUserCompleted;
@ -401,6 +404,7 @@ const initializePassport = async () => {
const isUserCompleted = !!user.publicKey;
const providerAuthToken = createToken({
payload: {
authTokenType: AuthTokenType.PROVIDER_TOKEN,
userId: user._id.toString(),
email: user.email,
firstName,
@ -413,7 +417,7 @@ const initializePassport = async () => {
} : {})
},
expiresIn: await getJwtProviderAuthLifetime(),
secret: await getJwtProviderAuthSecret(),
secret: await getAuthSecret(),
});
req.isUserCompleted = isUserCompleted;

View File

@ -29,7 +29,7 @@ export const UnauthorizedRequestError = (error?: Partial<RequestErrorContext>) =
});
export const ForbiddenRequestError = (error?: Partial<RequestErrorContext>) => new RequestError({
logLevel: error?.logLevel ?? LogLevel.INFO,
logLevel: error?.logLevel ?? LogLevel.WARN,
statusCode: error?.statusCode ?? 403,
type: error?.type ?? "forbidden",
message: error?.message ?? "You are not allowed to access this resource",

View File

@ -1,67 +0,0 @@
/* eslint-disable no-console */
import { createLogger, format, transports } from "winston";
import LokiTransport from "winston-loki";
import { getLokiHost, getNodeEnv } from "../config";
const { combine, colorize, label, printf, splat, timestamp } = format;
const logFormat = (prefix: string) => combine(
timestamp(),
splat(),
label({ label: prefix }),
printf((info) => `${info.timestamp} ${info.label} ${info.level}: ${info.message}`)
);
const createLoggerWithLabel = async (level: string, label: string) => {
const _level = level.toLowerCase() || "info"
//* Always add Console output to transports
const _transports: any[] = [
new transports.Console({
format: combine(
colorize(),
logFormat(label),
// format.json()
),
}),
]
//* Add LokiTransport if it's enabled
if((await getLokiHost()) !== undefined){
_transports.push(
new LokiTransport({
host: await getLokiHost(),
handleExceptions: true,
handleRejections: true,
batching: true,
level: _level,
timeout: 30000,
format: format.combine(
format.json()
),
labels: {
app: process.env.npm_package_name,
version: process.env.npm_package_version,
environment: await getNodeEnv(),
},
onConnectionError: (err: Error)=> console.error("Connection error while connecting to Loki Server.\n", err),
})
)
}
return createLogger({
level: _level,
transports: _transports,
format: format.combine(
logFormat(label),
format.metadata({ fillExcept: ["message", "level", "timestamp", "label"] })
),
});
}
export const getLogger = async (loggerName: "backend-main" | "database") => {
const logger = {
"backend-main": await createLoggerWithLabel("info", "[IFSC:backend-main]"),
"database": await createLoggerWithLabel("info", "[IFSC:database]"),
}
return logger[loggerName]
}

View File

@ -0,0 +1 @@
export { logger } from "./logger";

View File

@ -0,0 +1,15 @@
import pino from "pino";
export const logger = pino({
level: process.env.PINO_LOG_LEVEL || "trace",
timestamp: pino.stdTimeFunctions.isoTime,
formatters: {
bindings: (bindings) => {
return {
pid: bindings.pid,
hostname: bindings.hostname
// node_version: process.version
};
},
}
});
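This single pino instance replaces the per-module winston loggers: callers import logger and call level methods directly, passing structured fields as the first argument. A small usage sketch (the field names are illustrative):

import { logger } from "./utils/logging";

// pino convention: merge-object first, message second.
logger.info({ workspaceId: "abc123", integration: "checkly" }, "Secrets synced");
logger.warn("Redis URL not set, skipping Redis initialization.");
logger.error(new Error("boom"), "SMTP - Failed to connect"); // Error is serialized under the err key

// Child loggers stamp their bindings onto every line they emit.
const rotationLogger = logger.child({ queue: "secret-rotation" });
rotationLogger.debug("processing job");

Because pino writes newline-delimited JSON, human-readable output in development is usually produced by piping the process output through a prettifier such as pino-pretty rather than formatting in-process.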

View File

@ -2,34 +2,30 @@ import { Request } from "express"
import { getVerboseErrorOutput } from "../config";
export enum LogLevel {
DEBUG = 100,
INFO = 200,
NOTICE = 250,
WARNING = 300,
ERROR = 400,
CRITICAL = 500,
ALERT = 550,
EMERGENCY = 600,
TRACE = 10,
DEBUG = 20,
INFO = 30,
WARN = 40,
ERROR = 50,
FATAL = 60
}
export const mapToWinstonLogLevel = (customLogLevel: LogLevel): string => {
type PinoLogLevel = "trace" | "debug" | "info" | "warn" | "error" | "fatal";
export const mapToPinoLogLevel = (customLogLevel: LogLevel): PinoLogLevel => {
switch (customLogLevel) {
case LogLevel.TRACE:
return "trace";
case LogLevel.DEBUG:
return "debug";
case LogLevel.INFO:
return "info";
case LogLevel.NOTICE:
return "notice";
case LogLevel.WARNING:
case LogLevel.WARN:
return "warn";
case LogLevel.ERROR:
return "error";
case LogLevel.CRITICAL:
return "crit";
case LogLevel.ALERT:
return "alert";
case LogLevel.EMERGENCY:
return "emerg";
case LogLevel.FATAL:
return "fatal";
}
}
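The numeric values above match pino's own level numbers (trace 10 through fatal 60), so mapToPinoLogLevel is a thin translation from RequestError's internal LogLevel to a pino method name, which the error handler earlier in this diff uses as a computed method lookup. A reduced illustration:

import { logger } from "../utils/logging";
import { ForbiddenRequestError } from "../utils/errors";
import { mapToPinoLogLevel } from "../utils/requestError";

// ForbiddenRequestError defaults to LogLevel.WARN (see the errors.ts hunk above),
// so the mapped method name is "warn" and this is effectively logger.warn(error).
const error = ForbiddenRequestError();
logger[mapToPinoLogLevel(error.level)](error);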
@ -42,10 +38,10 @@ export type RequestErrorContext = {
stack?: string|undefined
}
export default class RequestError extends Error{
export default class RequestError extends Error {
private _logLevel: LogLevel
private _logName: string
private _logName: string;
statusCode: number
type: string
context: Record<string, unknown>
@ -55,9 +51,10 @@ export default class RequestError extends Error{
constructor(
{logLevel, statusCode, type, message, context, stack} : RequestErrorContext
){
super(message)
this._logLevel = logLevel || LogLevel.INFO
this._logName = LogLevel[this._logLevel]
this._logName = LogLevel[this._logLevel];
this.statusCode = statusCode
this.type = type
this.context = context || {}
@ -83,8 +80,12 @@ export default class RequestError extends Error{
})
}
get level(){ return this._logLevel }
get levelName(){ return this._logName }
get level(){
return this._logLevel
}
get levelName(){
return this._logName
}
withTags(...tags: string[]|number[]){
this.context["tags"] = Object.assign(tags, this.context["tags"])

View File

@ -1,4 +1,3 @@
/* eslint-disable no-console */
import crypto from "crypto";
import { Types } from "mongoose";
import { encryptSymmetric128BitHexKeyUTF8 } from "../crypto";
@ -47,6 +46,7 @@ import {
ProjectPermissionSub,
memberProjectPermissions
} from "../../ee/services/ProjectRoleService";
import { logger } from "../logging";
/**
* Backfill secrets to ensure that they're all versioned and have
@ -88,7 +88,7 @@ export const backfillSecretVersions = async () => {
)
});
}
console.log("Migration: Secret version migration v1 complete");
logger.info("Migration: Secret version migration v1 complete");
};
/**
@ -518,7 +518,7 @@ export const backfillSecretFolders = async () => {
.limit(50);
}
console.log("Migration: Folder migration v1 complete");
logger.info("Migration: Folder migration v1 complete");
};
export const backfillServiceToken = async () => {
@ -534,7 +534,7 @@ export const backfillServiceToken = async () => {
}
}
);
console.log("Migration: Service token migration v1 complete");
logger.info("Migration: Service token migration v1 complete");
};
export const backfillIntegration = async () => {
@ -550,7 +550,7 @@ export const backfillIntegration = async () => {
}
}
);
console.log("Migration: Integration migration v1 complete");
logger.info("Migration: Integration migration v1 complete");
};
export const backfillServiceTokenMultiScope = async () => {
@ -575,7 +575,7 @@ export const backfillServiceTokenMultiScope = async () => {
}
}
console.log("Migration: Service token migration v2 complete");
logger.info("Migration: Service token migration v2 complete");
};
/**
@ -650,7 +650,7 @@ export const backfillTrustedIps = async () => {
});
await TrustedIP.bulkWrite(operations);
console.log("Backfill: Trusted IPs complete");
logger.info("Backfill: Trusted IPs complete");
}
};
@ -698,7 +698,7 @@ export const backfillPermission = async () => {
if (lock) {
try {
console.info("Lock acquired for script [backfillPermission]");
logger.info("Lock acquired for script [backfillPermission]");
const memberships = await Membership.find({
deniedPermissions: {
@ -801,7 +801,7 @@ export const backfillPermission = async () => {
}
}
console.info("Backfill: Finished converting old denied permission in workspace to viewers");
logger.info("Backfill: Finished converting old denied permission in workspace to viewers");
await MembershipOrg.updateMany(
{
@ -814,14 +814,14 @@ export const backfillPermission = async () => {
}
);
console.info("Backfill: Finished converting owner role to member");
logger.info("Backfill: Finished converting owner role to member");
} catch (error) {
console.error("An error occurred when running script [backfillPermission]:", error);
logger.error(error, "An error occurred when running script [backfillPermission]");
}
} else {
console.info("Could not acquire lock for script [backfillPermission], skipping");
logger.info("Could not acquire lock for script [backfillPermission], skipping");
}
};
@ -837,5 +837,5 @@ export const migrateRoleFromOwnerToAdmin = async () => {
}
);
console.info("Backfill: Finished converting owner role to member");
logger.info("Backfill: Finished converting owner role to member");
}

View File

@ -11,7 +11,6 @@ import {
backfillBots,
backfillEncryptionMetadata,
backfillIntegration,
backfillPermission,
backfillSecretBlindIndexData,
backfillSecretFolders,
backfillSecretVersions,
@ -28,6 +27,7 @@ import {
} from "./reencryptData";
import { getMongoURL, getNodeEnv, getRedisUrl, getSentryDSN } from "../../config";
import { initializePassport } from "../auth";
import { logger } from "../logging";
/**
* Prepare Infisical upon startup. This includes tasks like:
@ -41,7 +41,7 @@ import { initializePassport } from "../auth";
*/
export const setup = async () => {
if ((await getRedisUrl()) === undefined || (await getRedisUrl()) === "") {
console.error(
logger.error(
"WARNING: Redis is not yet configured. Infisical may not function as expected without it."
);
}

View File

@ -0,0 +1,22 @@
import { z } from "zod";
export const CreateAPIKeyV3 = z.object({
body: z.object({
name: z.string().trim()
})
});
export const UpdateAPIKeyV3 = z.object({
params: z.object({
apiKeyDataId: z.string().trim()
}),
body: z.object({
name: z.string().trim()
})
});
export const DeleteAPIKeyV3 = z.object({
params: z.object({
apiKeyDataId: z.string().trim()
})
});
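Each schema above names the slice of the Express request it constrains (params, body), so a controller can parse req against it and get typed values back. A hedged sketch of that usage, not the project's actual validation helper; the import path is assumed from the validation index below:

import { Request } from "express";
import { UpdateAPIKeyV3 } from "../validation/apiKeyDataV3"; // path assumed

// Sketch: zod strips unknown request properties and throws on invalid input.
export const parseUpdateAPIKeyV3 = (req: Request) => {
  const {
    params: { apiKeyDataId },
    body: { name }
  } = UpdateAPIKeyV3.parse(req);
  return { apiKeyDataId, name }; // both inferred as string
};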

View File

@ -0,0 +1,21 @@
import * as z from "zod";
export const ZGetTenantEnv = z.object({
data: z.object({
getTenantEnv: z.object({
hash: z.string(),
envVars: z.object({
environment: z.record(z.any()).optional()
})
})
})
});
export const ZUpdateTenantEnv = z.object({
data: z.object({
updateTenantEnv: z.object({
hash: z.string(),
envVars: z.record(z.any())
})
})
});

View File

@ -10,3 +10,4 @@ export * from "./secrets";
export * from "./serviceAccount";
export * from "./serviceTokenData";
export * from "./serviceTokenDataV3";
export * from "./apiKeyDataV3";

View File

@ -117,6 +117,15 @@ export const GetIntegrationAuthVercelBranchesV1 = z.object({
})
});
export const GetIntegrationAuthChecklyGroupsV1 = z.object({
params: z.object({
integrationAuthId: z.string().trim()
}),
query: z.object({
accountId: z.string().trim()
})
});
export const GetIntegrationAuthQoveryOrgsV1 = z.object({
params: z.object({
integrationAuthId: z.string().trim()

View File

@ -228,7 +228,6 @@ export const GetSecretsRawV3 = z.object({
workspaceId: z.string().trim().optional(),
environment: z.string().trim().optional(),
secretPath: z.string().trim().default("/"),
folderId: z.string().trim().optional(),
include_imports: z
.enum(["true", "false"])
.default("false")
@ -302,7 +301,6 @@ export const GetSecretsV3 = z.object({
workspaceId: z.string().trim(),
environment: z.string().trim(),
secretPath: z.string().trim().default("/"),
folderId: z.string().trim().optional(),
include_imports: z
.enum(["true", "false"])
.default("false")

View File

@ -1,3 +1,12 @@
export enum AuthTokenType {
ACCESS_TOKEN = "accessToken",
REFRESH_TOKEN = "refreshToken",
SIGNUP_TOKEN = "signupToken",
MFA_TOKEN = "mfaToken",
PROVIDER_TOKEN = "providerToken",
API_KEY = "apiKey"
}
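Elsewhere in this diff the per-purpose JWT secrets (signup, MFA, provider auth) collapse into a single getAuthSecret, and a token's purpose moves into an authTokenType claim that each middleware checks after verification. A reduced sketch of that pattern (standard jsonwebtoken API; payload fields mirror the middleware hunks above, and the token lifetime is illustrative):

import jwt from "jsonwebtoken";
import { getAuthSecret } from "../config";
import { UnauthorizedRequestError } from "../utils/errors";
import { AuthTokenType } from "../variables";

// Issue an MFA-scoped token: one shared secret signs every token type,
// and authTokenType is what scopes the token to a single flow.
export const issueMfaToken = async (userId: string) =>
  jwt.sign({ authTokenType: AuthTokenType.MFA_TOKEN, userId }, await getAuthSecret(), {
    expiresIn: "5m" // illustrative lifetime
  });

// Verify and reject tokens minted for a different flow (e.g. a signup token).
export const verifyMfaToken = async (token: string) => {
  const decoded = jwt.verify(token, await getAuthSecret()) as jwt.JwtPayload;
  if (decoded.authTokenType !== AuthTokenType.MFA_TOKEN) throw UnauthorizedRequestError();
  return decoded.userId as string;
};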
export enum AuthMode {
JWT = "jwt",
SERVICE_TOKEN = "serviceToken",

View File

@ -1,12 +1,12 @@
import {
getClientIdAzure,
getClientIdBitBucket,
getClientIdGCPSecretManager,
getClientIdGitHub,
getClientIdGitLab,
getClientIdHeroku,
getClientIdNetlify,
getClientSlugVercel
getClientIdAzure,
getClientIdBitBucket,
getClientIdGCPSecretManager,
getClientIdGitHub,
getClientIdGitLab,
getClientIdHeroku,
getClientIdNetlify,
getClientSlugVercel
} from "../config";
// integrations
@ -22,7 +22,7 @@ export const INTEGRATION_GITLAB = "gitlab";
export const INTEGRATION_RENDER = "render";
export const INTEGRATION_RAILWAY = "railway";
export const INTEGRATION_FLYIO = "flyio";
export const INTEGRATION_LARAVELFORGE = "laravel-forge"
export const INTEGRATION_LARAVELFORGE = "laravel-forge";
export const INTEGRATION_CIRCLECI = "circleci";
export const INTEGRATION_TRAVISCI = "travisci";
export const INTEGRATION_TEAMCITY = "teamcity";
@ -38,32 +38,34 @@ export const INTEGRATION_WINDMILL = "windmill";
export const INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM = "digital-ocean-app-platform";
export const INTEGRATION_CLOUD_66 = "cloud-66";
export const INTEGRATION_NORTHFLANK = "northflank";
export const INTEGRATION_HASURA_CLOUD = "hasura-cloud";
export const INTEGRATION_SET = new Set([
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_TEAMCITY,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
INTEGRATION_QOVERY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_TEAMCITY,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
INTEGRATION_QOVERY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK,
INTEGRATION_HASURA_CLOUD
]);
// integration types
@ -71,15 +73,14 @@ export const INTEGRATION_OAUTH2 = "oauth2";
// integration oauth endpoints
export const INTEGRATION_GCP_TOKEN_URL = "https://oauth2.googleapis.com/token";
export const INTEGRATION_AZURE_TOKEN_URL = "https://login.microsoftonline.com/common/oauth2/v2.0/token";
export const INTEGRATION_AZURE_TOKEN_URL =
"https://login.microsoftonline.com/common/oauth2/v2.0/token";
export const INTEGRATION_HEROKU_TOKEN_URL = "https://id.heroku.com/oauth/token";
export const INTEGRATION_VERCEL_TOKEN_URL =
"https://api.vercel.com/v2/oauth/access_token";
export const INTEGRATION_VERCEL_TOKEN_URL = "https://api.vercel.com/v2/oauth/access_token";
export const INTEGRATION_NETLIFY_TOKEN_URL = "https://api.netlify.com/oauth/token";
export const INTEGRATION_GITHUB_TOKEN_URL =
"https://github.com/login/oauth/access_token";
export const INTEGRATION_GITHUB_TOKEN_URL = "https://github.com/login/oauth/access_token";
export const INTEGRATION_GITLAB_TOKEN_URL = "https://gitlab.com/oauth/token";
export const INTEGRATION_BITBUCKET_TOKEN_URL = "https://bitbucket.org/site/oauth2/access_token"
export const INTEGRATION_BITBUCKET_TOKEN_URL = "https://bitbucket.org/site/oauth2/access_token";
// integration apps endpoints
export const INTEGRATION_GCP_API_URL = "https://cloudresourcemanager.googleapis.com";
@ -106,268 +107,279 @@ export const INTEGRATION_WINDMILL_API_URL = "https://app.windmill.dev/api";
export const INTEGRATION_DIGITAL_OCEAN_API_URL = "https://api.digitalocean.com";
export const INTEGRATION_CLOUD_66_API_URL = "https://app.cloud66.com/api";
export const INTEGRATION_NORTHFLANK_API_URL = "https://api.northflank.com";
export const INTEGRATION_HASURA_CLOUD_API_URL = "https://data.pro.hasura.io/v1/graphql";
export const INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME = "secretmanager.googleapis.com"
export const INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME = "secretmanager.googleapis.com";
export const INTEGRATION_GCP_SECRET_MANAGER_URL = `https://${INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME}`;
export const INTEGRATION_GCP_SERVICE_USAGE_URL = "https://serviceusage.googleapis.com";
export const INTEGRATION_GCP_CLOUD_PLATFORM_SCOPE = "https://www.googleapis.com/auth/cloud-platform";
export const INTEGRATION_GCP_CLOUD_PLATFORM_SCOPE =
"https://www.googleapis.com/auth/cloud-platform";
export const getIntegrationOptions = async () => {
const INTEGRATION_OPTIONS = [
{
name: "Heroku",
slug: "heroku",
image: "Heroku.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdHeroku(),
docsLink: "",
},
{
name: "Vercel",
slug: "vercel",
image: "Vercel.png",
isAvailable: true,
type: "oauth",
clientId: "",
clientSlug: await getClientSlugVercel(),
docsLink: "",
},
{
name: "Netlify",
slug: "netlify",
image: "Netlify.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdNetlify(),
docsLink: "",
},
{
name: "GitHub",
slug: "github",
image: "GitHub.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdGitHub(),
docsLink: "",
},
{
name: "Render",
slug: "render",
image: "Render.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Railway",
slug: "railway",
image: "Railway.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Fly.io",
slug: "flyio",
image: "Flyio.svg",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "AWS Parameter Store",
slug: "aws-parameter-store",
image: "Amazon Web Services.png",
isAvailable: true,
type: "custom",
clientId: "",
docsLink: "",
},
{
name: "Laravel Forge",
slug: "laravel-forge",
image: "Laravel Forge.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "AWS Secrets Manager",
slug: "aws-secret-manager",
image: "Amazon Web Services.png",
isAvailable: true,
type: "custom",
clientId: "",
docsLink: "",
},
{
name: "Azure Key Vault",
slug: "azure-key-vault",
image: "Microsoft Azure.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdAzure(),
docsLink: "",
},
{
name: "Circle CI",
slug: "circleci",
image: "Circle CI.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "GitLab",
slug: "gitlab",
image: "GitLab.png",
isAvailable: true,
type: "custom",
clientId: await getClientIdGitLab(),
docsLink: "",
},
{
name: "Terraform Cloud",
slug: "terraform-cloud",
image: "Terraform Cloud.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Travis CI",
slug: "travisci",
image: "Travis CI.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "TeamCity",
slug: "teamcity",
image: "TeamCity.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Supabase",
slug: "supabase",
image: "Supabase.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Checkly",
slug: "checkly",
image: "Checkly.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Qovery",
slug: "qovery",
image: "Qovery.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "HashiCorp Vault",
slug: "hashicorp-vault",
image: "Vault.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "GCP Secret Manager",
slug: "gcp-secret-manager",
image: "Google Cloud Platform.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdGCPSecretManager(),
docsLink: ""
},
{
name: "Cloudflare Pages",
slug: "cloudflare-pages",
image: "Cloudflare.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "BitBucket",
slug: "bitbucket",
image: "BitBucket.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdBitBucket(),
docsLink: ""
},
{
name: "Codefresh",
slug: "codefresh",
image: "Codefresh.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Windmill",
slug: "windmill",
image: "Windmill.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Digital Ocean App Platform",
slug: "digital-ocean-app-platform",
image: "Digital Ocean.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Cloud 66",
slug: "cloud-66",
image: "Cloud 66.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Northflank",
slug: "northflank",
image: "Northflank.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
]
return INTEGRATION_OPTIONS;
}
const INTEGRATION_OPTIONS = [
{
name: "Heroku",
slug: "heroku",
image: "Heroku.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdHeroku(),
docsLink: ""
},
{
name: "Vercel",
slug: "vercel",
image: "Vercel.png",
isAvailable: true,
type: "oauth",
clientId: "",
clientSlug: await getClientSlugVercel(),
docsLink: ""
},
{
name: "Netlify",
slug: "netlify",
image: "Netlify.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdNetlify(),
docsLink: ""
},
{
name: "GitHub",
slug: "github",
image: "GitHub.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdGitHub(),
docsLink: ""
},
{
name: "Render",
slug: "render",
image: "Render.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Railway",
slug: "railway",
image: "Railway.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Fly.io",
slug: "flyio",
image: "Flyio.svg",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "AWS Parameter Store",
slug: "aws-parameter-store",
image: "Amazon Web Services.png",
isAvailable: true,
type: "custom",
clientId: "",
docsLink: ""
},
{
name: "Laravel Forge",
slug: "laravel-forge",
image: "Laravel Forge.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "AWS Secrets Manager",
slug: "aws-secret-manager",
image: "Amazon Web Services.png",
isAvailable: true,
type: "custom",
clientId: "",
docsLink: ""
},
{
name: "Azure Key Vault",
slug: "azure-key-vault",
image: "Microsoft Azure.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdAzure(),
docsLink: ""
},
{
name: "Circle CI",
slug: "circleci",
image: "Circle CI.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "GitLab",
slug: "gitlab",
image: "GitLab.png",
isAvailable: true,
type: "custom",
clientId: await getClientIdGitLab(),
docsLink: ""
},
{
name: "Terraform Cloud",
slug: "terraform-cloud",
image: "Terraform Cloud.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Travis CI",
slug: "travisci",
image: "Travis CI.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "TeamCity",
slug: "teamcity",
image: "TeamCity.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Supabase",
slug: "supabase",
image: "Supabase.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Checkly",
slug: "checkly",
image: "Checkly.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Qovery",
slug: "qovery",
image: "Qovery.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "HashiCorp Vault",
slug: "hashicorp-vault",
image: "Vault.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "GCP Secret Manager",
slug: "gcp-secret-manager",
image: "Google Cloud Platform.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdGCPSecretManager(),
docsLink: ""
},
{
name: "Cloudflare Pages",
slug: "cloudflare-pages",
image: "Cloudflare.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "BitBucket",
slug: "bitbucket",
image: "BitBucket.png",
isAvailable: true,
type: "oauth",
clientId: await getClientIdBitBucket(),
docsLink: ""
},
{
name: "Codefresh",
slug: "codefresh",
image: "Codefresh.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Windmill",
slug: "windmill",
image: "Windmill.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Digital Ocean App Platform",
slug: "digital-ocean-app-platform",
image: "Digital Ocean.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Cloud 66",
slug: "cloud-66",
image: "Cloud 66.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Northflank",
slug: "northflank",
image: "Northflank.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "Hasura Cloud",
slug: "hasura-cloud",
image: "Hasura.svg",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
}
];
return INTEGRATION_OPTIONS;
};

View File

@@ -4,6 +4,7 @@ Copyright (c) 2023 Infisical Inc.
package cmd
import (
"crypto/sha256"
"encoding/base64"
"fmt"
"regexp"
@@ -11,8 +12,6 @@ import (
"strings"
"unicode"
"crypto/sha256"
"github.com/Infisical/infisical-merge/packages/api"
"github.com/Infisical/infisical-merge/packages/crypto"
"github.com/Infisical/infisical-merge/packages/models"
@@ -441,6 +440,11 @@ func generateExampleEnv(cmd *cobra.Command, args []string) {
}
}
secretsPath, err := cmd.Flags().GetString("path")
if err != nil {
util.HandleError(err, "Unable to parse flag")
}
infisicalToken, err := cmd.Flags().GetString("token")
if err != nil {
util.HandleError(err, "Unable to parse flag")
@@ -451,7 +455,7 @@ func generateExampleEnv(cmd *cobra.Command, args []string) {
util.HandleError(err, "Unable to parse flag")
}
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs})
secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs, SecretsPath: secretsPath})
if err != nil {
util.HandleError(err, "To fetch all secrets")
}
@@ -650,8 +654,8 @@ func getSecretsByKeys(secrets []models.SingleEnvironmentVariable) map[string]mod
}
func init() {
secretsGenerateExampleEnvCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")
secretsGenerateExampleEnvCmd.Flags().String("path", "/", "Fetch secrets from within a folder path")
secretsCmd.AddCommand(secretsGenerateExampleEnvCmd)
secretsGetCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")

View File

@@ -1,47 +0,0 @@
/// <reference types="cypress" />
describe('organization Overview', () => {
beforeEach(() => {
cy.login(`test@localhost.local`, `testInfisical1`)
})
const projectName = "projectY"
it('can`t create projects with empty names', () => {
cy.get('.button').click()
cy.get('input[placeholder="Type your project name"]').type('abc').clear()
cy.intercept('*').as('anyRequest');
cy.get('@anyRequest').should('not.exist');
})
it('can delete a newly-created project', () => {
// Create a project
cy.get('.button').click()
cy.get('input[placeholder="Type your project name"]').type(`${projectName}`)
cy.contains('button', 'Create Project').click()
cy.url().should('include', '/project')
// Delete a project
cy.get(`[href^="/project/"][href$="/settings"] > a > .group`).click()
cy.contains('button', `Delete ${projectName}`).click()
cy.contains('button', 'Delete Project').should('have.attr', 'disabled')
cy.get('input[placeholder="Type to delete..."]').type('confirm')
cy.contains('button', 'Delete Project').should('not.have.attr', 'disabled')
cy.url().then((currentUrl) => {
let projectId = currentUrl.split("/")[4]
cy.intercept('DELETE', `/api/v1/workspace/${projectId}`).as('deleteProject');
cy.contains('button', 'Delete Project').click();
cy.get('@deleteProject').should('have.property', 'response').and('have.property', 'statusCode', 200);
})
})
it('can display no projects', () => {
cy.intercept('/api/v1/workspace', {
body: {
"workspaces": []
},
})
cy.get('.border-mineshaft-700 > :nth-child(2)').should('have.text', 'You are not part of any projects in this organization yet. When you are, they will appear here.')
})
})

View File

@@ -1,24 +0,0 @@
/// <reference types="cypress" />
describe('Organization Settings', () => {
let orgId;
beforeEach(() => {
cy.login(`test@localhost.local`, `testInfisical1`)
cy.url().then((currentUrl) => {
orgId = currentUrl.split("/")[4]
cy.visit(`org/${orgId}/settings`)
})
})
it('can rename org', () => {
cy.get('input[placeholder="Acme Corp"]').clear().type('ABC')
cy.intercept('PATCH', `/api/v1/organization/${orgId}/name`).as('renameOrg');
cy.get('form.p-4 > .button').click()
cy.get('@renameOrg').should('have.property', 'response').and('have.property', 'statusCode', 200);
cy.get('.pl-3').should("have.text", "ABC ")
})
})

View File

@@ -1,84 +0,0 @@
/// <reference types="cypress" />
describe('Project Overview', () => {
const projectName = "projectY"
let projectId;
let isFirstTest = true;
before(() => {
cy.login(`test@localhost.local`, `testInfisical1`)
// Create a project
cy.get('.button').click()
cy.get('input[placeholder="Type your project name"]').type(`${projectName}`)
cy.contains('button', 'Create Project').click()
cy.url().should('include', '/project').then((currentUrl) => {
projectId = currentUrl.split("/")[4]
})
})
beforeEach(() => {
if (isFirstTest) {
isFirstTest = false;
return; // Skip the rest of the beforeEach for the first test
}
cy.login(`test@localhost.local`, `testInfisical1`)
cy.visit(`/project/${projectId}/secrets/overview`)
})
it('can create secrets', () => {
cy.contains('button', 'Go to Development').click()
cy.contains('button', 'Add a new secret').click()
cy.get('input[placeholder="Type your secret name"]').type('SECRET_A')
cy.contains('button', 'Create Secret').click()
cy.get('.w-80 > .inline-flex > .input').should('have.value', 'SECRET_A')
cy.get(':nth-child(6) > .button > .w-min').should('have.text', '1 Commit')
})
it('can update secrets', () => {
cy.get(':nth-child(2) > .flex > .button').click()
cy.get('.overflow-auto > .relative > .absolute').type('VALUE_A')
cy.get('.button.text-primary > .svg-inline--fa').click()
cy.get(':nth-child(6) > .button > .w-min').should('have.text', '2 Commits')
})
it('can`t create duplicate-name secrets', () => {
cy.get(':nth-child(2) > .flex > .button').click()
cy.contains('button', 'Add Secret').click()
cy.get('input[placeholder="Type your secret name"]').type('SECRET_A')
cy.intercept('POST', `/api/v3/secrets/SECRET_A`).as('createSecret');
cy.contains('button', 'Create Secret').click()
cy.get('@createSecret').should('have.property', 'response').and('have.property', 'statusCode', 400);
})
it('can add another secret', () => {
cy.get(':nth-child(2) > .flex > .button').click()
cy.contains('button', 'Add Secret').click()
cy.get('input[placeholder="Type your secret name"]').type('SECRET_B')
cy.contains('button', 'Create Secret').click()
cy.get(':nth-child(6) > .button > .w-min').should('have.text', '3 Commits')
})
it('can delete a secret', () => {
cy.get(':nth-child(2) > .flex > .button').click()
// cy.get(':nth-child(3) > .shadow-none').trigger('mouseover')
cy.get(':nth-child(3) > .shadow-none > .group > .h-10 > .border-red').click()
cy.contains('button', 'Delete Secret').should('have.attr', 'disabled')
cy.get('input[placeholder="Type to delete..."]').type('SECRET_B')
cy.intercept('DELETE', `/api/v3/secrets/SECRET_B`).as('deleteSecret');
cy.contains('button', 'Delete Secret').should('not.have.attr', 'disabled')
cy.contains('button', 'Delete Secret').click();
cy.get('@deleteSecret').should('have.property', 'response').and('have.property', 'statusCode', 200);
})
it('can add a comment', () => {
return;
cy.get(':nth-child(2) > .flex > .button').click()
// for some reason this hover does not want to work
cy.get('.overflow-auto').trigger('mouseover').then(() => {
cy.get('.shadow-none > .group > .pl-4 > .h-8 > button[aria-label="add-comment"]').should('be.visible').click()
});
})
})

View File

@@ -1,5 +0,0 @@
{
"name": "Using fixtures to represent data",
"email": "hello@cypress.io",
"body": "Fixtures are a great way to mock data for responses to routes"
}

View File

@@ -1,51 +0,0 @@
// ***********************************************
// This example commands.js shows you how to
// create various custom commands and overwrite
// existing commands.
//
// For more comprehensive examples of custom
// commands please read more here:
// https://on.cypress.io/custom-commands
// ***********************************************
//
//
// -- This is a parent command --
// Cypress.Commands.add('login', (email, password) => { ... })
//
//
// -- This is a child command --
// Cypress.Commands.add('drag', { prevSubject: 'element'}, (subject, options) => { ... })
//
//
// -- This is a dual command --
// Cypress.Commands.add('dismiss', { prevSubject: 'optional'}, (subject, options) => { ... })
//
//
// -- This will overwrite an existing command --
// Cypress.Commands.overwrite('visit', (originalFn, url, options) => { ... })
Cypress.Commands.add('login', (username, password) => {
cy.visit('/login')
cy.get('input[placeholder="Enter your email..."]').type(username)
cy.get('input[placeholder="Enter your password..."]').type(password)
cy.contains('Continue with Email').click()
cy.url().should('include', '/overview')
// Need to make this work for CSRF tokens; Cypress has an example in the docs
// cy.session(
// username,
// () => {
// cy.visit('/login')
// cy.get('input[placeholder="Enter your email..."]').type(username)
// cy.get('input[placeholder="Enter your password..."]').type(password)
// cy.contains('Continue with Email').click()
// cy.url().should('include', '/overview')
// },
// {
// validate: () => {
// cy.getCookie('jid').should('exist')
// },
// }
// )
})

View File

@@ -1,20 +0,0 @@
// ***********************************************************
// This example support/e2e.js is processed and
// loaded automatically before your test files.
//
// This is a great place to put global configuration and
// behavior that modifies Cypress.
//
// You can change the location of this file or turn off
// automatically serving support files with the
// 'supportFile' configuration option.
//
// You can read more here:
// https://on.cypress.io/configuration
// ***********************************************************
// Import commands.js using ES2015 syntax:
import './commands'
// Alternatively you can use CommonJS syntax:
// require('./commands')

docs/CONTRIBUTING.MD Normal file
View File

@@ -0,0 +1,24 @@
# Contributing to the documentation
## Getting familiar with Mintlify
New to Mintlify? [Start Here](https://mintlify.com/docs/quickstart)
## 👩‍💻 Development
Install the [Mintlify CLI](https://www.npmjs.com/package/mintlify) to preview the documentation changes locally. To install, use the following command:
```
npm i -g mintlify
```
Run the following command at the root of your documentation (where `mint.json` is located):
```
mintlify dev
```
## Troubleshooting
- Mintlify dev isn't running - Run `mintlify install`; it will re-install the dependencies.
- Page loads as a 404 - Make sure you are running the command in a folder that contains `mint.json`, such as the `/docs` folder.

View File

@@ -0,0 +1,37 @@
---
title: "MySQL/MariaDB"
description: "Rotate the database user password of a MySQL or MariaDB database"
---
Infisical will periodically update the provided database user's password.
<Warning>
At present, Infisical requires direct access to your database. Agent-based rotation, which will let you rotate secrets without giving Infisical Cloud direct database access, will be released soon.
</Warning>
## How it works
1. You create the two database users that Infisical will rotate and grant them the required database access
2. Infisical connects to your database with admin access
3. If username1 was rotated last, then username2 is chosen for the next rotation
4. Its password is updated with a random value
5. After the new credentials are tested, they are saved to the provided secret mapping (see the sketch below)
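For intuition, the sketch below shows roughly what a single rotation step could look like with the `mysql2` client. The function name, input shape, and password policy are illustrative assumptions for this guide, not Infisical's actual implementation, and it assumes both users were created with host `'%'`.
```ts
// Illustrative only: a simplified alternating-user rotation step.
import { randomBytes } from "crypto";
import { escape } from "mysql2";
import { createConnection } from "mysql2/promise";

type RotationInput = {
  adminUsername: string;
  adminPassword: string;
  host: string;
  port: number;
  username1: string;
  username2: string;
  ca?: string;                  // optional CA certificate for SSL connections
  lastRotatedUsername?: string; // whichever of username1/username2 was rotated last
};

export const rotateMySqlUser = async (input: RotationInput) => {
  // Step 3: rotate the user that was NOT rotated last time.
  const targetUser =
    input.lastRotatedUsername === input.username1 ? input.username2 : input.username1;
  // Step 4: generate a random password.
  const newPassword = randomBytes(32).toString("hex");

  // Step 2: connect with admin access (over SSL when a CA certificate is provided).
  const adminConn = await createConnection({
    host: input.host,
    port: input.port,
    user: input.adminUsername,
    password: input.adminPassword,
    ssl: input.ca ? { ca: input.ca } : undefined
  });
  try {
    await adminConn.query(
      `ALTER USER ${escape(targetUser)}@'%' IDENTIFIED BY ${escape(newPassword)}`
    );
  } finally {
    await adminConn.end();
  }

  // Step 5: test the new credentials before they are saved to the secret mapping.
  const testConn = await createConnection({
    host: input.host,
    port: input.port,
    user: targetUser,
    password: newPassword,
    ssl: input.ca ? { ca: input.ca } : undefined
  });
  await testConn.query("SELECT 1");
  await testConn.end();

  return { username: targetUser, password: newPassword };
};
```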
## Rotation Configuration
1. Head over to the Secret Rotation configuration page of your project by clicking `Secret Rotation` in the sidebar
2. Click on `MySQL`
3. Provide the inputs
- Admin Username: DB admin username
- Admin Password: DB admin password
- Host: DB host
- Port: DB port (number)
- Username1: The first of the two usernames to rotate
- Username2: The second of the two usernames to rotate
- CA: Certificate used to connect to the database (string)
4. Final step
- Select the `Environment`, `Secret Path`, and `Interval` at which to rotate the secrets
- Finally, select the secrets on the provided board to replace with the new secret after each rotation
- You're done and good to go.
Congrats. You have 10x'd your MySQL/MariaDB access security.

View File

@@ -0,0 +1,44 @@
# Secret Rotation Overview
## Introduction
Secret rotation is a process that involves updating secret credentials periodically to minimize the risk of their compromise.
Rotating secrets helps prevent unauthorized access to systems and sensitive data by ensuring that old credentials are replaced with new ones regularly.
Rotated secrets may include, but are not limited to:
1. API keys for external services
2. Database credentials for various platforms
## Rotation Process
The practice of rotating secrets is a systematic and interval-based operation, carried out in four fundamental phases.
### 1. Creation
The system initiates the rotation process by either making an API call to an external service or generating a new secret value internally.
Upon successful creation, the system will temporarily have three versions of the secret:
- **Current active secret**: The one currently in use.
- **Future active secret (pending)**: The newly created secret, awaiting validation.
- **Previous active secret**: The old secret, soon to be retired.
### 2. Testing
The newly generated secret is subjected to a verification process to ensure its validity and functionality.
This involves conducting checks or tests that simulate actual operations the secret would perform.
Only the current active and the future active (pending) secrets are considered operational at this stage, while the previous active secret remains in standby mode.
### 3. Deletion
Post-verification, the system deactivates and deletes the previous active secret, leaving only the current and future active (pending) secrets in the system.
### 4. Activation
Finally, the system promotes the future active (pending) secret to be the new current active secret. It then triggers necessary side effects, such as invoking webhooks and generating events, to notify other services of the change.
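As a rough illustration of the phase ordering described above, here is a minimal TypeScript sketch of a single rotation cycle. The `provider` and `onActivated` interfaces are hypothetical placeholders for whatever external API call or internal generator a given strategy uses; they are not part of Infisical's API.
```ts
// Minimal sketch of the four-phase cycle above; all names are illustrative.
type SecretVersions<T> = {
  previous?: T; // previous active secret, soon to be retired
  current: T;   // current active secret
  pending?: T;  // future active secret, awaiting validation
};

type RotationProvider<T> = {
  create: () => Promise<T>;              // call an external API or generate internally
  test: (secret: T) => Promise<boolean>; // simulate actual operations with the secret
  revoke: (secret: T) => Promise<void>;
};

export const runRotationCycle = async <T>(
  versions: SecretVersions<T>,
  provider: RotationProvider<T>,
  onActivated: (secret: T) => Promise<void> // e.g. fire webhooks, emit events
): Promise<SecretVersions<T>> => {
  // 1. Creation: three versions temporarily coexist (previous, current, pending).
  const pending = await provider.create();

  // 2. Testing: verify the pending secret before promoting it.
  if (!(await provider.test(pending))) {
    await provider.revoke(pending); // discard the failed candidate, keep the current secret
    throw new Error("Pending secret failed verification");
  }

  // 3. Deletion: deactivate and delete the previous active secret.
  if (versions.previous) {
    await provider.revoke(versions.previous);
  }

  // 4. Activation: promote the pending secret and trigger side effects.
  await onActivated(pending);
  return { previous: versions.current, current: pending };
};
```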
## Infisical Secret Rotation Strategies
1. [SendGrid Integration](./sendgrid)
2. [PostgreSQL/CockroachDB Implementation](./postgres)
3. [MySQL/MariaDB Configuration](./mysql)

Some files were not shown because too many files have changed in this diff.