Compare commits

...

143 Commits

Author SHA1 Message Date
086ce0d2a6 Merge pull request #918 from Infisical/revert-917-snyk-fix-29828c58f69ea88c3d50dad65d7767d2
Revert "[Snyk] Fix for 1 vulnerabilities"
2023-08-30 16:36:22 -04:00
06dec29773 Revert "[Snyk] Fix for 1 vulnerabilities" 2023-08-30 16:35:44 -04:00
ed8e942a5d Update low entropy password error message 2023-08-30 21:28:35 +01:00
e770bdde24 Update low entropy password error message 2023-08-30 21:27:31 +01:00
a84dab1219 Merge pull request #917 from Infisical/snyk-fix-29828c58f69ea88c3d50dad65d7767d2
[Snyk] Fix for 1 vulnerabilities
2023-08-30 16:26:20 -04:00
02d9d7b6a4 fix: backend/package.json & backend/package-lock.json to reduce vulnerabilities
The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-JS-MONGODB-5871303
2023-08-30 20:05:26 +00:00
f21eb3b7c8 Patch GCP secret manager integration edge-case 2023-08-30 21:04:39 +01:00
219e3884e7 Merge pull request #912 from Infisical/integration-suffixes
Added suffixes to the Checkly integration
2023-08-30 10:29:08 +01:00
41cd8b7408 Move secretSuffix to separate metadata field 2023-08-30 10:04:44 +01:00
f6be86a26b Added suffixes to integrations 2023-08-29 22:17:48 -07:00
85e5822ece Merge pull request #908 from akhilmhdh/fix/sec-override-fail
fix: resolved personal override not showing up
2023-08-29 14:07:09 -04:00
5c9e89a8e2 Merge pull request #904 from Infisical/dashboard-get-secrets
Rewire dashboard to pull from v3/secrets with folderId support
2023-08-29 13:54:37 -04:00
46a77d5e58 Merge pull request #909 from Infisical/team-city-branch-config
Add support for build-configuration environment variable sync for TeamCity integration
2023-08-29 14:43:17 +01:00
a6e9643464 Finish adding support for build-configuration level syncs for TeamCity integration 2023-08-29 14:37:58 +01:00
affa2ee695 fix: resolved personal override not showing up 2023-08-29 12:23:12 +05:30
dc0d577cbb Patch TeamCity integration 2023-08-29 07:46:11 +01:00
9e8ddd2956 Merge pull request #907 from ragnarbull/patch-1
Update overview.mdx
2023-08-28 17:41:26 -07:00
b40b876fb2 Update overview.mdx
New password criteria + keep formatting consistent
2023-08-29 10:20:15 +10:00
2ba6a65da4 Change order of password check 2023-08-28 11:43:40 +01:00
76cf79d201 Merge pull request #885 from ragnarbull/ragnarbull-auth-pwd-fixes
Password fixes - enforce max length, add checks (pwd breach, PII, low entropy), improved UX, deprecate common-passwords api
2023-08-28 11:33:57 +01:00
a79c6227b1 Fix frontend lint issues 2023-08-28 11:25:50 +01:00
f1f64e6ff5 Fix flaky regex g flag causing unexpected validation password validation issue 2023-08-28 11:08:00 +01:00
d72ddfe315 Rewire dashboard to pull from v3/secrets with folderId support 2023-08-28 09:12:04 +01:00
f924d0c02c Update kubernetes-helm.mdx 2023-08-27 22:39:19 -07:00
ef1b75d890 remove the use of aggregation for documentDB compatibility 2023-08-27 14:41:35 -04:00
d8094b2ab1 Merge pull request #903 from Infisical/integration-setup-docs
Add self-hosted setup/configuration docs for OAuth2 integrations
2023-08-27 12:16:26 +01:00
ad61fa845c Add self-hosted configuration docs for GitHub, GitLab, GCP SM, Vercel, Heroku, Netlify, Azure KV 2023-08-27 12:14:17 +01:00
6bb5e7078f Merge pull request #902 from Infisical/gcp-integration
GCP Secret Manager Integration
2023-08-26 17:42:59 +01:00
a07ddb806d Finish GCP secret manager integration 2023-08-26 17:36:20 +01:00
6e7d3d6912 Merge pull request #901 from Infisical/environment-api
Expose CRUD Environment Operations to Public REST API
2023-08-26 08:49:35 +01:00
84a866eb88 Add API Key auth method to environment endpoints, add endpoints to public REST API docs 2023-08-26 08:47:36 +01:00
9416fca832 update to doc5.0 engine 2023-08-25 17:23:31 -04:00
2ea518b107 add redis to cloud formation 2023-08-25 15:35:11 -04:00
62399dd293 Merge pull request #897 from akhilmhdh/fix/sec-v3-fail
fix: moved backend get sec to v2 for dashboard
2023-08-25 12:09:04 -04:00
16f1360550 fix: moved backend get sec to v2 for dashboard 2023-08-25 21:37:05 +05:30
a99751eb72 Moved pwd checks into a subfolder 2023-08-25 12:36:53 +10:00
9ea414fb25 Merge pull request #894 from akhilmhdh/fix/multi-line-html-encode
fix(multi-line): resolved breaking ui when secret value contains < or >
2023-08-24 22:12:42 -04:00
a9fa3ebab2 update post hog event name 2023-08-24 19:01:59 -04:00
293a62b632 update secrets posthog event logic 2023-08-24 18:48:46 -04:00
a1f08b064e add tags support in secret imports 2023-08-24 17:21:14 -04:00
50977cf788 reduce k8 events 2023-08-24 15:41:29 -04:00
fccec083a9 fix(multi-line): resolved breaking ui when secret value contains < or > 2023-08-24 23:07:58 +05:30
63af7d4a15 Merge remote-tracking branch 'origin' into gcp-integration 2023-08-25 00:35:11 +07:00
ab3533ce1c Checkpoint GCP secret manager integration 2023-08-25 00:34:46 +07:00
4d6a8f0476 Fixed form (error messages too long). Consolidated tests & errors. Moved regexes to another file. Added regex to check for PII & reject pwd if true. Confirmed hashing & encryption/decryption works with top 50 languages, emojis etc (screen videos & unit tests to come). 2023-08-25 01:44:02 +10:00
688cf91eb7 Removed unnecessary validator library & @types/validator in favor of yup 2023-08-24 14:08:11 +10:00
8ee6710e9b Merge pull request #889 from EBEN4REAL/custom-tag-colors
Custom tag colors
2023-08-23 21:03:46 -07:00
14fc78eaaf Switched to crypto.subtle, cleaned up code, added types & properly cleared sensitive data from memory (even if error) 2023-08-24 14:01:26 +10:00
9fa28f5b5e Fix: added empty string as default for tag color and added regex to resolve issue with multiple spacing in tag names. 2023-08-24 03:59:49 +01:00
368855a44e >>> yup for email & url validation, fixed minor err in error msgs 2023-08-24 12:59:24 +10:00
ae375916e8 Fix: added nullable check for adding tag color in project settings 2023-08-24 03:39:46 +01:00
21f1648998 Merge pull request #887 from Infisical/signup-secret-tagging
Update signup secret distinction/tagging for better telemetry
2023-08-23 19:23:44 -07:00
88695a2f8c Merge pull request #884 from monto7926/sortable-secrets-overview
feat: make secrets overview sortable
2023-08-23 17:47:34 -07:00
77114e02cf fixed the import linting issues 2023-08-23 17:42:29 -07:00
3ac1795a5b Update kubernetes-helm.mdx 2023-08-23 17:42:07 -04:00
8d6f59b253 up infisical chart version 2023-08-23 17:15:30 -04:00
7fd77b14ff print default connection string in helm 2023-08-23 17:14:09 -04:00
8d3d7d98e3 chore: updated style for tag color label 2023-08-23 18:50:24 +01:00
6cac879ed0 chore: removed console log 2023-08-23 16:46:06 +01:00
ac66834daa chore: fixed error with typings 2023-08-23 16:36:48 +01:00
0616f24923 Merge pull request #866 from Killian-Smith/email-case-sensitive
fix: normalize email when inviting memebers and logging in.
2023-08-23 18:08:28 +07:00
4e1abc6eba Add login email lowercasing to backend 2023-08-23 18:02:18 +07:00
8f57377130 Merge remote-tracking branch 'origin' into email-case-sensitive 2023-08-23 17:50:46 +07:00
2d7c7f075e Remove metadata from SecretVersion schema 2023-08-23 17:47:25 +07:00
c342b22d49 Fix telemetry issue for signup secrets 2023-08-23 17:37:01 +07:00
b8120f7512 Merge pull request #886 from Infisical/audit-log-paywall
Add paywall to Audit Logs V2
2023-08-23 17:00:27 +07:00
ca18883bd3 Add paywall for audit logs v2 2023-08-23 16:55:07 +07:00
8b381b2b80 Checkpoint add metadata to secret and secret version data structure 2023-08-23 16:30:42 +07:00
6bcf5cb54c override secrets before expand 2023-08-22 23:37:32 -04:00
51b425dceb swap out v2 login 2023-08-22 23:37:32 -04:00
7ec00475c6 +maxRetryAttempts, padding & safer error handling. Improved readability & comments. 2023-08-23 12:59:00 +10:00
84840bddb5 Merge branch 'main' of https://github.com/Infisical/infisical 2023-08-22 15:10:30 -07:00
93640c9d69 added tooltips to the sercret overview 2023-08-22 15:10:18 -07:00
ec856f0bcc remove return from integration loop 2023-08-22 21:18:18 +00:00
3e46bec6f7 add simple api to trigger integration sync 2023-08-22 14:55:08 -04:00
25fc508d5e Fixed spelling 2023-08-23 02:56:03 +10:00
ea262da505 Added check that password is not an email address 2023-08-23 02:14:22 +10:00
954806d950 chore: code cleanup 2023-08-22 17:59:11 +02:00
2960f86647 Fix comments explaining "international" password requirements 2023-08-23 01:41:37 +10:00
b2888272f2 Added password criterion support for multiple languages and emojis 2023-08-23 01:27:30 +10:00
d6d3302659 feat: make secrets overview sortable 2023-08-22 17:21:21 +02:00
e5c87442e5 Changed to use ES2018 rather than load scripts 2023-08-23 01:04:52 +10:00
be08417c8b internationalize password requirements 2023-08-23 00:48:45 +10:00
61e44e152c optimised import 2023-08-22 23:47:33 +10:00
52c4f64655 Removed log and fixed comments 2023-08-22 23:36:24 +10:00
81743d55ab fix infisical radar app name 2023-08-22 09:35:31 -04:00
3e36adcf5c Removed all references to commonPasswords & the data file. This api route can be deprecated in favor of the client-side secure call to the haveIBeenPwnd password API. Further the datafile contains no passwords that meet the minimum password criteria. 2023-08-22 23:30:24 +10:00
1f60a3d73e fixed more error handling for password checks & translations 2023-08-22 22:42:02 +10:00
00089a6bba Added breached pwd error translations 2023-08-22 20:57:12 +10:00
026ea29847 further fixes to password check logic 2023-08-22 20:42:07 +10:00
1242d88acb Fixed breached pwd error messages 2023-08-22 20:20:54 +10:00
f47a119474 fixed breached pwd error messages 2023-08-22 20:20:13 +10:00
0b359cd797 Made breached pwd API comments clearer 2023-08-22 19:45:35 +10:00
c5ae402787 Added comments to explain breach passwords API 2023-08-22 18:14:03 +10:00
e288402ec4 Properly added pwndpasswords API to CSP 2023-08-22 17:58:10 +10:00
196beb8355 removed logs & added pwndpasswords.com api to CSP 2023-08-22 17:50:43 +10:00
d6222d5cee attempt to fix crypto.subtle issue 2023-08-22 17:33:35 +10:00
e855d4a0ba added types for crypto 2023-08-22 17:26:00 +10:00
20f34b4764 removed async in crypto.subtle 2023-08-22 17:14:18 +10:00
0eb21919fb Password breach check 2023-08-22 16:49:17 +10:00
fbeb210965 add to pwd length issue 2023-08-22 15:34:45 +10:00
0d1aa713ea added translations for error messges (used Google translate) 2023-08-22 14:57:02 +10:00
9a1b453c86 Feat: added tag color widgt and changed tag popover design 2023-08-22 05:12:23 +01:00
534d96ffb6 Set max password length (100 chars) to help prevent DDOS attack 2023-08-22 14:05:00 +10:00
5b342409e3 Merge pull request #815 from Infisical/snyk-fix-477e109149f5e5a943a435c5bf8814b7
[Snyk] Security upgrade winston-loki from 6.0.6 to 6.0.7
2023-08-21 16:02:02 -04:00
a9f54009b8 Merge pull request #848 from narindraditantyo/fix/rootless-frontend-image
fix: frontend image displaying some errors due to sed write permission
2023-08-21 15:54:29 -04:00
82947e183c Merge pull request #851 from sreehari2003/main
fix: form not submitting on keyboard enter
2023-08-21 15:53:15 -04:00
eb7ef2196a Merge pull request #872 from iamunnip/blogs
added blog link for setting up infisical in developement cluster
2023-08-21 14:09:18 -04:00
ad3801ce36 Merge pull request #882 from akhilmhdh/feat/integration-var-not-found
fix(integration): instead of throwing error console and return empty string on interpolation
2023-08-21 13:51:16 -04:00
b7aac1a465 fix(integration): instead of throwing error console and return empty string on interpolation 2023-08-21 20:06:24 +05:30
e28ced8eed Provide default path for logging dashboard secrets event 2023-08-21 18:27:18 +07:00
4a95f936ea Correct enable blind-indexing web ui rendering condition 2023-08-21 17:27:32 +07:00
85a39c60bb Fix query condition on delete secret v3 2023-08-21 16:51:31 +07:00
66ea3ba172 feat: added custom design for tags 2023-08-20 10:02:40 +01:00
01d91c0dc7 update helm version 2023-08-19 17:19:42 -04:00
dedd27a781 remove unsed redis template 2023-08-19 17:19:07 -04:00
57a6d1fff6 fix syntax error in helm chart 2023-08-19 14:47:46 -04:00
554f0c79a4 update redis doc 2023-08-19 14:31:28 -04:00
2af88d4c99 Merge pull request #843 from Infisical/add-bull-queue
add bull queue
2023-08-19 14:13:34 -04:00
fc8b567352 fix syntax error in /api/status 2023-08-19 14:03:02 -04:00
ec234e198a Merge branch 'main' into add-bull-queue 2023-08-19 13:46:26 -04:00
6e1cc12e3a update redis banner text 2023-08-19 13:43:01 -04:00
1b4b7a967b fix docs typos 2023-08-19 13:42:33 -04:00
e47d6b7f2f added blog link for setting up infisical in developement cluster 2023-08-19 08:59:58 +05:30
45a13d06b5 add redis why docs & update redis notice 2023-08-18 21:20:20 -04:00
4a48c088df Merge pull request #868 from daninge98/custom-environment-sorting
Adds user customizable environment ordering
2023-08-18 17:05:37 -07:00
2b65f65063 Rename things and fix bug in error checking 2023-08-18 17:33:59 -04:00
c0ce92cf3d Formattting fix 2023-08-16 17:42:39 -04:00
0073fe459e Fix typo 2023-08-16 17:37:41 -04:00
a7f52a9298 Small formatting fixes 2023-08-16 17:36:07 -04:00
29c0d8ab57 Enable users to change the ordering of environments 2023-08-16 17:30:50 -04:00
cb42db3de4 Normalize email when inviting memebers and logging in. 2023-08-15 15:57:27 +01:00
90517258a2 added redis note 2023-08-14 18:30:40 -07:00
d78b37c632 add redis docs 2023-08-14 16:25:16 -04:00
4a6fc9e84f remove console.log and add redis to /status api 2023-08-14 16:24:43 -04:00
8030104c02 update helm read me with redis config details 2023-08-14 15:02:22 -04:00
9652d534b6 fix: moved handler to form submission 2023-08-13 14:00:30 +05:30
f650cd3925 fix: form not submitting on keyboard enter 2023-08-13 00:54:22 +05:30
8a514e329f fix: frontend image displaying some errors due to sed write permission 2023-08-12 21:53:12 +07:00
01e613301a console.log queue errors 2023-08-11 19:47:15 -04:00
b11cd29943 close all queues 2023-08-10 19:13:09 -04:00
dfe95ac773 add bull queue 2023-08-10 17:22:20 -04:00
bb466dbe1c fix: backend/package.json & backend/package-lock.json to reduce vulnerabilities
The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-JS-PROTOBUFJS-5756498
2023-08-01 15:50:51 +00:00
245 changed files with 8506 additions and 3472 deletions
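Several commits above (e.g. 0eb21919fb "Password breach check" and 3e36adcf5c, which deprecates the common-passwords endpoint in favor of a client-side call to the Have I Been Pwned password API) rely on that API's k-anonymity range scheme. A minimal sketch of the client-side hashing step — the helper name is illustrative, not taken from the diff:

```typescript
import { createHash } from "node:crypto";

// Only the first five hex characters of the password's SHA-1 hash are ever
// sent to the range endpoint (https://api.pwnedpasswords.com/range/<prefix>);
// the returned suffixes are compared locally, so the password itself never
// leaves the client. The helper name is hypothetical.
export function hibpRange(password: string): { prefix: string; suffix: string } {
  const digest = createHash("sha1").update(password, "utf8").digest("hex").toUpperCase();
  return { prefix: digest.slice(0, 5), suffix: digest.slice(5) };
}
```

A caller would GET `/range/<prefix>` and check whether `suffix` appears in the response body.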


@@ -25,6 +25,9 @@ JWT_PROVIDER_AUTH_LIFETIME=
# Required
MONGO_URL=mongodb://root:example@mongo:27017/?authSource=admin
# Redis
REDIS_URL=redis://redis:6379
# Optional credentials for MongoDB container instance and Mongo-Express
MONGO_USERNAME=root
MONGO_PASSWORD=example
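The example above wires the backend to Redis through a single connection string. A small sketch of how such a URL can be validated before handing it to a queue client — the helper name is hypothetical; the actual code passes `REDIS_URL` straight to the Bull queue:

```typescript
// Hypothetical helper: validate REDIS_URL and pull out host/port with the
// WHATWG URL parser. Redis defaults to port 6379 when the URL omits it.
export function parseRedisUrl(redisUrl: string): { host: string; port: number } {
  const url = new URL(redisUrl);
  if (url.protocol !== "redis:" && url.protocol !== "rediss:") {
    throw new Error(`unsupported Redis protocol: ${url.protocol}`);
  }
  return { host: url.hostname, port: url.port ? Number(url.port) : 6379 };
}
```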


@@ -1,3 +0,0 @@
{
"workbench.editor.wrapTabs": true
}

backend/package-lock.json (generated; 4289 changed lines)

File diff suppressed because it is too large


@@ -50,7 +50,7 @@
"typescript": "^4.9.3",
"utility-types": "^3.10.0",
"winston": "^3.8.2",
"winston-loki": "^6.0.6"
"winston-loki": "^6.0.7"
},
"name": "infisical-api",
"version": "1.0.0",
@@ -84,6 +84,7 @@
"@posthog/plugin-scaffold": "^1.3.4",
"@types/bcrypt": "^5.0.0",
"@types/bcryptjs": "^2.4.2",
"@types/bull": "^4.10.0",
"@types/cookie-parser": "^1.4.3",
"@types/cors": "^2.8.12",
"@types/express": "^4.17.14",


@@ -3203,6 +3203,9 @@
"name": {
"example": "any"
},
"tagColor": {
"example": "any"
},
"slug": {
"example": "any"
}


@@ -37,6 +37,7 @@ export const getClientIdNetlify = async () => (await client.getSecret("CLIENT_ID
export const getClientIdGitHub = async () => (await client.getSecret("CLIENT_ID_GITHUB")).secretValue;
export const getClientIdGitLab = async () => (await client.getSecret("CLIENT_ID_GITLAB")).secretValue;
export const getClientIdBitBucket = async () => (await client.getSecret("CLIENT_ID_BITBUCKET")).secretValue;
export const getClientIdGCPSecretManager = async () => (await client.getSecret("CLIENT_ID_GCP_SECRET_MANAGER")).secretValue;
export const getClientSecretAzure = async () => (await client.getSecret("CLIENT_SECRET_AZURE")).secretValue;
export const getClientSecretHeroku = async () => (await client.getSecret("CLIENT_SECRET_HEROKU")).secretValue;
export const getClientSecretVercel = async () => (await client.getSecret("CLIENT_SECRET_VERCEL")).secretValue;
@@ -44,6 +45,7 @@ export const getClientSecretNetlify = async () => (await client.getSecret("CLIEN
export const getClientSecretGitHub = async () => (await client.getSecret("CLIENT_SECRET_GITHUB")).secretValue;
export const getClientSecretGitLab = async () => (await client.getSecret("CLIENT_SECRET_GITLAB")).secretValue;
export const getClientSecretBitBucket = async () => (await client.getSecret("CLIENT_SECRET_BITBUCKET")).secretValue;
export const getClientSecretGCPSecretManager = async () => (await client.getSecret("CLIENT_SECRET_GCP_SECRET_MANAGER")).secretValue;
export const getClientSlugVercel = async () => (await client.getSecret("CLIENT_SLUG_VERCEL")).secretValue;
export const getClientIdGoogleLogin = async () => (await client.getSecret("CLIENT_ID_GOOGLE_LOGIN")).secretValue;
@@ -68,6 +70,8 @@ export const getSecretScanningWebhookSecret = async () => (await client.getSecre
export const getSecretScanningGitAppId = async () => (await client.getSecret("SECRET_SCANNING_GIT_APP_ID")).secretValue;
export const getSecretScanningPrivateKey = async () => (await client.getSecret("SECRET_SCANNING_PRIVATE_KEY")).secretValue;
export const getRedisUrl = async () => (await client.getSecret("REDIS_URL")).secretValue;
export const getLicenseKey = async () => {
const secretValue = (await client.getSecret("LICENSE_KEY")).secretValue;
return secretValue === "" ? undefined : secretValue;


@@ -1,32 +1,20 @@
import { Request, Response } from "express";
import fs from "fs";
import path from "path";
import jwt from "jsonwebtoken";
import * as bigintConversion from "bigint-conversion";
// eslint-disable-next-line @typescript-eslint/no-var-requires
const jsrp = require("jsrp");
import {
LoginSRPDetail,
TokenVersion,
User,
} from "../../models";
import { LoginSRPDetail, TokenVersion, User } from "../../models";
import { clearTokens, createToken, issueAuthTokens } from "../../helpers/auth";
import { checkUserDevice } from "../../helpers/user";
import {
ACTION_LOGIN,
ACTION_LOGOUT,
} from "../../variables";
import {
BadRequestError,
UnauthorizedRequestError,
} from "../../utils/errors";
import { ACTION_LOGIN, ACTION_LOGOUT } from "../../variables";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
import { EELogService } from "../../ee/services";
import { getUserAgentType } from "../../utils/posthog";
import {
getHttpsEnabled,
getJwtAuthLifetime,
getJwtAuthSecret,
getJwtRefreshSecret,
getJwtRefreshSecret
} from "../../config";
import { ActorType } from "../../ee/models";
@@ -44,13 +32,10 @@ declare module "jsonwebtoken" {
* @returns
*/
export const login1 = async (req: Request, res: Response) => {
const {
email,
clientPublicKey,
}: { email: string; clientPublicKey: string } = req.body;
const { email, clientPublicKey }: { email: string; clientPublicKey: string } = req.body;
const user = await User.findOne({
email,
email
}).select("+salt +verifier");
if (!user) throw new Error("Failed to find user");
@@ -59,21 +44,25 @@ export const login1 = async (req: Request, res: Response) => {
server.init(
{
salt: user.salt,
verifier: user.verifier,
verifier: user.verifier
},
async () => {
// generate server-side public key
const serverPublicKey = server.getPublicKey();
await LoginSRPDetail.findOneAndReplace({ email: email }, {
email: email,
clientPublicKey: clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt),
}, { upsert: true, returnNewDocument: false })
await LoginSRPDetail.findOneAndReplace(
{ email: email },
{
email: email,
clientPublicKey: clientPublicKey,
serverBInt: bigintConversion.bigintToBuf(server.bInt)
},
{ upsert: true, returnNewDocument: false }
);
return res.status(200).send({
serverPublicKey,
salt: user.salt,
salt: user.salt
});
}
);
@@ -89,15 +78,19 @@ export const login1 = async (req: Request, res: Response) => {
export const login2 = async (req: Request, res: Response) => {
const { email, clientProof } = req.body;
const user = await User.findOne({
email,
email
}).select("+salt +verifier +publicKey +encryptedPrivateKey +iv +tag");
if (!user) throw new Error("Failed to find user");
const loginSRPDetailFromDB = await LoginSRPDetail.findOneAndDelete({ email: email })
const loginSRPDetailFromDB = await LoginSRPDetail.findOneAndDelete({ email: email });
if (!loginSRPDetailFromDB) {
return BadRequestError(Error("It looks like some details from the first login are not found. Please try login one again"))
return BadRequestError(
Error(
"It looks like some details from the first login are not found. Please try login one again"
)
);
}
const server = new jsrp.server();
@@ -105,7 +98,7 @@ export const login2 = async (req: Request, res: Response) => {
{
salt: user.salt,
verifier: user.verifier,
b: loginSRPDetailFromDB.serverBInt,
b: loginSRPDetailFromDB.serverBInt
},
async () => {
server.setClientPublicKey(loginSRPDetailFromDB.clientPublicKey);
@@ -117,13 +110,13 @@ export const login2 = async (req: Request, res: Response) => {
await checkUserDevice({
user,
ip: req.realIP,
userAgent: req.headers["user-agent"] ?? "",
userAgent: req.headers["user-agent"] ?? ""
});
const tokens = await issueAuthTokens({
const tokens = await issueAuthTokens({
userId: user._id,
ip: req.realIP,
userAgent: req.headers["user-agent"] ?? "",
userAgent: req.headers["user-agent"] ?? ""
});
// store (refresh) token in httpOnly cookie
@@ -131,20 +124,21 @@ export const login2 = async (req: Request, res: Response) => {
httpOnly: true,
path: "/",
sameSite: "strict",
secure: await getHttpsEnabled(),
secure: await getHttpsEnabled()
});
const loginAction = await EELogService.createAction({
name: ACTION_LOGIN,
userId: user._id,
userId: user._id
});
loginAction && await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getUserAgentType(req.headers["user-agent"]),
ipAddress: req.realIP,
});
loginAction &&
(await EELogService.createLog({
userId: user._id,
actions: [loginAction],
channel: getUserAgentType(req.headers["user-agent"]),
ipAddress: req.realIP
}));
// return (access) token in response
return res.status(200).send({
@@ -152,12 +146,12 @@ export const login2 = async (req: Request, res: Response) => {
publicKey: user.publicKey,
encryptedPrivateKey: user.encryptedPrivateKey,
iv: user.iv,
tag: user.tag,
tag: user.tag
});
}
return res.status(400).send({
message: "Failed to authenticate. Try again?",
message: "Failed to authenticate. Try again?"
});
}
);
@@ -171,7 +165,7 @@
*/
export const logout = async (req: Request, res: Response) => {
if (req.authData.actor.type === ActorType.USER && req.authData.tokenVersionId) {
await clearTokens(req.authData.tokenVersionId)
await clearTokens(req.authData.tokenVersionId);
}
// clear httpOnly cookie
@@ -179,49 +173,44 @@ export const logout = async (req: Request, res: Response) => {
httpOnly: true,
path: "/",
sameSite: "strict",
secure: (await getHttpsEnabled()) as boolean,
secure: (await getHttpsEnabled()) as boolean
});
const logoutAction = await EELogService.createAction({
name: ACTION_LOGOUT,
userId: req.user._id,
userId: req.user._id
});
logoutAction && await EELogService.createLog({
userId: req.user._id,
actions: [logoutAction],
channel: getUserAgentType(req.headers["user-agent"]),
ipAddress: req.realIP,
});
logoutAction &&
(await EELogService.createLog({
userId: req.user._id,
actions: [logoutAction],
channel: getUserAgentType(req.headers["user-agent"]),
ipAddress: req.realIP
}));
return res.status(200).send({
message: "Successfully logged out.",
message: "Successfully logged out."
});
};
export const getCommonPasswords = async (req: Request, res: Response) => {
const commonPasswords = fs.readFileSync(
path.resolve(__dirname, "../../data/" + "common_passwords.txt"),
"utf8"
).split("\n");
return res.status(200).send(commonPasswords);
}
export const revokeAllSessions = async (req: Request, res: Response) => {
await TokenVersion.updateMany({
user: req.user._id,
}, {
$inc: {
refreshVersion: 1,
accessVersion: 1,
await TokenVersion.updateMany(
{
user: req.user._id
},
});
{
$inc: {
refreshVersion: 1,
accessVersion: 1
}
}
);
return res.status(200).send({
message: "Successfully revoked all sessions.",
});
}
message: "Successfully revoked all sessions."
});
};
/**
* Return user is authenticated
@@ -231,9 +220,9 @@ export const revokeAllSessions = async (req: Request, res: Response) => {
*/
export const checkAuth = async (req: Request, res: Response) => {
return res.status(200).send({
message: "Authenticated",
message: "Authenticated"
});
}
};
/**
* Return new JWT access token by first validating the refresh token
@@ -244,47 +233,47 @@ export const checkAuth = async (req: Request, res: Response) => {
export const getNewToken = async (req: Request, res: Response) => {
const refreshToken = req.cookies.jid;
if (!refreshToken) throw BadRequestError({
message: "Failed to find refresh token in request cookies"
});
if (!refreshToken)
throw BadRequestError({
message: "Failed to find refresh token in request cookies"
});
const decodedToken = <jwt.UserIDJwtPayload>(
jwt.verify(refreshToken, await getJwtRefreshSecret())
);
const decodedToken = <jwt.UserIDJwtPayload>jwt.verify(refreshToken, await getJwtRefreshSecret());
const user = await User.findOne({
_id: decodedToken.userId,
_id: decodedToken.userId
}).select("+publicKey +refreshVersion +accessVersion");
if (!user) throw new Error("Failed to authenticate unfound user");
if (!user?.publicKey)
throw new Error("Failed to authenticate not fully set up account");
if (!user?.publicKey) throw new Error("Failed to authenticate not fully set up account");
const tokenVersion = await TokenVersion.findById(decodedToken.tokenVersionId);
if (!tokenVersion) throw UnauthorizedRequestError({
message: "Failed to validate refresh token",
});
if (!tokenVersion)
throw UnauthorizedRequestError({
message: "Failed to validate refresh token"
});
if (decodedToken.refreshVersion !== tokenVersion.refreshVersion) throw BadRequestError({
message: "Failed to validate refresh token",
});
if (decodedToken.refreshVersion !== tokenVersion.refreshVersion)
throw BadRequestError({
message: "Failed to validate refresh token"
});
const token = createToken({
payload: {
userId: decodedToken.userId,
tokenVersionId: tokenVersion._id.toString(),
accessVersion: tokenVersion.refreshVersion,
accessVersion: tokenVersion.refreshVersion
},
expiresIn: await getJwtAuthLifetime(),
secret: await getJwtAuthSecret(),
secret: await getJwtAuthSecret()
});
return res.status(200).send({
token,
token
});
};
export const handleAuthProviderCallback = (req: Request, res: Response) => {
res.redirect(`/login/provider/success?token=${encodeURIComponent(req.providerAuthToken)}`);
}
};


@@ -547,6 +547,57 @@ export const getIntegrationAuthNorthflankSecretGroups = async (req: Request, res
});
}
/**
* Return list of build configs for TeamCity project with id [appId]
* @param req
* @param res
* @returns
*/
export const getIntegrationAuthTeamCityBuildConfigs = async (req: Request, res: Response) => {
const appId = req.query.appId as string;
interface TeamCityBuildConfig {
id: string;
name: string;
projectName: string;
projectId: string;
href: string;
webUrl: string;
}
interface GetTeamCityBuildConfigsRes {
count: number;
href: string;
buildType: TeamCityBuildConfig[];
}
if (appId && appId !== "") {
const { data: { buildType } } = (
await standardRequest.get<GetTeamCityBuildConfigsRes>(`${req.integrationAuth.url}/app/rest/buildTypes`, {
params: {
locator: `project:${appId}`
},
headers: {
Authorization: `Bearer ${req.accessToken}`,
Accept: "application/json",
},
})
);
return res.status(200).send({
buildConfigs: buildType.map((buildConfig) => ({
name: buildConfig.name,
buildConfigId: buildConfig.id
}))
});
}
return res.status(200).send({
buildConfigs: []
});
}
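The handler above queries TeamCity's REST API with a `locator` query parameter scoped to the project. A sketch of the URL it effectively requests — the helper name is illustrative; the real code lets axios serialize `params`, so exact percent-encoding may differ:

```typescript
// Illustrative: build the TeamCity buildTypes URL with a project-scoped
// locator, mirroring the axios call in the handler above.
export function teamCityBuildTypesUrl(baseUrl: string, appId: string): string {
  const url = new URL(`${baseUrl}/app/rest/buildTypes`);
  url.searchParams.set("locator", `project:${appId}`);
  return url.toString();
}
```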
/**
* Delete integration authorization with id [integrationAuthId]
* @param req


@@ -1,13 +1,13 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { Integration } from "../../models";
import { Folder, Integration } from "../../models";
import { EventService } from "../../services";
import { eventStartIntegration } from "../../events";
import Folder from "../../models/folder";
import { getFolderByPath } from "../../services/FolderService";
import { BadRequestError } from "../../utils/errors";
import { EEAuditLogService } from "../../ee/services";
import { EventType } from "../../ee/models";
import { syncSecretsToActiveIntegrationsQueue } from "../../queues/integrations/syncSecretsToThirdPartyServices";
/**
* Create/initialize an (empty) integration for integration authorization
@@ -29,7 +29,8 @@ export const createIntegration = async (req: Request, res: Response) => {
owner,
path,
region,
secretPath
secretPath,
metadata
} = req.body;
const folders = await Folder.findOne({
@@ -64,7 +65,8 @@
region,
secretPath,
integration: req.integrationAuth.integration,
integrationAuth: new Types.ObjectId(integrationAuthId)
integrationAuth: new Types.ObjectId(integrationAuthId),
metadata
}).save();
if (integration) {
@@ -76,7 +78,7 @@
})
});
}
await EEAuditLogService.createAuditLog(
req.authData,
{
@@ -218,3 +220,15 @@ export const deleteIntegration = async (req: Request, res: Response) => {
integration
});
};
// Will trigger sync for all integrations within the given env and workspace id
export const manualSync = async (req: Request, res: Response) => {
const { workspaceId, environment } = req.body;
syncSecretsToActiveIntegrationsQueue({
workspaceId,
environment
})
res.status(200).send()
};


@@ -1,8 +1,6 @@
import { Request, Response } from "express";
import { isValidScope, validateMembership } from "../../helpers";
import { ServiceTokenData } from "../../models";
import Folder from "../../models/folder";
import SecretImport from "../../models/secretImports";
import { Folder, SecretImport, ServiceTokenData } from "../../models";
import { getAllImportedSecrets } from "../../services/SecretImportService";
import { getFolderWithPathFromId } from "../../services/FolderService";
import { BadRequestError, ResourceNotFoundError,UnauthorizedRequestError } from "../../utils/errors";


@@ -4,8 +4,7 @@ import { EventType, FolderVersion } from "../../ee/models";
import { EEAuditLogService, EESecretService } from "../../ee/services";
import { validateMembership } from "../../helpers/membership";
import { isValidScope } from "../../helpers/secrets";
import { Secret, ServiceTokenData } from "../../models";
import Folder from "../../models/folder";
import { Folder, Secret, ServiceTokenData } from "../../models";
import {
appendFolder,
deleteFolderById,


@@ -2,7 +2,7 @@ import { Request, Response } from "express";
import { Types } from "mongoose";
import { client, getRootEncryptionKey } from "../../config";
import { validateMembership } from "../../helpers";
import Webhook from "../../models/webhooks";
import { Webhook } from "../../models";
import { getWebhookPayload, triggerWebhookRequest } from "../../services/WebhookService";
import { BadRequestError, ResourceNotFoundError } from "../../utils/errors";
import { EEAuditLogService } from "../../ee/services";


@@ -85,6 +85,43 @@ export const createWorkspaceEnvironment = async (
});
};
/**
* Swaps the ordering of two environments in the database. This is purely for aesthetic purposes.
* @param req
* @param res
* @returns
*/
export const reorderWorkspaceEnvironments = async (
req: Request,
res: Response
) => {
const { workspaceId } = req.params;
const { environmentSlug, environmentName, otherEnvironmentSlug, otherEnvironmentName } = req.body;
// atomic update the env to avoid conflict
const workspace = await Workspace.findById(workspaceId).exec();
if (!workspace) {
throw BadRequestError({message: "Couldn't load workspace"});
}
const environmentIndex = workspace.environments.findIndex((env) => env.name === environmentName && env.slug === environmentSlug)
const otherEnvironmentIndex = workspace.environments.findIndex((env) => env.name === otherEnvironmentName && env.slug === otherEnvironmentSlug)
if (environmentIndex === -1 || otherEnvironmentIndex === -1) {
throw BadRequestError({message: "environment or otherEnvironment couldn't be found"})
}
// swap the order of the environments
[workspace.environments[environmentIndex], workspace.environments[otherEnvironmentIndex]] = [workspace.environments[otherEnvironmentIndex], workspace.environments[environmentIndex]]
await workspace.save()
return res.status(200).send({
message: "Successfully reordered environments",
workspace: workspaceId,
});
};
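The endpoint above swaps the two environment entries with a destructuring assignment. A standalone sketch of that technique (the `swapByIndex` helper is ours, not part of the codebase):

```typescript
// Swap two elements of an array in place using destructuring assignment.
// Throws if either index is out of bounds, mirroring the -1 checks above.
function swapByIndex<T>(items: T[], i: number, j: number): void {
  if (i < 0 || j < 0 || i >= items.length || j >= items.length) {
    throw new Error("environment or otherEnvironment couldn't be found");
  }
  [items[i], items[j]] = [items[j], items[i]];
}

const envs = [
  { name: "Development", slug: "dev" },
  { name: "Production", slug: "prod" }
];
swapByIndex(envs, 0, 1);
// order is now: prod, dev
```

Note that despite the "atomic update" comment, `findById` followed by `save()` is not atomic; two concurrent reorders can still race. Writing both array positions in a single `findOneAndUpdate` would avoid that.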
/**
* Rename workspace environment with new name and slug of a workspace with [workspaceId]
* Old slug [oldEnvironmentSlug] must be provided
@ -124,7 +161,7 @@ export const renameWorkspaceEnvironment = async (
if (envIndex === -1) {
throw new Error("Invalid environment given");
}
const oldEnvironment = workspace.environments[envIndex];
workspace.environments[envIndex].name = environmentName;
@ -159,7 +196,7 @@ export const renameWorkspaceEnvironment = async (
{ $set: { "deniedPermissions.$[element].environmentSlug": environmentSlug } },
{ arrayFilters: [{ "element.environmentSlug": oldEnvironmentSlug }] }
);
await EEAuditLogService.createAuditLog(
req.authData,
{
@ -210,7 +247,7 @@ export const deleteWorkspaceEnvironment = async (
if (envIndex === -1) {
throw new Error("Invalid environment given");
}
const oldEnvironment = workspace.environments[envIndex];
workspace.environments.splice(envIndex, 1);

View File

@ -1,6 +1,5 @@
import { Request, Response } from "express";
import mongoose, { Types } from "mongoose";
import Secret, { ISecret } from "../../models/secret";
import {
CreateSecretRequestBody,
ModifySecretRequestBody,
@ -20,7 +19,7 @@ import {
SECRET_SHARED
} from "../../variables";
import { TelemetryService } from "../../services";
import { User } from "../../models";
import { ISecret, Secret, User } from "../../models";
import { AccountNotFoundError } from "../../utils/errors";
/**

View File

@ -1,6 +1,6 @@
import { Types } from "mongoose";
import { Request, Response } from "express";
import { ISecret, Secret, ServiceTokenData } from "../../models";
import { Folder, ISecret, Secret, ServiceTokenData, Tag } from "../../models";
import { AuditLog, EventType, IAction, SecretVersion } from "../../ee/models";
import {
ACTION_ADD_SECRETS,
@ -9,6 +9,7 @@ import {
ACTION_UPDATE_SECRETS,
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_UTF8,
K8_USER_AGENT_NAME,
SECRET_PERSONAL
} from "../../variables";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
@ -23,10 +24,8 @@ import {
userHasWorkspaceAccess,
userHasWriteOnlyAbility
} from "../../ee/helpers/checkMembershipPermissions";
import Tag from "../../models/tag";
import _ from "lodash";
import { BatchSecret, BatchSecretRequest } from "../../types/secret";
import Folder from "../../models/folder";
import {
getFolderByPath,
getFolderIdFromServiceToken,
@ -59,7 +58,7 @@ export const batchSecrets = async (req: Request, res: Response) => {
let secretPath = req.body.secretPath as string;
let folderId = req.body.folderId as string;
const createSecrets: BatchSecret[] = [];
const updateSecrets: BatchSecret[] = [];
const deleteSecrets: { _id: Types.ObjectId, secretName: string; }[] = [];
@ -154,7 +153,7 @@ export const batchSecrets = async (req: Request, res: Response) => {
};
})
});
const auditLogs = await Promise.all(
createdSecrets.map((secret, index) => {
return EEAuditLogService.createAuditLog(
@ -178,7 +177,7 @@ export const batchSecrets = async (req: Request, res: Response) => {
);
await AuditLog.insertMany(auditLogs);
const addAction = (await EELogService.createAction({
name: ACTION_ADD_SECRETS,
userId: req.user?._id,
@ -234,6 +233,9 @@ export const batchSecrets = async (req: Request, res: Response) => {
$inc: {
version: 1
},
$unset: {
"metadata.source": true as const
},
...u,
_id: new Types.ObjectId(u._id)
}
@ -277,7 +279,7 @@ export const batchSecrets = async (req: Request, res: Response) => {
$in: updateSecrets.map((u) => new Types.ObjectId(u._id))
}
});
const auditLogs = await Promise.all(
updateSecrets.map((secret) => {
return EEAuditLogService.createAuditLog(
@ -329,26 +331,26 @@ export const batchSecrets = async (req: Request, res: Response) => {
// handle delete secrets
if (deleteSecrets.length > 0) {
const deleteSecretIds: Types.ObjectId[] = deleteSecrets.map((s) => s._id);
const deletedSecretsObj = (await Secret.find({
_id: {
$in: deleteSecretIds
}
}))
.reduce(
(obj: any, secret: ISecret) => ({
...obj,
[secret._id.toString()]: secret
}),
{}
);
.reduce(
(obj: any, secret: ISecret) => ({
...obj,
[secret._id.toString()]: secret
}),
{}
);
await Secret.deleteMany({
_id: {
$in: deleteSecretIds
}
});
await EESecretService.markDeletedSecretVersions({
secretIds: deleteSecretIds
});
@ -949,14 +951,14 @@ export const getSecrets = async (req: Request, res: Response) => {
channel,
ipAddress: req.realIP
}));
await EEAuditLogService.createAuditLog(
req.authData,
{
type: EventType.GET_SECRETS,
metadata: {
environment,
secretPath: secretPath as string,
secretPath: (secretPath as string) ?? "/",
numberOfSecrets: secrets.length
}
},
@ -966,21 +968,36 @@ export const getSecrets = async (req: Request, res: Response) => {
);
const postHogClient = await TelemetryService.getPostHogClient();
// reduce the number of events captured
let shouldRecordK8Event = false
if (req.authData.userAgent == K8_USER_AGENT_NAME) {
const randomNumber = Math.random();
if (randomNumber > 0.9) {
shouldRecordK8Event = true
}
}
if (postHogClient) {
postHogClient.capture({
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({
authData: req.authData
}),
properties: {
numberOfSecrets: secrets.length,
environment,
workspaceId,
channel,
folderId,
userAgent: req.headers?.["user-agent"]
}
});
const shouldCapture = req.authData.userAgent !== K8_USER_AGENT_NAME || shouldRecordK8Event;
const approximateForNoneCapturedEvents = secrets.length * 10
if (shouldCapture) {
postHogClient.capture({
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({
authData: req.authData
}),
properties: {
numberOfSecrets: shouldRecordK8Event ? approximateForNoneCapturedEvents : secrets.length,
environment,
workspaceId,
folderId,
channel: req.authData.userAgentType,
userAgent: req.authData.userAgent
}
});
}
}
return res.status(200).send({
@ -1087,10 +1104,10 @@ export const updateSecrets = async (req: Request, res: Response) => {
tags,
...(secretCommentCiphertext !== undefined && secretCommentIV && secretCommentTag
? {
secretCommentCiphertext,
secretCommentIV,
secretCommentTag
}
secretCommentCiphertext,
secretCommentIV,
secretCommentTag
}
: {})
}
}
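The sampling logic above records roughly 1 in 10 "secrets pulled" events from the Kubernetes operator and scales the reported secret count by 10 to approximate the uncaptured traffic. A deterministic sketch of that estimator (the function names are ours, and the `K8_USER_AGENT_NAME` value is an assumption; the real constant lives in `../../variables`):

```typescript
const K8_USER_AGENT_NAME = "k8-operator"; // assumed value for illustration

// Decide whether to capture an event, given a uniform random draw in [0, 1).
function shouldCapture(userAgent: string, draw: number): boolean {
  if (userAgent !== K8_USER_AGENT_NAME) return true; // non-K8s agents are always captured
  return draw > 0.9; // capture ~10% of K8s operator pulls
}

// Weight the count so captured events approximate the total volume.
function reportedCount(userAgent: string, actualCount: number): number {
  return userAgent === K8_USER_AGENT_NAME ? actualCount * 10 : actualCount;
}

console.log(shouldCapture("cli", 0.5));          // → true
console.log(shouldCapture("k8-operator", 0.95)); // → true
console.log(shouldCapture("k8-operator", 0.5));  // → false
console.log(reportedCount("k8-operator", 7));    // → 70
```

Multiplying by 10 is the inverse of the 10% sampling probability, so the captured events remain an unbiased estimate of total pull volume.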

View File

@ -1,15 +1,15 @@
import { Request, Response } from "express";
import { Types } from "mongoose";
import { Membership, Secret } from "../../models";
import Tag from "../../models/tag";
import { Membership, Secret, Tag } from "../../models";
import { BadRequestError, UnauthorizedRequestError } from "../../utils/errors";
export const createWorkspaceTag = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const { name, slug } = req.body;
const { name, slug, tagColor } = req.body;
const tagToCreate = {
name,
tagColor,
workspace: new Types.ObjectId(workspaceId),
slug,
user: new Types.ObjectId(req.user._id),

View File

@ -6,10 +6,9 @@ import { BotService } from "../../services";
import { containsGlobPatterns, repackageSecretToRaw } from "../../helpers/secrets";
import { encryptSymmetric128BitHexKeyUTF8 } from "../../utils/crypto";
import { getAllImportedSecrets } from "../../services/SecretImportService";
import Folder from "../../models/folder";
import { Folder, IServiceTokenData } from "../../models";
import { getFolderByPath } from "../../services/FolderService";
import { BadRequestError } from "../../utils/errors";
import { IServiceTokenData } from "../../models";
import { requireWorkspaceAuth } from "../../middleware";
import { ADMIN, MEMBER, PERMISSION_READ_SECRETS } from "../../variables";
@ -23,6 +22,7 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
let workspaceId = req.query.workspaceId as string;
let environment = req.query.environment as string;
let secretPath = req.query.secretPath as string;
const folderId = req.query.folderId as string | undefined;
const includeImports = req.query.include_imports as string;
// if the service token has single scope, it will get all secrets for that scope by default
@ -47,6 +47,7 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
const secrets = await SecretService.getSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
folderId,
secretPath,
authData: req.authData
});
@ -284,11 +285,13 @@ export const getSecrets = async (req: Request, res: Response) => {
const workspaceId = req.query.workspaceId as string;
const environment = req.query.environment as string;
const secretPath = req.query.secretPath as string;
const folderId = req.query.folderId as string | undefined;
const includeImports = req.query.include_imports as string;
const secrets = await SecretService.getSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
folderId,
secretPath,
authData: req.authData
});

File diff suppressed because it is too large

View File

@ -1,6 +1,6 @@
import { Request, Response } from "express";
import { PipelineStage, Types } from "mongoose";
import { Membership, Secret, ServiceTokenData, User } from "../../../models";
import { Folder, Membership, Secret, ServiceTokenData, TFolderSchema, User } from "../../../models";
import {
ActorType,
AuditLog,
@ -18,7 +18,7 @@ import {
} from "../../models";
import { EESecretService } from "../../services";
import { getLatestSecretVersionIds } from "../../helpers/secretVersion";
import Folder, { TFolderSchema } from "../../../models/folder";
// import Folder, { TFolderSchema } from "../../../models/folder";
import { searchByFolderId } from "../../../services/FolderService";
import { EEAuditLogService, EELicenseService } from "../../services";
import { extractIPDetails, isValidIpOrCidr } from "../../../utils/ip";

View File

@ -117,7 +117,7 @@ const secretVersionSchema = new Schema<ISecretVersion>(
ref: "Tag",
type: [Schema.Types.ObjectId],
default: [],
},
}
},
{
timestamps: true,

View File

@ -1,58 +1,29 @@
import { Probot } from "probot";
import { exec } from "child_process";
import { mkdir, readFile, rm, writeFile } from "fs";
import { tmpdir } from "os";
import { join } from "path"
import GitRisks from "../../models/gitRisks";
import GitAppOrganizationInstallation from "../../models/gitAppOrganizationInstallation";
import MembershipOrg from "../../../models/membershipOrg";
import { ADMIN, OWNER } from "../../../variables";
import User from "../../../models/user";
import { sendMail } from "../../../helpers";
import TelemetryService from "../../../services/TelemetryService";
type SecretMatch = {
Description: string;
StartLine: number;
EndLine: number;
StartColumn: number;
EndColumn: number;
Match: string;
Secret: string;
File: string;
SymlinkFile: string;
Commit: string;
Entropy: number;
Author: string;
Email: string;
Date: string;
Message: string;
Tags: string[];
RuleID: string;
Fingerprint: string;
FingerPrintWithoutCommitId: string
};
import { scanGithubPushEventForSecretLeaks } from "../../../queues/secret-scanning/githubScanPushEvent";
export default async (app: Probot) => {
app.on("installation.deleted", async (context) => {
const { payload } = context;
const { installation, repositories } = payload;
if (installation.repository_selection == "all") {
await GitRisks.deleteMany({ installationId: installation.id })
await GitAppOrganizationInstallation.deleteOne({ installationId: installation.id })
} else {
if (repositories) {
for (const repository of repositories) {
await GitRisks.deleteMany({ repositoryId: repository.id })
}
if (repositories) {
for (const repository of repositories) {
await GitRisks.deleteMany({ repositoryId: repository.id })
}
await GitAppOrganizationInstallation.deleteOne({ installationId: installation.id })
}
})
app.on("installation", async (context) => {
const { payload } = context;
payload.repositories
const { installation, repositories } = payload;
// TODO: start full repo scans
})
app.on("push", async (context) => {
const { payload } = context;
const { commits, repository, installation, pusher } = payload;
const [owner, repo] = repository.full_name.split("/");
if (!commits || !repository || !installation || !pusher) {
return
@ -63,188 +34,12 @@ export default async (app: Probot) => {
return
}
const allFindingsByFingerprint: { [key: string]: SecretMatch; } = {}
for (const commit of commits) {
for (const filepath of [...commit.added, ...commit.modified]) {
try {
const fileContentsResponse = await context.octokit.repos.getContent({
owner,
repo,
path: filepath,
});
const data: any = fileContentsResponse.data;
const fileContent = Buffer.from(data.content, "base64").toString();
const findings = await scanContentAndGetFindings(`\n${fileContent}`) // extra line to count lines correctly
for (const finding of findings) {
const fingerPrintWithCommitId = `${commit.id}:${filepath}:${finding.RuleID}:${finding.StartLine}`
const fingerPrintWithoutCommitId = `${filepath}:${finding.RuleID}:${finding.StartLine}`
finding.Fingerprint = fingerPrintWithCommitId
finding.FingerPrintWithoutCommitId = fingerPrintWithoutCommitId
finding.Commit = commit.id
finding.File = filepath
finding.Author = commit.author.name
finding.Email = commit?.author?.email ? commit?.author?.email : ""
allFindingsByFingerprint[fingerPrintWithCommitId] = finding
}
} catch (error) {
console.error(`Error fetching content for ${filepath}`, error); // eslint-disable-line
}
}
}
// change to update
for (const key in allFindingsByFingerprint) {
const risk = await GitRisks.findOneAndUpdate({ fingerprint: allFindingsByFingerprint[key].Fingerprint },
{
...convertKeysToLowercase(allFindingsByFingerprint[key]),
installationId: installation.id,
organization: installationLinkToOrgExists.organizationId,
repositoryFullName: repository.full_name,
repositoryId: repository.id
}, {
upsert: true
}).lean()
}
// get emails of admins
const adminsOfWork = await MembershipOrg.find({
organization: installationLinkToOrgExists.organizationId,
$or: [
{ role: OWNER },
{ role: ADMIN }
]
}).lean()
const userEmails = await User.find({
_id: {
$in: [adminsOfWork.map(orgMembership => orgMembership.user)]
}
}).select("email").lean()
const adminOrOwnerEmails = userEmails.map(userObject => userObject.email)
const usersToNotify = pusher?.email ? [pusher.email, ...adminOrOwnerEmails] : [...adminOrOwnerEmails]
if (Object.keys(allFindingsByFingerprint).length) {
await sendMail({
template: "secretLeakIncident.handlebars",
subjectLine: `Incident alert: leaked secrets found in Github repository ${repository.full_name}`,
recipients: usersToNotify,
substitutions: {
numberOfSecrets: Object.keys(allFindingsByFingerprint).length,
pusher_email: pusher.email,
pusher_name: pusher.name
}
});
}
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient) {
postHogClient.capture({
event: "cloud secret scan",
distinctId: pusher.email,
properties: {
numberOfCommitsScanned: commits.length,
numberOfRisksFound: Object.keys(allFindingsByFingerprint).length,
}
});
}
scanGithubPushEventForSecretLeaks({
commits: commits,
pusher: { name: pusher.name, email: pusher.email },
repository: { fullName: repository.full_name, id: repository.id },
organizationId: installationLinkToOrgExists.organizationId,
installationId: installation.id
})
});
};
async function scanContentAndGetFindings(textContent: string): Promise<SecretMatch[]> {
const tempFolder = await createTempFolder();
const filePath = join(tempFolder, "content.txt");
const findingsPath = join(tempFolder, "findings.json");
try {
await writeTextToFile(filePath, textContent);
await runInfisicalScan(filePath, findingsPath);
const findingsData = await readFindingsFile(findingsPath);
return JSON.parse(findingsData);
} finally {
await deleteTempFolder(tempFolder);
}
}
function createTempFolder(): Promise<string> {
return new Promise((resolve, reject) => {
const tempDir = tmpdir()
const tempFolderName = Math.random().toString(36).substring(2);
const tempFolderPath = join(tempDir, tempFolderName);
mkdir(tempFolderPath, (err: any) => {
if (err) {
reject(err);
} else {
resolve(tempFolderPath);
}
});
});
}
function writeTextToFile(filePath: string, content: string): Promise<void> {
return new Promise((resolve, reject) => {
writeFile(filePath, content, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
function runInfisicalScan(inputPath: string, outputPath: string): Promise<void> {
return new Promise((resolve, reject) => {
const command = `cat "${inputPath}" | infisical scan --exit-code=77 --pipe -r "${outputPath}"`;
exec(command, (error) => {
if (error && error.code != 77) {
reject(error);
} else {
resolve();
}
});
});
}
function readFindingsFile(filePath: string): Promise<string> {
return new Promise((resolve, reject) => {
readFile(filePath, "utf8", (err, data) => {
if (err) {
reject(err);
} else {
resolve(data);
}
});
});
}
function deleteTempFolder(folderPath: string): Promise<void> {
return new Promise((resolve, reject) => {
rm(folderPath, { recursive: true }, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
function convertKeysToLowercase<T>(obj: T): T {
const convertedObj = {} as T;
for (const key in obj) {
if (Object.prototype.hasOwnProperty.call(obj, key)) {
const lowercaseKey = key.charAt(0).toLowerCase() + key.slice(1);
convertedObj[lowercaseKey as keyof T] = obj[key];
}
}
return convertedObj;
}
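The callback-to-Promise wrappers above (`createTempFolder`, `writeTextToFile`, `readFindingsFile`, `deleteTempFolder`) can be collapsed onto Node's promise-based `fs` API. A behavior-equivalent sketch, not the code this repo ships:

```typescript
import { mkdtemp, readFile, rm, writeFile } from "fs/promises";
import { tmpdir } from "os";
import { join } from "path";

// mkdtemp also avoids the collision risk of Math.random()-derived folder names.
async function withTempFolder<T>(fn: (dir: string) => Promise<T>): Promise<T> {
  const dir = await mkdtemp(join(tmpdir(), "scan-"));
  try {
    return await fn(dir);
  } finally {
    await rm(dir, { recursive: true, force: true }); // cleanup even if fn throws
  }
}

async function demo(): Promise<string> {
  return withTempFolder(async (dir) => {
    const file = join(dir, "content.txt");
    await writeFile(file, "hello");
    return readFile(file, "utf8");
  });
}

demo().then((text) => console.log(text)); // → hello
```

The `try`/`finally` plays the same role as the `finally` in `scanContentAndGetFindings` above: the temp folder is deleted whether or not the scan succeeds.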

View File

@ -0,0 +1,125 @@
import { exec } from "child_process";
import { mkdir, readFile, rm, writeFile } from "fs";
import { tmpdir } from "os";
import { join } from "path"
import { SecretMatch } from "./types";
import { Octokit } from "@octokit/rest";
export async function scanContentAndGetFindings(textContent: string): Promise<SecretMatch[]> {
const tempFolder = await createTempFolder();
const filePath = join(tempFolder, "content.txt");
const findingsPath = join(tempFolder, "findings.json");
try {
await writeTextToFile(filePath, textContent);
await runInfisicalScan(filePath, findingsPath);
const findingsData = await readFindingsFile(findingsPath);
return JSON.parse(findingsData);
} finally {
await deleteTempFolder(tempFolder);
}
}
export function createTempFolder(): Promise<string> {
return new Promise((resolve, reject) => {
const tempDir = tmpdir()
const tempFolderName = Math.random().toString(36).substring(2);
const tempFolderPath = join(tempDir, tempFolderName);
mkdir(tempFolderPath, (err: any) => {
if (err) {
reject(err);
} else {
resolve(tempFolderPath);
}
});
});
}
export function writeTextToFile(filePath: string, content: string): Promise<void> {
return new Promise((resolve, reject) => {
writeFile(filePath, content, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
export function runInfisicalScan(inputPath: string, outputPath: string): Promise<void> {
return new Promise((resolve, reject) => {
const command = `cat "${inputPath}" | infisical scan --exit-code=77 --pipe -r "${outputPath}"`;
exec(command, (error) => {
if (error && error.code != 77) {
reject(error);
} else {
resolve();
}
});
});
}
export function readFindingsFile(filePath: string): Promise<string> {
return new Promise((resolve, reject) => {
readFile(filePath, "utf8", (err, data) => {
if (err) {
reject(err);
} else {
resolve(data);
}
});
});
}
export function deleteTempFolder(folderPath: string): Promise<void> {
return new Promise((resolve, reject) => {
rm(folderPath, { recursive: true }, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
export function convertKeysToLowercase<T>(obj: T): T {
const convertedObj = {} as T;
for (const key in obj) {
if (Object.prototype.hasOwnProperty.call(obj, key)) {
const lowercaseKey = key.charAt(0).toLowerCase() + key.slice(1);
convertedObj[lowercaseKey as keyof T] = obj[key];
}
}
return convertedObj;
}
export async function getCommits(octokit: Octokit, owner: string, repo: string) {
let commits: { sha: string }[] = [];
let page = 1;
while (true) {
const response = await octokit.repos.listCommits({
owner,
repo,
per_page: 100,
page,
});
commits = commits.concat(response.data);
if (response.data.length == 0) break;
page++;
}
return commits;
}
export async function getFilesFromCommit(octokit: any, owner: string, repo: string, sha: string) {
const response = await octokit.repos.getCommit({
owner,
repo,
ref: sha,
});
return response.data.files;
}
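`getCommits` above walks the commit list manually, 100 per page, until an empty page comes back. The same loop can be expressed generically over any page-returning fetcher; this sketch is independent of Octokit (the names are ours):

```typescript
// Collect every item from a paged API: call fetchPage(page) until a short page.
async function collectAllPages<T>(
  fetchPage: (page: number) => Promise<T[]>,
  perPage = 100
): Promise<T[]> {
  const all: T[] = [];
  for (let page = 1; ; page++) {
    const batch = await fetchPage(page);
    all.push(...batch);
    if (batch.length < perPage) break; // a short page means we've reached the end
  }
  return all;
}

// Stubbed fetcher: 250 fake commits served 100 per page.
const shas = Array.from({ length: 250 }, (_, i) => ({ sha: `c${i}` }));
const fakePage = async (page: number) => shas.slice((page - 1) * 100, page * 100);

collectAllPages(fakePage).then((commits) => console.log(commits.length)); // → 250
```

Unlike the loop above, which stops only on the first empty page, stopping on the first short page usually saves one request; `octokit.paginate` provides the same behavior out of the box.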

View File

@ -0,0 +1,21 @@
export type SecretMatch = {
Description: string;
StartLine: number;
EndLine: number;
StartColumn: number;
EndColumn: number;
Match: string;
Secret: string;
File: string;
SymlinkFile: string;
Commit: string;
Entropy: number;
Author: string;
Email: string;
Date: string;
Message: string;
Tags: string[];
RuleID: string;
Fingerprint: string;
FingerPrintWithoutCommitId: string
};

View File

@ -14,7 +14,7 @@ import {
} from "../variables";
import { client, getEncryptionKey, getRootEncryptionKey } from "../config";
import { InternalServerError } from "../utils/errors";
import Folder from "../models/folder";
import { Folder } from "../models";
import { getFolderByPath } from "../services/FolderService";
import { getAllImportedSecrets } from "../services/SecretImportService";
import { expandSecrets } from "./secrets";

View File

@ -32,7 +32,7 @@ export const handleEventHelper = async ({ event }: { event: Event }) => {
switch (event.name) {
case EVENT_PUSH_SECRETS:
if (bot) {
await IntegrationService.syncIntegrations({
IntegrationService.syncIntegrations({
workspaceId,
environment
});
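The change above drops the `await`, making integration syncing fire-and-forget: the push-secrets event handler returns immediately instead of blocking on slow third-party APIs (the work is handled by the `syncSecretsToThirdPartyServices` queue). A minimal sketch of the pattern, with hypothetical names:

```typescript
// A slow background task standing in for IntegrationService.syncIntegrations.
const synced: string[] = [];
async function syncIntegrations(workspaceId: string): Promise<void> {
  await new Promise((r) => setTimeout(r, 10)); // simulate a slow third-party call
  synced.push(workspaceId);
}

async function handlePushEvent(workspaceId: string): Promise<string> {
  // Fire-and-forget: no await, so the handler isn't blocked by slow integrations.
  // Attach a catch so a sync failure can't become an unhandled rejection.
  syncIntegrations(workspaceId).catch((err) => console.error("sync failed", err));
  return "event handled";
}

handlePushEvent("ws-1").then((msg) => {
  console.log(msg);           // → event handled
  console.log(synced.length); // → 0 (sync still in flight)
});
```

The cost of the pattern is that errors and completion are invisible to the caller, which is why a dedicated queue with retries is the more robust follow-up.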

View File

@ -1,6 +1,6 @@
import { Types } from "mongoose";
import { Bot, Integration, IntegrationAuth } from "../models";
import { exchangeCode, exchangeRefresh, syncSecrets } from "../integrations";
import { Bot, IntegrationAuth } from "../models";
import { exchangeCode, exchangeRefresh } from "../integrations";
import { BotService } from "../services";
import {
ALGORITHM_AES_256_GCM,
@ -9,7 +9,6 @@ import {
INTEGRATION_VERCEL
} from "../variables";
import { UnauthorizedRequestError } from "../utils/errors";
import * as Sentry from "@sentry/node";
interface Update {
workspace: string;
@ -102,69 +101,6 @@ export const handleOAuthExchangeHelper = async ({
return integrationAuth;
};
/**
* Sync/push environment variables in workspace with id [workspaceId] to
* all active integrations for that workspace
* @param {Object} obj
* @param {Object} obj.workspaceId - id of workspace
*/
export const syncIntegrationsHelper = async ({
workspaceId,
environment
}: {
workspaceId: Types.ObjectId;
environment?: string;
}) => {
try {
const integrations = await Integration.find({
workspace: workspaceId,
...(environment
? {
environment
}
: {}),
isActive: true,
app: { $ne: null }
});
// for each workspace integration, sync/push secrets
// to that integration
for await (const integration of integrations) {
// get workspace, environment (shared) secrets
const secrets = await BotService.getSecrets({
workspaceId: integration.workspace,
environment: integration.environment,
secretPath: integration.secretPath
});
const integrationAuth = await IntegrationAuth.findById(integration.integrationAuth);
if (!integrationAuth) throw new Error("Failed to find integration auth");
// get integration auth access token
const access = await getIntegrationAuthAccessHelper({
integrationAuthId: integration.integrationAuth
});
// sync secrets to integration
await syncSecrets({
integration,
integrationAuth,
secrets,
accessId: access.accessId === undefined ? null : access.accessId,
accessToken: access.accessToken
});
}
} catch (err) {
Sentry.captureException(err);
// eslint-disable-next-line
console.log(
`syncIntegrationsHelper: failed with [workspaceId=${workspaceId}] [environment=${environment}]`,
err
); // eslint-disable-line no-use-before-define
throw err;
}
};
/**
* Return decrypted refresh token using the bot's copy

View File

@ -7,11 +7,13 @@ import {
UpdateSecretParams
} from "../interfaces/services/SecretService";
import {
Folder,
ISecret,
IServiceTokenData,
Secret,
SecretBlindIndexData,
ServiceTokenData,
TFolderRootSchema
} from "../models";
import { EventType, SecretVersion } from "../ee/models";
import {
@ -29,6 +31,7 @@ import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_BASE64,
ENCODING_SCHEME_UTF8,
K8_USER_AGENT_NAME,
SECRET_PERSONAL,
SECRET_SHARED
} from "../variables";
@ -45,7 +48,6 @@ import { getAuthDataPayloadIdObj, getAuthDataPayloadUserObj } from "../utils/aut
import { getFolderByPath, getFolderIdFromServiceToken } from "../services/FolderService";
import picomatch from "picomatch";
import path from "path";
import Folder, { TFolderRootSchema } from "../models/folder";
export const isValidScope = (
authPayload: IServiceTokenData,
@ -393,7 +395,8 @@ export const createSecretHelper = async ({
secretCommentTag,
folder: folderId,
algorithm: ALGORITHM_AES_256_GCM,
keyEncoding: ENCODING_SCHEME_UTF8
keyEncoding: ENCODING_SCHEME_UTF8,
metadata
}).save();
const secretVersion = new SecretVersion({
@ -463,8 +466,8 @@ export const createSecretHelper = async ({
});
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient && (metadata?.source !== "signup")) {
if (postHogClient && metadata?.source !== "signup") {
postHogClient.capture({
event: "secrets added",
distinctId: await TelemetryService.getDistinctId({
@ -496,6 +499,7 @@ export const getSecretsHelper = async ({
workspaceId,
environment,
authData,
folderId,
secretPath = "/"
}: GetSecretsParams) => {
let secrets: ISecret[] = [];
@ -505,7 +509,10 @@ export const getSecretsHelper = async ({
throw UnauthorizedRequestError({ message: "Folder Permission Denied" });
}
}
const folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
if (!folderId) {
folderId = await getFolderIdFromServiceToken(workspaceId, environment, secretPath);
}
// get personal secrets first
secrets = await Secret.find({
@ -549,7 +556,7 @@ export const getSecretsHelper = async ({
channel: authData.userAgentType,
ipAddress: authData.ipAddress
}));
await EEAuditLogService.createAuditLog(
authData,
{
@ -567,21 +574,33 @@ export const getSecretsHelper = async ({
const postHogClient = await TelemetryService.getPostHogClient();
// reduce the number of events captured
let shouldRecordK8Event = false
if (authData.userAgent == K8_USER_AGENT_NAME) {
const randomNumber = Math.random();
if (randomNumber > 0.9) {
shouldRecordK8Event = true
}
}
if (postHogClient) {
postHogClient.capture({
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({
authData
}),
properties: {
numberOfSecrets: secrets.length,
environment,
workspaceId,
folderId,
channel: authData.userAgentType,
userAgent: authData.userAgent
}
});
const shouldCapture = authData.userAgent !== K8_USER_AGENT_NAME || shouldRecordK8Event;
const approximateForNoneCapturedEvents = secrets.length * 10
if (shouldCapture) {
postHogClient.capture({
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({ authData }),
properties: {
numberOfSecrets: shouldRecordK8Event ? approximateForNoneCapturedEvents : secrets.length,
environment,
workspaceId,
folderId,
channel: authData.userAgentType,
userAgent: authData.userAgent
}
});
}
}
return secrets;
@ -659,7 +678,7 @@ export const getSecretHelper = async ({
ipAddress: authData.ipAddress
}));
await EEAuditLogService.createAuditLog(
await EEAuditLogService.createAuditLog(
authData,
{
type: EventType.GET_SECRET,
@ -680,7 +699,7 @@ export const getSecretHelper = async ({
if (postHogClient) {
postHogClient.capture({
event: "secrets pull",
event: "secrets pulled",
distinctId: await TelemetryService.getDistinctId({
authData
}),
@ -824,8 +843,8 @@ export const updateSecretHelper = async ({
channel: authData.userAgentType,
ipAddress: authData.ipAddress
}));
await EEAuditLogService.createAuditLog(
await EEAuditLogService.createAuditLog(
authData,
{
type: EventType.UPDATE_SECRET,
@ -908,14 +927,14 @@ export const deleteSecretHelper = async ({
if (type === SECRET_SHARED) {
secrets = await Secret.find({
secretBlindIndex,
workspaceId: new Types.ObjectId(workspaceId),
workspace: new Types.ObjectId(workspaceId),
environment,
folder: folderId
}).lean();
secret = await Secret.findOneAndDelete({
secretBlindIndex,
workspaceId: new Types.ObjectId(workspaceId),
workspace: new Types.ObjectId(workspaceId),
environment,
type,
folder: folderId
@ -931,7 +950,7 @@ export const deleteSecretHelper = async ({
secret = await Secret.findOneAndDelete({
secretBlindIndex,
folder: folderId,
workspaceId: new Types.ObjectId(workspaceId),
workspace: new Types.ObjectId(workspaceId),
environment,
type,
...getAuthDataPayloadUserObj(authData)
@ -1088,7 +1107,8 @@ const recursivelyExpandSecret = async (
let interpolatedValue = interpolatedSec[key];
if (!interpolatedValue) {
throw new Error(`Couldn't find referenced value - ${key}`);
console.error(`Couldn't find referenced value - ${key}`);
return "";
}
const refs = interpolatedValue.match(INTERPOLATION_SYNTAX_REG);
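The hunk above turns a hard failure (`throw`) into a soft one (log and expand to an empty string) when a referenced secret is missing. A minimal single-pass sketch of that `${KEY}`-style expansion policy (the regex and names here are illustrative; the actual `INTERPOLATION_SYNTAX_REG` may differ):

```typescript
const INTERPOLATION_RE = /\$\{([^}]+)\}/g;

// Expand ${KEY} references against a lookup table; missing keys become "".
function expandReferences(value: string, table: Record<string, string>): string {
  return value.replace(INTERPOLATION_RE, (_match, key: string) => {
    const resolved = table[key];
    if (resolved === undefined) {
      console.error(`Couldn't find referenced value - ${key}`);
      return ""; // soft-fail instead of throwing, as in the change above
    }
    return resolved;
  });
}

console.log(expandReferences("db://${HOST}:${PORT}", { HOST: "localhost", PORT: "5432" }));
// → db://localhost:5432
console.log(expandReferences("key=${MISSING}", {})); // → key=
```

The real helper recurses so that an expanded value can itself contain further references; this sketch performs only one pass.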

View File

@ -6,7 +6,7 @@ require("express-async-errors");
import helmet from "helmet";
import cors from "cors";
import { DatabaseService } from "./services";
import { EELicenseService, GithubSecretScanningService} from "./ee/services";
import { EELicenseService, GithubSecretScanningService } from "./ee/services";
import { setUpHealthEndpoint } from "./services/health";
import cookieParser from "cookie-parser";
import swaggerUi = require("swagger-ui-express");
@ -24,7 +24,7 @@ import {
secretSnapshot as eeSecretSnapshotRouter,
users as eeUsersRouter,
workspace as eeWorkspaceRouter,
secretScanning as v1SecretScanningRouter,
secretScanning as v1SecretScanningRouter
} from "./ee/routes/v1";
import {
auth as v1AuthRouter,
@ -58,7 +58,7 @@ import {
signup as v2SignupRouter,
tags as v2TagsRouter,
users as v2UsersRouter,
workspace as v2WorkspaceRouter,
workspace as v2WorkspaceRouter
} from "./routes/v2";
import {
auth as v3AuthRouter,
@ -70,12 +70,21 @@ import { healthCheck } from "./routes/status";
import { getLogger } from "./utils/logger";
import { RouteNotFoundError } from "./utils/errors";
import { requestErrorHandler } from "./middleware/requestErrorHandler";
import { getNodeEnv, getPort, getSecretScanningGitAppId, getSecretScanningPrivateKey, getSecretScanningWebhookProxy, getSecretScanningWebhookSecret, getSiteURL } from "./config";
import {
getNodeEnv,
getPort,
getSecretScanningGitAppId,
getSecretScanningPrivateKey,
getSecretScanningWebhookProxy,
getSecretScanningWebhookSecret,
getSiteURL
} from "./config";
import { setup } from "./utils/setup";
const SmeeClient = require('smee-client') // eslint-disable-line
import { syncSecretsToThirdPartyServices } from "./queues/integrations/syncSecretsToThirdPartyServices";
import { githubPushEventSecretScan } from "./queues/secret-scanning/githubScanPushEvent";
const SmeeClient = require("smee-client"); // eslint-disable-line
const main = async () => {
await setup();
await EELicenseService.initGlobalFeatureSet();
@ -92,11 +101,15 @@ const main = async () => {
})
);
if (await getSecretScanningGitAppId() && await getSecretScanningWebhookSecret() && await getSecretScanningPrivateKey()) {
if (
(await getSecretScanningGitAppId()) &&
(await getSecretScanningWebhookSecret()) &&
(await getSecretScanningPrivateKey())
) {
const probot = new Probot({
appId: await getSecretScanningGitAppId(),
privateKey: await getSecretScanningPrivateKey(),
secret: await getSecretScanningWebhookSecret(),
secret: await getSecretScanningWebhookSecret()
});
if ((await getNodeEnv()) != "production") {
@ -104,12 +117,14 @@ const main = async () => {
source: await getSecretScanningWebhookProxy(),
target: "http://backend:4000/ss-webhook",
logger: console
})
});
smee.start()
smee.start();
}
app.use(createNodeMiddleware(GithubSecretScanningService, { probot, webhooksPath: "/ss-webhook" })); // secret scanning webhook
app.use(
createNodeMiddleware(GithubSecretScanningService, { probot, webhooksPath: "/ss-webhook" })
); // secret scanning webhook
}
if ((await getNodeEnv()) === "production") {
@ -205,6 +220,8 @@ const main = async () => {
server.on("close", async () => {
await DatabaseService.closeDatabase();
syncSecretsToThirdPartyServices.close();
githubPushEventSecretScan.close();
});
return server;

View File

@ -18,6 +18,10 @@ import {
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_FLYIO,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_GCP_API_URL,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME,
INTEGRATION_GCP_SERVICE_USAGE_URL,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_GITLAB_API_URL,
@ -79,6 +83,11 @@ const getApps = async ({
}) => {
let apps: App[] = [];
switch (integrationAuth.integration) {
case INTEGRATION_GCP_SECRET_MANAGER:
apps = await getAppsGCPSecretManager({
accessToken,
});
break;
case INTEGRATION_AZURE_KEY_VAULT:
apps = [];
break;
@ -210,6 +219,96 @@ const getApps = async ({
return apps;
};
/**
* Return list of apps for GCP secret manager integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for GCP API
* @returns {Object[]} apps - list of GCP projects
* @returns {String} apps.name - name of GCP project
* @returns {String} apps.appId - id of GCP project
*/
const getAppsGCPSecretManager = async ({ accessToken }: { accessToken: string }) => {
interface GCPApp {
projectNumber: string;
projectId: string;
lifecycleState: "ACTIVE" | "LIFECYCLE_STATE_UNSPECIFIED" | "DELETE_REQUESTED" | "DELETE_IN_PROGRESS";
name: string;
createTime: string;
parent: {
type: "organization" | "folder" | "project";
id: string;
}
}
interface GCPGetProjectsRes {
projects: GCPApp[];
nextPageToken?: string;
}
interface GCPGetServiceRes {
name: string;
parent: string;
state: "ENABLED" | "DISABLED" | "STATE_UNSPECIFIED"
}
let gcpApps: GCPApp[] = [];
const apps: App[] = [];
const pageSize = 100;
let pageToken: string | undefined;
let hasMorePages = true;
while (hasMorePages) {
const params = new URLSearchParams({
pageSize: String(pageSize),
...(pageToken ? { pageToken } : {})
});
const res: GCPGetProjectsRes = (await standardRequest.get(`${INTEGRATION_GCP_API_URL}/v1/projects`, {
params,
headers: {
"Authorization": `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
})
)
.data;
gcpApps = gcpApps.concat(res.projects);
if (!res.nextPageToken) {
hasMorePages = false;
}
pageToken = res.nextPageToken;
}
for await (const gcpApp of gcpApps) {
try {
const res: GCPGetServiceRes = (await standardRequest.get(
`${INTEGRATION_GCP_SERVICE_USAGE_URL}/v1/projects/${gcpApp.projectId}/services/${INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME}`, {
headers: {
"Authorization": `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
)).data;
if (res.state === "ENABLED") {
apps.push({
name: gcpApp.name,
appId: gcpApp.projectId
});
}
} catch {
continue;
}
}
return apps;
};
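The project-listing loop above follows GCP's `pageToken` pagination contract: request a page, append the results, and repeat until no `nextPageToken` is returned. A generic sketch of that loop (the `fetchPage` callback is a hypothetical stand-in for the authenticated GCP request, not part of this PR):

```typescript
// Generic pageToken-style pagination, mirroring getAppsGCPSecretManager above.
type Page<T> = { items: T[]; nextPageToken?: string };

const collectAllPages = async <T>(
  fetchPage: (pageToken?: string) => Promise<Page<T>>
): Promise<T[]> => {
  let results: T[] = [];
  let pageToken: string | undefined;
  let hasMorePages = true;
  while (hasMorePages) {
    // fetch the next page, passing the token from the previous response (if any)
    const page = await fetchPage(pageToken);
    results = results.concat(page.items);
    if (!page.nextPageToken) hasMorePages = false;
    pageToken = page.nextPageToken;
  }
  return results;
};
```

The same shape covers both paginated calls in this PR (project listing and secret listing).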
/**
* Return list of apps for Heroku integration
* @param {Object} obj
@ -751,7 +850,7 @@ const getAppsTeamCity = async ({
},
})
).data.project.slice(1);
const apps = res.map((a: any) => {
return {
name: a.name,

View File

@ -4,6 +4,8 @@ import {
INTEGRATION_AZURE_TOKEN_URL,
INTEGRATION_BITBUCKET,
INTEGRATION_BITBUCKET_TOKEN_URL,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GCP_TOKEN_URL,
INTEGRATION_GITHUB,
INTEGRATION_GITHUB_TOKEN_URL,
INTEGRATION_GITLAB,
@ -13,17 +15,19 @@ import {
INTEGRATION_NETLIFY,
INTEGRATION_NETLIFY_TOKEN_URL,
INTEGRATION_VERCEL,
INTEGRATION_VERCEL_TOKEN_URL,
INTEGRATION_VERCEL_TOKEN_URL
} from "../variables";
import {
getClientIdAzure,
getClientIdBitBucket,
getClientIdGCPSecretManager,
getClientIdGitHub,
getClientIdGitLab,
getClientIdNetlify,
getClientIdVercel,
getClientSecretAzure,
getClientSecretBitBucket,
getClientSecretGCPSecretManager,
getClientSecretGitHub,
getClientSecretGitLab,
getClientSecretHeroku,
@ -113,6 +117,11 @@ const exchangeCode = async ({
let obj = {} as any;
switch (integration) {
case INTEGRATION_GCP_SECRET_MANAGER:
obj = await exchangeCodeGCP({
code,
});
break;
case INTEGRATION_AZURE_KEY_VAULT:
obj = await exchangeCodeAzure({
code,
@ -153,6 +162,40 @@ const exchangeCode = async ({
return obj;
};
/**
* Return [accessToken] for GCP OAuth2 code-token exchange
* @param {Object} obj
* @param {String} obj.code - code for code-token exchange
* @returns {Object} obj2
* @returns {String} obj2.accessToken - access token for GCP API
* @returns {String} obj2.refreshToken - refresh token for GCP API
* @returns {Date} obj2.accessExpiresAt - date of expiration for access token
*/
const exchangeCodeGCP = async ({ code }: { code: string }) => {
const accessExpiresAt = new Date();
// GCP's token response fields (access_token, refresh_token, expires_in) match the Azure response type reused here
const res: ExchangeCodeAzureResponse = (
await standardRequest.post(
INTEGRATION_GCP_TOKEN_URL,
new URLSearchParams({
grant_type: "authorization_code",
code: code,
client_id: await getClientIdGCPSecretManager(),
client_secret: await getClientSecretGCPSecretManager(),
redirect_uri: `${await getSiteURL()}/integrations/gcp-secret-manager/oauth2/callback`,
} as any)
)
).data;
accessExpiresAt.setSeconds(accessExpiresAt.getSeconds() + res.expires_in);
return {
accessToken: res.access_token,
refreshToken: res.refresh_token,
accessExpiresAt,
};
};
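The expiry bookkeeping above converts the OAuth2 `expires_in` value (seconds from now) into an absolute `Date`. As a standalone sketch of that conversion (function name is illustrative):

```typescript
// Convert an OAuth2 expires_in offset into an absolute expiry Date,
// matching the accessExpiresAt computation in exchangeCodeGCP above.
const expiryFromNow = (expiresInSeconds: number, now: Date = new Date()): Date => {
  const d = new Date(now.getTime());
  // Date.setSeconds handles overflow past 59, so this adds exactly the offset
  d.setSeconds(d.getSeconds() + expiresInSeconds);
  return d;
};
```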
/**
* Return [accessToken] for Azure OAuth2 code-token exchange
* @param param0

View File

@ -2,7 +2,6 @@ import { exchangeCode } from "./exchange";
import { exchangeRefresh } from "./refresh";
import { getApps } from "./apps";
import { getTeams } from "./teams";
import { syncSecrets } from "./sync";
import { revokeAccess } from "./revoke";
export {
@ -10,6 +9,5 @@ export {
exchangeRefresh,
getApps,
getTeams,
syncSecrets,
revokeAccess,
}

View File

@ -26,6 +26,8 @@ import {
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_FLYIO,
INTEGRATION_FLYIO_API_URL,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GCP_SECRET_MANAGER_URL,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_GITLAB_API_URL,
@ -92,6 +94,13 @@ const syncSecrets = async ({
accessToken: string;
}) => {
switch (integration.integration) {
case INTEGRATION_GCP_SECRET_MANAGER:
await syncSecretsGCPSecretManager({
integration,
secrets,
accessToken
});
break;
case INTEGRATION_AZURE_KEY_VAULT:
await syncSecretsAzureKeyVault({
integration,
@ -286,6 +295,165 @@ const syncSecrets = async ({
}
};
/**
* Sync/push [secrets] to GCP secret manager project
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for GCP secret manager
*/
const syncSecretsGCPSecretManager = async ({
integration,
secrets,
accessToken
}: {
integration: IIntegration;
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
}) => {
interface GCPSecret {
name: string;
createTime: string;
}
interface GCPSMListSecretsRes {
secrets?: GCPSecret[];
totalSize?: number;
nextPageToken?: string;
}
let gcpSecrets: GCPSecret[] = [];
const pageSize = 100;
let pageToken: string | undefined;
let hasMorePages = true;
while (hasMorePages) {
const params = new URLSearchParams({
pageSize: String(pageSize),
...(pageToken ? { pageToken } : {})
});
const res: GCPSMListSecretsRes = (await standardRequest.get(
`${INTEGRATION_GCP_SECRET_MANAGER_URL}/v1beta1/projects/${integration.appId}/secrets`,
{
params,
headers: {
"Authorization": `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
)).data;
if (res.secrets) {
gcpSecrets = gcpSecrets.concat(res.secrets);
}
if (!res.nextPageToken) {
hasMorePages = false;
}
pageToken = res.nextPageToken;
}
const res: { [key: string]: string; } = {};
interface GCPLatestSecretVersionAccess {
name: string;
payload: {
data: string;
}
}
for await (const gcpSecret of gcpSecrets) {
const arr = gcpSecret.name.split("/");
const key = arr[arr.length - 1];
const secretLatest: GCPLatestSecretVersionAccess = (await standardRequest.get(
`${INTEGRATION_GCP_SECRET_MANAGER_URL}/v1beta1/projects/${integration.appId}/secrets/${key}/versions/latest:access`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
)).data;
res[key] = Buffer.from(secretLatest.payload.data, "base64").toString("utf-8");
}
for await (const key of Object.keys(secrets)) {
if (!(key in res)) {
// case: create secret
await standardRequest.post(
`${INTEGRATION_GCP_SECRET_MANAGER_URL}/v1beta1/projects/${integration.appId}/secrets`,
{
replication: {
automatic: {}
}
},
{
params: {
secretId: key
},
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
await standardRequest.post(
`${INTEGRATION_GCP_SECRET_MANAGER_URL}/v1beta1/projects/${integration.appId}/secrets/${key}:addVersion`,
{
payload: {
data: Buffer.from(secrets[key].value).toString("base64")
}
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
}
}
for await (const key of Object.keys(res)) {
if (!(key in secrets)) {
// case: delete secret
await standardRequest.delete(
`${INTEGRATION_GCP_SECRET_MANAGER_URL}/v1beta1/projects/${integration.appId}/secrets/${key}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
} else {
// case: update secret
if (secrets[key].value !== res[key]) {
await standardRequest.post(
`${INTEGRATION_GCP_SECRET_MANAGER_URL}/v1beta1/projects/${integration.appId}/secrets/${key}:addVersion`,
{
payload: {
data: Buffer.from(secrets[key].value).toString("base64")
}
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
}
}
}
}
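The create, update, and delete branches above amount to reconciling the desired secret set against what GCP Secret Manager already holds. That plan can be sketched as a pure function (the name `planSync` is illustrative, not part of the PR):

```typescript
// Compute which keys to create, update, or delete, mirroring the
// reconciliation performed by syncSecretsGCPSecretManager above.
const planSync = (
  desired: Record<string, { value: string }>,
  remote: Record<string, string>
) => ({
  // keys present locally but missing remotely -> create + addVersion
  create: Object.keys(desired).filter((k) => !(k in remote)),
  // keys present on both sides whose values differ -> addVersion
  update: Object.keys(desired).filter(
    (k) => k in remote && desired[k].value !== remote[k]
  ),
  // keys present remotely but no longer desired -> delete
  remove: Object.keys(remote).filter((k) => !(k in desired))
});
```

Separating the plan from the HTTP calls also makes the diff logic unit-testable without a live GCP project.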
/**
* Sync/push [secrets] to Azure Key Vault with vault URI [integration.app]
* @param {Object} obj
@ -1838,7 +2006,7 @@ const syncSecretsCheckly = async ({
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
}) => {
// get secrets from Checkly
const getSecretsRes = (
await standardRequest.get(`${INTEGRATION_CHECKLY_API_URL}/v1/variables`, {
headers: {
@ -1860,7 +2028,6 @@ const syncSecretsCheckly = async ({
if (!(key in getSecretsRes)) {
// case: secret does not exist in checkly
// -> add secret
await standardRequest.post(
`${INTEGRATION_CHECKLY_API_URL}/v1/variables`,
{
@ -2019,7 +2186,7 @@ const syncSecretsTerraformCloud = async ({
};
/**
* Sync/push [secrets] to TeamCity project
* Sync/push [secrets] to TeamCity project (and optionally build config)
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration
@ -2041,57 +2208,124 @@ const syncSecretsTeamCity = async ({
value: string;
}
// get secrets from Teamcity
const res = (
await standardRequest.get(
`${integrationAuth.url}/app/rest/projects/id:${integration.appId}/parameters`,
interface TeamCityBuildConfigParameter {
name: string;
value: string;
inherited: boolean;
}
interface GetTeamCityBuildConfigParametersRes {
href: string;
count: number;
property: TeamCityBuildConfigParameter[];
}
if (integration.targetEnvironment && integration.targetEnvironmentId) {
// case: sync to specific build-config in TeamCity project
const res = (await standardRequest.get<GetTeamCityBuildConfigParametersRes>(
`${integrationAuth.url}/app/rest/buildTypes/${integration.targetEnvironmentId}/parameters`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json"
}
Accept: "application/json",
},
}
)
).data.property.reduce((obj: any, secret: TeamCitySecret) => {
const secretName = secret.name.replace(/^env\./, "");
return {
...obj,
[secretName]: secret.value
};
}, {});
for await (const key of Object.keys(secrets)) {
if (!(key in res) || (key in res && secrets[key] !== res[key])) {
// case: secret does not exist in TeamCity or secret value has changed
// -> create/update secret
await standardRequest.post(
`${integrationAuth.url}/app/rest/projects/id:${integration.appId}/parameters`,
))
.data
.property
.filter((parameter) => !parameter.inherited)
.reduce((obj: any, secret: TeamCitySecret) => {
const secretName = secret.name.replace(/^env\./, "");
return {
...obj,
[secretName]: secret.value
};
}, {});
for await (const key of Object.keys(secrets)) {
if (!(key in res) || (key in res && secrets[key].value !== res[key])) {
// case: secret does not exist in TeamCity or secret value has changed
// -> create/update secret
await standardRequest.post(`${integrationAuth.url}/app/rest/buildTypes/${integration.targetEnvironmentId}/parameters`,
{
name: `env.${key}`,
value: secrets[key]
name: `env.${key}`,
value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json"
}
}
);
Accept: "application/json",
},
});
}
}
}
for await (const key of Object.keys(res)) {
if (!(key in secrets)) {
// delete secret
await standardRequest.delete(
`${integrationAuth.url}/app/rest/projects/id:${integration.appId}/parameters/env.${key}`,
for await (const key of Object.keys(res)) {
if (!(key in secrets)) {
// delete secret
await standardRequest.delete(
`${integrationAuth.url}/app/rest/buildTypes/${integration.targetEnvironmentId}/parameters/env.${key}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json"
}
}
);
}
}
} else {
// case: sync to TeamCity project
const res = (
await standardRequest.get(
`${integrationAuth.url}/app/rest/projects/id:${integration.appId}/parameters`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json"
}
}
);
)
).data.property.reduce((obj: any, secret: TeamCitySecret) => {
const secretName = secret.name.replace(/^env\./, "");
return {
...obj,
[secretName]: secret.value
};
}, {});
for await (const key of Object.keys(secrets)) {
if (!(key in res) || (key in res && secrets[key].value !== res[key])) {
// case: secret does not exist in TeamCity or secret value has changed
// -> create/update secret
await standardRequest.post(
`${integrationAuth.url}/app/rest/projects/id:${integration.appId}/parameters`,
{
name: `env.${key}`,
value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json"
}
}
);
}
}
for await (const key of Object.keys(res)) {
if (!(key in secrets)) {
// delete secret
await standardRequest.delete(
`${integrationAuth.url}/app/rest/projects/id:${integration.appId}/parameters/env.${key}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json"
}
}
);
}
}
}
};
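Both branches above rely on TeamCity's convention of storing environment variables as parameters named `env.<KEY>`: the prefix is stripped when reading existing parameters and re-added when writing. These two helpers (illustrative names) capture that mapping:

```typescript
// TeamCity exposes environment variables as parameters named "env.<KEY>";
// these mirror the prefix handling in syncSecretsTeamCity above.
const toTeamCityName = (key: string): string => `env.${key}`;
// only a leading "env." is stripped, so non-env parameters pass through unchanged
const fromTeamCityName = (name: string): string => name.replace(/^env\./, "");
```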

View File

@ -25,6 +25,7 @@ export interface CreateSecretParams {
export interface GetSecretsParams {
workspaceId: Types.ObjectId;
environment: string;
folderId?: string;
secretPath: string;
authData: AuthData;
}

View File

@ -36,6 +36,4 @@ const apiKeyDataSchema = new Schema<IAPIKeyData>(
}
);
const APIKeyData = model<IAPIKeyData>("APIKeyData", apiKeyDataSchema);
export default APIKeyData;
export const APIKeyData = model<IAPIKeyData>("APIKeyData", apiKeyDataSchema);

View File

@ -68,9 +68,7 @@ const backupPrivateKeySchema = new Schema<IBackupPrivateKey>(
}
);
const BackupPrivateKey = model<IBackupPrivateKey>(
export const BackupPrivateKey = model<IBackupPrivateKey>(
"BackupPrivateKey",
backupPrivateKeySchema
);
export default BackupPrivateKey;

View File

@ -74,6 +74,4 @@ const botSchema = new Schema<IBot>(
}
);
const Bot = model<IBot>("Bot", botSchema);
export default Bot;
export const Bot = model<IBot>("Bot", botSchema);

View File

@ -40,6 +40,4 @@ const botKeySchema = new Schema<IBotKey>(
}
);
const BotKey = model<IBotKey>("BotKey", botKeySchema);
export default BotKey;
export const BotKey = model<IBotKey>("BotKey", botKeySchema);

View File

@ -93,6 +93,4 @@ const botOrgSchema = new Schema<IBotOrg>(
}
);
const BotOrg = model<IBotOrg>("BotOrg", botOrgSchema);
export default BotOrg;
export const BotOrg = model<IBotOrg>("BotOrg", botOrgSchema);

View File

@ -51,6 +51,4 @@ const folderRootSchema = new Schema<TFolderRootSchema>(
}
);
const Folder = model<TFolderRootSchema>("Folder", folderRootSchema);
export default Folder;
export const Folder = model<TFolderRootSchema>("Folder", folderRootSchema);

View File

@ -23,9 +23,7 @@ const incidentContactOrgSchema = new Schema<IIncidentContactOrg>(
}
);
const IncidentContactOrg = model<IIncidentContactOrg>(
export const IncidentContactOrg = model<IIncidentContactOrg>(
"IncidentContactOrg",
incidentContactOrgSchema
);
export default IncidentContactOrg;
);

View File

@ -1,89 +1,30 @@
import BackupPrivateKey, { IBackupPrivateKey } from "./backupPrivateKey";
import Bot, { IBot } from "./bot";
import BotOrg, { IBotOrg } from "./botOrg";
import BotKey, { IBotKey } from "./botKey";
import IncidentContactOrg, { IIncidentContactOrg } from "./incidentContactOrg";
import Integration, { IIntegration } from "./integration";
import IntegrationAuth, { IIntegrationAuth } from "./integrationAuth";
import Key, { IKey } from "./key";
import Membership, { IMembership } from "./membership";
import MembershipOrg, { IMembershipOrg } from "./membershipOrg";
import Organization, { IOrganization } from "./organization";
import Secret, { ISecret } from "./secret";
import Folder, { TFolderRootSchema, TFolderSchema } from "./folder";
import SecretImport, { ISecretImports } from "./secretImports";
import SecretBlindIndexData, { ISecretBlindIndexData } from "./secretBlindIndexData";
import ServiceToken, { IServiceToken } from "./serviceToken";
import ServiceAccount, { IServiceAccount } from "./serviceAccount"; // new
import ServiceAccountKey, { IServiceAccountKey } from "./serviceAccountKey"; // new
import ServiceAccountOrganizationPermission, { IServiceAccountOrganizationPermission } from "./serviceAccountOrganizationPermission"; // new
import ServiceAccountWorkspacePermission, { IServiceAccountWorkspacePermission } from "./serviceAccountWorkspacePermission"; // new
import TokenData, { ITokenData } from "./tokenData";
import User, { AuthMethod, IUser } from "./user";
import UserAction, { IUserAction } from "./userAction";
import Workspace, { IWorkspace } from "./workspace";
import ServiceTokenData, { IServiceTokenData } from "./serviceTokenData";
import APIKeyData, { IAPIKeyData } from "./apiKeyData";
import LoginSRPDetail, { ILoginSRPDetail } from "./loginSRPDetail";
import TokenVersion, { ITokenVersion } from "./tokenVersion";
export {
AuthMethod,
BackupPrivateKey,
IBackupPrivateKey,
Bot,
IBot,
BotOrg,
IBotOrg,
BotKey,
IBotKey,
IncidentContactOrg,
IIncidentContactOrg,
Integration,
IIntegration,
IntegrationAuth,
IIntegrationAuth,
Key,
IKey,
Membership,
IMembership,
MembershipOrg,
IMembershipOrg,
Organization,
IOrganization,
Secret,
ISecret,
Folder,
TFolderRootSchema,
TFolderSchema,
SecretImport,
ISecretImports,
SecretBlindIndexData,
ISecretBlindIndexData,
ServiceToken,
IServiceToken,
ServiceAccount,
IServiceAccount,
ServiceAccountKey,
IServiceAccountKey,
ServiceAccountOrganizationPermission,
IServiceAccountOrganizationPermission,
ServiceAccountWorkspacePermission,
IServiceAccountWorkspacePermission,
TokenData,
ITokenData,
User,
IUser,
UserAction,
IUserAction,
Workspace,
IWorkspace,
ServiceTokenData,
IServiceTokenData,
APIKeyData,
IAPIKeyData,
LoginSRPDetail,
ILoginSRPDetail,
TokenVersion,
ITokenVersion
};
export * from "./backupPrivateKey";
export * from "./bot";
export * from "./botOrg";
export * from "./botKey";
export * from "./incidentContactOrg";
export * from "./integration/integration";
export * from "./integrationAuth";
export * from "./key";
export * from "./membership";
export * from "./membershipOrg";
export * from "./organization";
export * from "./secret";
export * from "./tag";
export * from "./folder";
export * from "./secretImports";
export * from "./secretBlindIndexData";
export * from "./serviceToken";
export * from "./serviceAccount";
export * from "./serviceAccountKey";
export * from "./serviceAccountOrganizationPermission";
export * from "./serviceAccountWorkspacePermission";
export * from "./tokenData";
export * from "./user";
export * from "./userAction";
export * from "./workspace";
export * from "./serviceTokenData";
export * from "./apiKeyData";
export * from "./loginSRPDetail";
export * from "./tokenVersion";
export * from "./webhooks";

View File

@ -0,0 +1 @@
export * from "./integration";

View File

@ -10,6 +10,7 @@ import {
INTEGRATION_CODEFRESH,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_FLYIO,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
@ -25,8 +26,9 @@ import {
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL,
INTEGRATION_WINDMILL
} from "../variables";
} from "../../variables";
import { Schema, Types, model } from "mongoose";
import { Metadata } from "./types";
export interface IIntegration {
_id: Types.ObjectId;
@ -70,8 +72,10 @@ export interface IIntegration {
| "digital-ocean-app-platform"
| "cloud-66"
| "northflank"
| "windmill";
| "windmill"
| "gcp-secret-manager";
integrationAuth: Types.ObjectId;
metadata: Metadata;
}
const integrationSchema = new Schema<IIntegration>(
@ -167,7 +171,8 @@ const integrationSchema = new Schema<IIntegration>(
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
INTEGRATION_NORTHFLANK,
INTEGRATION_GCP_SECRET_MANAGER
],
required: true,
},
@ -180,6 +185,9 @@ const integrationSchema = new Schema<IIntegration>(
type: String,
required: true,
default: "/",
},
metadata: {
type: Schema.Types.Mixed
}
},
{
@ -187,6 +195,4 @@ const integrationSchema = new Schema<IIntegration>(
}
);
const Integration = model<IIntegration>("Integration", integrationSchema);
export default Integration;
export const Integration = model<IIntegration>("Integration", integrationSchema);

View File

@ -0,0 +1,3 @@
export type Metadata = {
secretSuffix?: string;
}

View File

@ -12,6 +12,7 @@ import {
INTEGRATION_CODEFRESH,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_FLYIO,
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_HASHICORP_VAULT,
@ -58,7 +59,8 @@ export interface IIntegrationAuth extends Document {
| "terraform-cloud"
| "teamcity"
| "northflank"
| "windmill";
| "windmill"
| "gcp-secret-manager";
teamId: string;
accountId: string;
url: string;
@ -111,7 +113,8 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
INTEGRATION_NORTHFLANK,
INTEGRATION_GCP_SECRET_MANAGER
],
required: true,
},
@ -190,9 +193,7 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
}
);
const IntegrationAuth = model<IIntegrationAuth>(
export const IntegrationAuth = model<IIntegrationAuth>(
"IntegrationAuth",
integrationAuthSchema
);
export default IntegrationAuth;
);

View File

@ -40,6 +40,4 @@ const keySchema = new Schema<IKey>(
}
);
const Key = model<IKey>("Key", keySchema);
export default Key;
export const Key = model<IKey>("Key", keySchema);

View File

@ -24,6 +24,4 @@ const loginSRPDetailSchema = new Schema<ILoginSRPDetail>(
}
);
const LoginSRPDetail = model("LoginSRPDetail", loginSRPDetailSchema);
export default LoginSRPDetail;
export const LoginSRPDetail = model("LoginSRPDetail", loginSRPDetailSchema);

View File

@ -52,6 +52,4 @@ const membershipSchema = new Schema<IMembership>(
}
);
const Membership = model<IMembership>("Membership", membershipSchema);
export default Membership;
export const Membership = model<IMembership>("Membership", membershipSchema);

View File

@ -39,9 +39,7 @@ const membershipOrgSchema = new Schema(
}
);
const MembershipOrg = model<IMembershipOrg>(
export const MembershipOrg = model<IMembershipOrg>(
"MembershipOrg",
membershipOrgSchema
);
export default MembershipOrg;
);

View File

@ -21,6 +21,4 @@ const organizationSchema = new Schema<IOrganization>(
}
);
const Organization = model<IOrganization>("Organization", organizationSchema);
export default Organization;
export const Organization = model<IOrganization>("Organization", organizationSchema);

View File

@ -31,6 +31,9 @@ export interface ISecret {
keyEncoding: "utf8" | "base64";
tags?: string[];
folder?: string;
metadata?: {
[key: string]: string;
}
}
const secretSchema = new Schema<ISecret>(
@ -131,6 +134,9 @@ const secretSchema = new Schema<ISecret>(
type: String,
default: "root",
},
metadata: {
type: Schema.Types.Mixed
}
},
{
timestamps: true,
@ -139,6 +145,4 @@ const secretSchema = new Schema<ISecret>(
secretSchema.index({ tags: 1 }, { background: true });
const Secret = model<ISecret>("Secret", secretSchema);
export default Secret;
export const Secret = model<ISecret>("Secret", secretSchema);

View File

@ -1,5 +1,5 @@
import mongoose, { Schema, model } from "mongoose";
import Secret, { ISecret } from "./secret";
import { ISecret, Secret } from "./secret";
interface ISecretApprovalRequest {
secret: mongoose.Types.ObjectId;
@ -78,6 +78,4 @@ const secretApprovalRequestSchema = new Schema<ISecretApprovalRequest>(
}
);
const SecretApprovalRequest = model<ISecretApprovalRequest>("SecretApprovalRequest", secretApprovalRequestSchema);
export default SecretApprovalRequest;
export const SecretApprovalRequest = model<ISecretApprovalRequest>("SecretApprovalRequest", secretApprovalRequestSchema);

View File

@ -53,6 +53,4 @@ const secretBlindIndexDataSchema = new Schema<ISecretBlindIndexData>(
}
);
const SecretBlindIndexData = model<ISecretBlindIndexData>("SecretBlindIndexData", secretBlindIndexDataSchema);
export default SecretBlindIndexData;
export const SecretBlindIndexData = model<ISecretBlindIndexData>("SecretBlindIndexData", secretBlindIndexDataSchema);

View File

@ -48,5 +48,4 @@ const secretImportSchema = new Schema<ISecretImports>(
}
);
const SecretImport = model<ISecretImports>("SecretImports", secretImportSchema);
export default SecretImport;
export const SecretImport = model<ISecretImports>("SecretImports", secretImportSchema);

View File

@ -48,6 +48,4 @@ const serviceAccountSchema = new Schema<IServiceAccount>(
}
);
const ServiceAccount = model<IServiceAccount>("ServiceAccount", serviceAccountSchema);
export default ServiceAccount;
export const ServiceAccount = model<IServiceAccount>("ServiceAccount", serviceAccountSchema);

View File

@ -39,6 +39,4 @@ const serviceAccountKeySchema = new Schema<IServiceAccountKey>(
}
);
const ServiceAccountKey = model<IServiceAccountKey>("ServiceAccountKey", serviceAccountKeySchema);
export default ServiceAccountKey;
export const ServiceAccountKey = model<IServiceAccountKey>("ServiceAccountKey", serviceAccountKeySchema);

View File

@ -18,6 +18,4 @@ const serviceAccountOrganizationPermissionSchema = new Schema<IServiceAccountOrg
}
);
const ServiceAccountOrganizationPermission = model<IServiceAccountOrganizationPermission>("ServiceAccountOrganizationPermission", serviceAccountOrganizationPermissionSchema);
export default ServiceAccountOrganizationPermission;
export const ServiceAccountOrganizationPermission = model<IServiceAccountOrganizationPermission>("ServiceAccountOrganizationPermission", serviceAccountOrganizationPermissionSchema);

View File

@ -39,6 +39,4 @@ const serviceAccountWorkspacePermissionSchema = new Schema<IServiceAccountWorksp
}
);
const ServiceAccountWorkspacePermission = model<IServiceAccountWorkspacePermission>("ServiceAccountWorkspacePermission", serviceAccountWorkspacePermissionSchema);
export default ServiceAccountWorkspacePermission;
export const ServiceAccountWorkspacePermission = model<IServiceAccountWorkspacePermission>("ServiceAccountWorkspacePermission", serviceAccountWorkspacePermissionSchema);

View File

@ -56,6 +56,4 @@ const serviceTokenSchema = new Schema<IServiceToken>(
}
);
const ServiceToken = model<IServiceToken>("ServiceToken", serviceTokenSchema);
export default ServiceToken;
export const ServiceToken = model<IServiceToken>("ServiceToken", serviceTokenSchema);

View File

@ -89,6 +89,4 @@ const serviceTokenDataSchema = new Schema<IServiceTokenData>(
}
);
const ServiceTokenData = model<IServiceTokenData>("ServiceTokenData", serviceTokenDataSchema);
export default ServiceTokenData;
export const ServiceTokenData = model<IServiceTokenData>("ServiceTokenData", serviceTokenDataSchema);

View File

@ -3,6 +3,7 @@ import { Schema, Types, model } from "mongoose";
export interface ITag {
_id: Types.ObjectId;
name: string;
tagColor: string;
slug: string;
user: Types.ObjectId;
workspace: Types.ObjectId;
@ -15,6 +16,11 @@ const tagSchema = new Schema<ITag>(
required: true,
trim: true,
},
tagColor: {
type: String,
required: false,
trim: true,
},
slug: {
type: String,
required: true,
@ -44,6 +50,4 @@ const tagSchema = new Schema<ITag>(
tagSchema.index({ slug: 1, workspace: 1 }, { unique: true })
tagSchema.index({ workspace: 1 })
const Tag = model<ITag>("Tag", tagSchema);
export default Tag;
export const Tag = model<ITag>("Tag", tagSchema);

View File

@ -27,6 +27,4 @@ const tokenSchema = new Schema<IToken>({
tokenSchema.index({ email: 1 });
const Token = model<IToken>("Token", tokenSchema);
export default Token;
export const Token = model<IToken>("Token", tokenSchema);

View File

@ -50,6 +50,4 @@ const tokenDataSchema = new Schema<ITokenData>({
timestamps: true,
});
const TokenData = model<ITokenData>("TokenData", tokenDataSchema);
export default TokenData;
export const TokenData = model<ITokenData>("TokenData", tokenDataSchema);

View File

@ -42,6 +42,4 @@ const tokenVersionSchema = new Schema<ITokenVersion>(
}
);
const TokenVersion = model<ITokenVersion>("TokenVersion", tokenVersionSchema);
export default TokenVersion;
export const TokenVersion = model<ITokenVersion>("TokenVersion", tokenVersionSchema);

View File

@ -121,6 +121,4 @@ const userSchema = new Schema<IUser>(
}
);
const User = model<IUser>("User", userSchema);
export default User;
export const User = model<IUser>("User", userSchema);

View File

@ -23,6 +23,4 @@ const userActionSchema = new Schema<IUserAction>(
}
);
const UserAction = model<IUserAction>("UserAction", userActionSchema);
export default UserAction;
export const UserAction = model<IUserAction>("UserAction", userActionSchema);

View File

@ -80,6 +80,4 @@ const WebhookSchema = new Schema<IWebhook>(
}
);
const Webhook = model<IWebhook>("Webhook", WebhookSchema);
export default Webhook;
export const Webhook = model<IWebhook>("Webhook", WebhookSchema);

View File

@ -49,6 +49,4 @@ const workspaceSchema = new Schema<IWorkspace>({
},
});
const Workspace = model<IWorkspace>("Workspace", workspaceSchema);
export default Workspace;
export const Workspace = model<IWorkspace>("Workspace", workspaceSchema);

View File

@ -0,0 +1,83 @@
import Queue, { Job } from "bull";
import { Integration, IntegrationAuth } from "../../models";
import { BotService } from "../../services";
import { getIntegrationAuthAccessHelper } from "../../helpers";
import { syncSecrets } from "../../integrations/sync"
type TSyncSecretsToThirdPartyServices = {
workspaceId: string
environment?: string
}
export const syncSecretsToThirdPartyServices = new Queue("sync-secrets-to-third-party-services", process.env.REDIS_URL as string);
syncSecretsToThirdPartyServices.process(async (job: Job) => {
const { workspaceId, environment }: TSyncSecretsToThirdPartyServices = job.data
const integrations = await Integration.find({
workspace: workspaceId,
...(environment
? {
environment
}
: {}),
isActive: true,
app: { $ne: null }
});
// for each workspace integration, sync/push secrets
// to that integration
for (const integration of integrations) {
// get workspace, environment (shared) secrets
const secrets = await BotService.getSecrets({
workspaceId: integration.workspace,
environment: integration.environment,
secretPath: integration.secretPath
});
const suffixedSecrets: any = {};
if (integration.metadata?.secretSuffix) {
for (const key in secrets) {
const newKey = key + integration.metadata?.secretSuffix;
suffixedSecrets[newKey] = secrets[key];
}
}
const integrationAuth = await IntegrationAuth.findById(integration.integrationAuth);
if (!integrationAuth) throw new Error("Failed to find integration auth");
// get integration auth access token
const access = await getIntegrationAuthAccessHelper({
integrationAuthId: integration.integrationAuth
});
// sync secrets to integration
await syncSecrets({
integration,
integrationAuth,
secrets: Object.keys(suffixedSecrets).length !== 0 ? suffixedSecrets : secrets,
accessId: access.accessId === undefined ? null : access.accessId,
accessToken: access.accessToken
});
}
})
syncSecretsToThirdPartyServices.on("error", (error) => {
console.log("QUEUE ERROR:", error) // eslint-disable-line
})
export const syncSecretsToActiveIntegrationsQueue = (jobDetails: TSyncSecretsToThirdPartyServices) => {
syncSecretsToThirdPartyServices.add(jobDetails, {
attempts: 5,
backoff: {
type: "exponential",
delay: 3000
},
removeOnComplete: true,
removeOnFail: {
count: 20 // keep the most recent 20 jobs
}
})
}
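The `metadata.secretSuffix` handling in the queue above is the part most worth testing in isolation; a minimal standalone sketch of it (the `applySuffix` helper name is my own, not from the codebase):

```typescript
// Hypothetical helper mirroring the secretSuffix logic in the sync queue:
// append an optional per-integration suffix to every secret key before syncing.
function applySuffix(
  secrets: Record<string, string>,
  suffix?: string
): Record<string, string> {
  if (!suffix) return secrets; // no suffix configured: push secrets unchanged
  const suffixed: Record<string, string> = {};
  for (const key in secrets) {
    suffixed[key + suffix] = secrets[key];
  }
  return suffixed;
}

// With suffix "_CI", the key DB_URL is pushed to the integration as DB_URL_CI.
const out = applySuffix({ DB_URL: "postgres://localhost" }, "_CI");
console.log(Object.keys(out));
```

As in the queue, an empty result falls back to the unsuffixed secrets, so integrations without a configured suffix are unaffected.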


@ -0,0 +1,201 @@
// import Queue, { Job } from "bull";
// import { ProbotOctokit } from "probot"
// import { Commit, Committer, Repository } from "@octokit/webhooks-types";
// import TelemetryService from "../../services/TelemetryService";
// import { sendMail } from "../../helpers";
// import GitRisks from "../../ee/models/gitRisks";
// import { MembershipOrg, User } from "../../models";
// import { OWNER, ADMIN } from "../../variables";
// import { convertKeysToLowercase, getFilesFromCommit, scanContentAndGetFindings } from "../../ee/services/GithubSecretScanning/helper";
// import { getSecretScanningGitAppId, getSecretScanningPrivateKey } from "../../config";
// const githubFullRepositoryScan = new Queue('github-historical-secret-scanning', 'redis://redis:6379');
// type TScanFullRepositoryDetails = {
// organizationId: string,
// repositories: {
// id: number;
// node_id: string;
// name: string;
// full_name: string;
// private: boolean;
// }[] | undefined
// installationId: number
// }
// type SecretMatch = {
// Description: string;
// StartLine: number;
// EndLine: number;
// StartColumn: number;
// EndColumn: number;
// Match: string;
// Secret: string;
// File: string;
// SymlinkFile: string;
// Commit: string;
// Entropy: number;
// Author: string;
// Email: string;
// Date: string;
// Message: string;
// Tags: string[];
// RuleID: string;
// Fingerprint: string;
// FingerPrintWithoutCommitId: string
// };
// type Helllo = {
// url: string;
// sha: string;
// node_id: string;
// html_url: string;
// comments_url: string;
// commit: {
// url: string;
// author: {
// name?: string | undefined;
// email?: string | undefined;
// date?: string | undefined;
// } | null;
// verification?: {
// } | undefined;
// };
// files?: {}[] | undefined;
// }[]
// githubFullRepositoryScan.process(async (job: Job, done: Queue.DoneCallback) => {
// const { organizationId, repositories, installationId }: TScanFullRepositoryDetails = job.data
// const repositoryFullNamesList = repositories ? repositories.map(repoDetails => repoDetails.full_name) : []
// const octokit = new ProbotOctokit({
// auth: {
// appId: await getSecretScanningGitAppId(),
// privateKey: await getSecretScanningPrivateKey(),
// installationId: installationId
// },
// });
// for (const repositoryFullName of repositoryFullNamesList) {
// const [owner, repo] = repositoryFullName.split("/");
// let page = 1;
// while (true) {
// // octokit.repos.getco
// const { data } = await octokit.repos.listCommits({
// owner,
// repo,
// per_page: 100,
// page
// });
// await getFilesFromCommit(octokit, owner, repo, "646b386605177ed0a2cc0a596eeee0cf57666342")
// page++;
// }
// }
// done()
// // const allFindingsByFingerprint: { [key: string]: SecretMatch; } = {}
// // for (const commit of commits) {
// // for (const filepath of [...commit.added, ...commit.modified]) {
// // try {
// // const fileContentsResponse = await octokit.repos.getContent({
// // owner,
// // repo,
// // path: filepath,
// // });
// // const data: any = fileContentsResponse.data;
// // const fileContent = Buffer.from(data.content, "base64").toString();
// // const findings = await scanContentAndGetFindings(`\n${fileContent}`) // extra line to count lines correctly
// // for (const finding of findings) {
// // const fingerPrintWithCommitId = `${commit.id}:${filepath}:${finding.RuleID}:${finding.StartLine}`
// // const fingerPrintWithoutCommitId = `${filepath}:${finding.RuleID}:${finding.StartLine}`
// // finding.Fingerprint = fingerPrintWithCommitId
// // finding.FingerPrintWithoutCommitId = fingerPrintWithoutCommitId
// // finding.Commit = commit.id
// // finding.File = filepath
// // finding.Author = commit.author.name
// // finding.Email = commit?.author?.email ? commit?.author?.email : ""
// // allFindingsByFingerprint[fingerPrintWithCommitId] = finding
// // }
// // } catch (error) {
// // done(new Error(`gitHubHistoricalScanning.process: unable to fetch content for [filepath=${filepath}] because [error=${error}]`), null)
// // }
// // }
// // }
// // // change to update
// // for (const key in allFindingsByFingerprint) {
// // await GitRisks.findOneAndUpdate({ fingerprint: allFindingsByFingerprint[key].Fingerprint },
// // {
// // ...convertKeysToLowercase(allFindingsByFingerprint[key]),
// // installationId: installationId,
// // organization: organizationId,
// // repositoryFullName: repository.fullName,
// // repositoryId: repository.id
// // }, {
// // upsert: true
// // }).lean()
// // }
// // // get emails of admins
// // const adminsOfWork = await MembershipOrg.find({
// // organization: organizationId,
// // $or: [
// // { role: OWNER },
// // { role: ADMIN }
// // ]
// // }).lean()
// // const userEmails = await User.find({
// // _id: {
// // $in: [adminsOfWork.map(orgMembership => orgMembership.user)]
// // }
// // }).select("email").lean()
// // const adminOrOwnerEmails = userEmails.map(userObject => userObject.email)
// // const usersToNotify = pusher?.email ? [pusher.email, ...adminOrOwnerEmails] : [...adminOrOwnerEmails]
// // if (Object.keys(allFindingsByFingerprint).length) {
// // await sendMail({
// // template: "secretLeakIncident.handlebars",
// // subjectLine: `Incident alert: leaked secrets found in Github repository ${repository.fullName}`,
// // recipients: usersToNotify,
// // substitutions: {
// // numberOfSecrets: Object.keys(allFindingsByFingerprint).length,
// // pusher_email: pusher.email,
// // pusher_name: pusher.name
// // }
// // });
// // }
// // const postHogClient = await TelemetryService.getPostHogClient();
// // if (postHogClient) {
// // postHogClient.capture({
// // event: "cloud secret scan",
// // distinctId: pusher.email,
// // properties: {
// // numberOfCommitsScanned: commits.length,
// // numberOfRisksFound: Object.keys(allFindingsByFingerprint).length,
// // }
// // });
// // }
// // done(null, allFindingsByFingerprint)
// })
// export const scanGithubFullRepositoryForSecretLeaks = (scanFullRepositoryDetails: TScanFullRepositoryDetails) => {
// console.log("full repo scan started")
// githubFullRepositoryScan.add(scanFullRepositoryDetails)
// }


@ -0,0 +1,148 @@
import Queue, { Job } from "bull";
import { ProbotOctokit } from "probot"
import { Commit } from "@octokit/webhooks-types";
import TelemetryService from "../../services/TelemetryService";
import { sendMail } from "../../helpers";
import GitRisks from "../../ee/models/gitRisks";
import { MembershipOrg, User } from "../../models";
import { ADMIN, OWNER } from "../../variables";
import { convertKeysToLowercase, scanContentAndGetFindings } from "../../ee/services/GithubSecretScanning/helper";
import { getSecretScanningGitAppId, getSecretScanningPrivateKey } from "../../config";
import { SecretMatch } from "../../ee/services/GithubSecretScanning/types";
export const githubPushEventSecretScan = new Queue("github-push-event-secret-scanning", "redis://redis:6379");
type TScanPushEventQueueDetails = {
organizationId: string,
commits: Commit[]
pusher: {
name: string,
email: string | null
},
repository: {
id: number,
fullName: string,
},
installationId: number
}
githubPushEventSecretScan.process(async (job: Job, done: Queue.DoneCallback) => {
const { organizationId, commits, pusher, repository, installationId }: TScanPushEventQueueDetails = job.data
const [owner, repo] = repository.fullName.split("/");
const octokit = new ProbotOctokit({
auth: {
appId: await getSecretScanningGitAppId(),
privateKey: await getSecretScanningPrivateKey(),
installationId: installationId
},
});
const allFindingsByFingerprint: { [key: string]: SecretMatch; } = {}
for (const commit of commits) {
for (const filepath of [...commit.added, ...commit.modified]) {
try {
const fileContentsResponse = await octokit.repos.getContent({
owner,
repo,
path: filepath,
});
const data: any = fileContentsResponse.data;
const fileContent = Buffer.from(data.content, "base64").toString();
const findings = await scanContentAndGetFindings(`\n${fileContent}`) // extra line to count lines correctly
for (const finding of findings) {
const fingerPrintWithCommitId = `${commit.id}:${filepath}:${finding.RuleID}:${finding.StartLine}`
const fingerPrintWithoutCommitId = `${filepath}:${finding.RuleID}:${finding.StartLine}`
finding.Fingerprint = fingerPrintWithCommitId
finding.FingerPrintWithoutCommitId = fingerPrintWithoutCommitId
finding.Commit = commit.id
finding.File = filepath
finding.Author = commit.author.name
finding.Email = commit?.author?.email ? commit?.author?.email : ""
allFindingsByFingerprint[fingerPrintWithCommitId] = finding
}
} catch (error) {
done(new Error(`gitHubHistoricalScanning.process: unable to fetch content for [filepath=${filepath}] because [error=${error}]`), null)
}
}
}
// change to update
for (const key in allFindingsByFingerprint) {
await GitRisks.findOneAndUpdate({ fingerprint: allFindingsByFingerprint[key].Fingerprint },
{
...convertKeysToLowercase(allFindingsByFingerprint[key]),
installationId: installationId,
organization: organizationId,
repositoryFullName: repository.fullName,
repositoryId: repository.id
}, {
upsert: true
}).lean()
}
// get emails of admins
const adminsOfWork = await MembershipOrg.find({
organization: organizationId,
$or: [
{ role: OWNER },
{ role: ADMIN }
]
}).lean()
const userEmails = await User.find({
_id: {
      $in: adminsOfWork.map(orgMembership => orgMembership.user)
}
}).select("email").lean()
const adminOrOwnerEmails = userEmails.map(userObject => userObject.email)
const usersToNotify = pusher?.email ? [pusher.email, ...adminOrOwnerEmails] : [...adminOrOwnerEmails]
if (Object.keys(allFindingsByFingerprint).length) {
await sendMail({
template: "secretLeakIncident.handlebars",
subjectLine: `Incident alert: leaked secrets found in Github repository ${repository.fullName}`,
recipients: usersToNotify,
substitutions: {
numberOfSecrets: Object.keys(allFindingsByFingerprint).length,
pusher_email: pusher.email,
pusher_name: pusher.name
}
});
}
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient) {
postHogClient.capture({
event: "cloud secret scan",
distinctId: pusher.email,
properties: {
numberOfCommitsScanned: commits.length,
numberOfRisksFound: Object.keys(allFindingsByFingerprint).length,
}
});
}
done(null, allFindingsByFingerprint)
})
export const scanGithubPushEventForSecretLeaks = (pushEventPayload: TScanPushEventQueueDetails) => {
githubPushEventSecretScan.add(pushEventPayload, {
attempts: 3,
backoff: {
type: "exponential",
delay: 5000
},
removeOnComplete: true,
removeOnFail: {
count: 20 // keep the most recent 20 jobs
}
})
}
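Each finding above is deduplicated by a colon-joined fingerprint, with a commit-independent variant for tracking the same leak across pushes. A small standalone sketch of that scheme (the `buildFingerprints` helper name is illustrative):

```typescript
// Sketch of the fingerprint format used in the scan queue above: findings are
// keyed by commit id, file path, rule id, and start line; dropping the commit
// id yields the variant used to recognize a previously triaged leak.
function buildFingerprints(
  commitId: string,
  filepath: string,
  ruleId: string,
  startLine: number
): { withCommit: string; withoutCommit: string } {
  const withoutCommit = `${filepath}:${ruleId}:${startLine}`;
  return { withCommit: `${commitId}:${withoutCommit}`, withoutCommit };
}
```

Keying the upsert on the commit-qualified fingerprint, as `GitRisks.findOneAndUpdate` does above, means re-scanning the same push is idempotent.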


@ -1,5 +1,5 @@
import express, { Request, Response } from "express";
import { getInviteOnlySignup, getSecretScanningGitAppId, getSecretScanningPrivateKey, getSecretScanningWebhookSecret, getSmtpConfigured } from "../../config";
import { getInviteOnlySignup, getRedisUrl, getSecretScanningGitAppId, getSecretScanningPrivateKey, getSecretScanningWebhookSecret, getSmtpConfigured } from "../../config";
const router = express.Router();
@ -18,8 +18,9 @@ router.get(
date: new Date(),
message: "Ok",
emailConfigured: await getSmtpConfigured(),
inviteOnlySignup: await getInviteOnlySignup(),
redisConfigured: await getRedisUrl() !== "" && await getRedisUrl() !== undefined,
secretScanningConfigured: secretScanningConfigured,
inviteOnlySignup: await getInviteOnlySignup()
})
}
);


@ -8,19 +8,21 @@ import { AuthMode } from "../../variables";
router.post("/token", validateRequest, authController.getNewToken);
router.post( // TODO endpoint: deprecate (moved to api/v3/auth/login1)
router.post(
// TODO endpoint: deprecate (moved to api/v3/auth/login1)
"/login1",
authLimiter,
body("email").exists().trim().notEmpty(),
body("email").exists().trim().notEmpty().toLowerCase(),
body("clientPublicKey").exists().trim().notEmpty(),
validateRequest,
authController.login1
);
router.post( // TODO endpoint: deprecate (moved to api/v3/auth/login2)
router.post(
// TODO endpoint: deprecate (moved to api/v3/auth/login2)
"/login2",
authLimiter,
body("email").exists().trim().notEmpty(),
body("email").exists().trim().notEmpty().toLowerCase(),
body("clientProof").exists().trim().notEmpty(),
validateRequest,
authController.login2
@ -30,7 +32,7 @@ router.post(
"/logout",
authLimiter,
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
acceptedAuthModes: [AuthMode.JWT]
}),
authController.logout
);
@ -38,24 +40,19 @@ router.post(
router.post(
"/checkAuth",
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
acceptedAuthModes: [AuthMode.JWT]
}),
authController.checkAuth
);
router.get(
"/common-passwords",
authLimiter,
authController.getCommonPasswords
);
router.delete( // TODO endpoint: deprecate (moved to DELETE v2/users/me/sessions)
router.delete(
// TODO endpoint: deprecate (moved to DELETE v2/users/me/sessions)
"/sessions",
authLimiter,
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
}),
acceptedAuthModes: [AuthMode.JWT]
}),
authController.revokeAllSessions
);
export default router;
export default router;


@ -4,6 +4,7 @@ import {
requireAuth,
requireIntegrationAuth,
requireIntegrationAuthorizationAuth,
requireWorkspaceAuth,
validateRequest,
} from "../../middleware";
import {
@ -36,6 +37,8 @@ router.post(
body("owner").trim(),
body("path").trim(),
body("region").trim(),
body("metadata").optional().isObject().withMessage("Metadata should be an object"),
body("metadata.secretSuffix").optional().isString().withMessage("Suffix should be a string"),
validateRequest,
integrationController.createIntegration
);
@ -73,4 +76,19 @@ router.delete(
integrationController.deleteIntegration
);
export default router;
router.post(
"/manual-sync",
requireAuth({
acceptedAuthModes: [AuthMode.JWT]
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "body",
}),
body("environment").isString().exists().trim(),
body("workspaceId").exists().trim(),
validateRequest,
integrationController.manualSync
);
export default router;


@ -168,6 +168,20 @@ router.get(
integrationAuthController.getIntegrationAuthNorthflankSecretGroups
);
router.get(
"/:integrationAuthId/teamcity/build-configs",
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
}),
requireIntegrationAuthorizationAuth({
acceptedRoles: [ADMIN, MEMBER],
}),
param("integrationAuthId").exists().isString(),
query("appId").exists().isString(),
validateRequest,
integrationAuthController.getIntegrationAuthTeamCityBuildConfigs
);
router.delete(
"/:integrationAuthId",
requireAuth({


@ -8,7 +8,7 @@ import { authLimiter } from "../../helpers/rateLimiter";
router.post( // TODO: deprecate (moved to api/v3/auth/login1)
"/login1",
authLimiter,
body("email").isString().trim().notEmpty(),
body("email").isString().trim().notEmpty().toLowerCase(),
body("clientPublicKey").isString().trim().notEmpty(),
validateRequest,
authController.login1
@ -17,7 +17,7 @@ router.post( // TODO: deprecate (moved to api/v3/auth/login1)
router.post( // TODO: deprecate (moved to api/v3/auth/login1)
"/login2",
authLimiter,
body("email").isString().trim().notEmpty(),
body("email").isString().trim().notEmpty().toLowerCase(),
body("clientProof").isString().trim().notEmpty(),
validateRequest,
authController.login2


@ -16,7 +16,7 @@ import {
router.post(
"/:workspaceId/environments",
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
acceptedAuthModes: [AuthMode.JWT, AuthMode.API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
@ -32,7 +32,7 @@ router.post(
router.put(
"/:workspaceId/environments",
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
acceptedAuthModes: [AuthMode.JWT, AuthMode.API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
@ -46,10 +46,28 @@ router.put(
environmentController.renameWorkspaceEnvironment
);
router.patch(
"/:workspaceId/environments",
requireAuth({
acceptedAuthModes: [AuthMode.JWT, AuthMode.API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "params",
}),
param("workspaceId").exists().trim(),
body("environmentSlug").exists().isString().trim(),
body("environmentName").exists().isString().trim(),
body("otherEnvironmentSlug").exists().isString().trim(),
body("otherEnvironmentName").exists().isString().trim(),
validateRequest,
environmentController.reorderWorkspaceEnvironments
);
router.delete(
"/:workspaceId/environments",
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
acceptedAuthModes: [AuthMode.JWT, AuthMode.API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN],
@ -64,7 +82,7 @@ router.delete(
router.get(
"/:workspaceId/environments",
requireAuth({
acceptedAuthModes: [AuthMode.JWT],
acceptedAuthModes: [AuthMode.JWT, AuthMode.API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [MEMBER, ADMIN],

View File

@ -48,6 +48,7 @@ router.post(
}),
param("workspaceId").exists().trim(),
body("name").exists().trim(),
body("tagColor").exists().trim(),
body("slug").exists().trim(),
validateRequest,
tagController.createWorkspaceTag


@ -9,7 +9,7 @@ const router = express.Router();
router.post(
"/login1",
authLimiter,
body("email").isString().trim(),
body("email").isString().trim().toLowerCase(),
body("providerAuthToken").isString().trim().optional({nullable: true}),
body("clientPublicKey").isString().trim().notEmpty(),
validateRequest,
@ -19,7 +19,7 @@ router.post(
router.post(
"/login2",
authLimiter,
body("email").isString().trim(),
body("email").isString().trim().toLowerCase(),
body("providerAuthToken").isString().trim().optional({nullable: true}),
body("clientProof").isString().trim().notEmpty(),
validateRequest,
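The `.toLowerCase()` calls added across the v1/v2/v3 login routes above normalize emails at the validation boundary, so mixed-case submissions resolve to the same account. A minimal sketch of that normalization (the `normalizeEmail` helper is illustrative, not from the codebase):

```typescript
// Illustrative helper mirroring the express-validator .trim().toLowerCase()
// chain: "  Foo@Example.COM " and "foo@example.com" map to the same key.
function normalizeEmail(email: string): string {
  return email.trim().toLowerCase();
}

console.log(normalizeEmail("  Foo@Example.COM ")); // prints foo@example.com
```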


@ -17,6 +17,7 @@ router.get(
"/raw",
query("workspaceId").optional().isString().trim(),
query("environment").optional().isString().trim(),
query("folderId").optional().isString().trim(),
query("secretPath").default("/").isString().trim(),
query("include_imports").optional().isBoolean().default(false),
validateRequest,
@ -144,6 +145,7 @@ router.get(
"/",
query("workspaceId").exists().isString().trim(),
query("environment").exists().isString().trim(),
query("folderId").optional().isString().trim(),
query("secretPath").default("/").isString().trim(),
validateRequest,
requireAuth({


@ -1,6 +1,6 @@
import { nanoid } from "nanoid";
import { Types } from "mongoose";
import Folder, { TFolderSchema } from "../models/folder";
import { Folder, TFolderSchema } from "../models";
import path from "path";
type TAppendFolderDTO = {


@ -1,18 +1,18 @@
import { Types } from "mongoose";
import {
import {
getIntegrationAuthAccessHelper,
getIntegrationAuthRefreshHelper,
handleOAuthExchangeHelper,
setIntegrationAuthAccessHelper,
setIntegrationAuthRefreshHelper,
syncIntegrationsHelper,
} from "../helpers/integration";
import { syncSecretsToActiveIntegrationsQueue } from "../queues/integrations/syncSecretsToThirdPartyServices";
/**
* Class to handle integrations
*/
class IntegrationService {
/**
* Perform OAuth2 code-token exchange for workspace with id [workspaceId] and integration
* named [integration]
@ -26,12 +26,12 @@ class IntegrationService {
* @param {String} obj1.code - code
* @returns {IntegrationAuth} integrationAuth - integration authorization after OAuth2 code-token exchange
*/
static async handleOAuthExchange({
static async handleOAuthExchange({
workspaceId,
integration,
code,
environment,
}: {
}: {
workspaceId: string;
integration: string;
code: string;
@ -44,25 +44,23 @@ class IntegrationService {
environment,
});
}
/**
* Sync/push environment variables in workspace with id [workspaceId] to
* all associated integrations
* @param {Object} obj
* @param {Object} obj.workspaceId - id of workspace
*/
static async syncIntegrations({
static syncIntegrations({
workspaceId,
environment,
}: {
workspaceId: Types.ObjectId;
environment?: string;
}) {
return await syncIntegrationsHelper({
workspaceId,
});
syncSecretsToActiveIntegrationsQueue({ workspaceId: workspaceId.toString(), environment: environment })
}
/**
* Return decrypted refresh token for integration auth
* with id [integrationAuthId]
@ -70,12 +68,12 @@ class IntegrationService {
* @param {String} obj.integrationAuthId - id of integration auth
* @param {String} refreshToken - decrypted refresh token
*/
static async getIntegrationAuthRefresh({ integrationAuthId }: { integrationAuthId: Types.ObjectId}) {
static async getIntegrationAuthRefresh({ integrationAuthId }: { integrationAuthId: Types.ObjectId }) {
return await getIntegrationAuthRefreshHelper({
integrationAuthId,
});
}
/**
* Return decrypted access token for integration auth
* with id [integrationAuthId]
@ -98,11 +96,11 @@ class IntegrationService {
* @param {String} obj.refreshToken - refresh token
* @returns {IntegrationAuth} integrationAuth - updated integration auth
*/
static async setIntegrationAuthRefresh({
static async setIntegrationAuthRefresh({
integrationAuthId,
refreshToken,
}: {
integrationAuthId: string;
refreshToken,
}: {
integrationAuthId: string;
refreshToken: string;
}) {
return await setIntegrationAuthRefreshHelper({
@ -122,12 +120,12 @@ class IntegrationService {
* @param {Date} obj.accessExpiresAt - expiration date of access token
* @returns {IntegrationAuth} - updated integration auth
*/
static async setIntegrationAuthAccess({
static async setIntegrationAuthAccess({
integrationAuthId,
accessId,
accessToken,
accessExpiresAt,
}: {
}: {
integrationAuthId: string;
accessId: string | null;
accessToken: string;


@ -1,7 +1,10 @@
import { Types } from "mongoose";
import Folder from "../models/folder";
import Secret, { ISecret } from "../models/secret";
import SecretImport from "../models/secretImports";
import {
Folder,
ISecret,
Secret,
SecretImport
} from "../models";
import { getFolderByPath } from "./FolderService";
type TSecretImportFid = { environment: string; folderId: string; secretPath: string };
@ -54,6 +57,14 @@ export const getAllImportedSecrets = async (
type: "shared"
}
},
{
$lookup: {
from: "tags", // note this is the name of the collection in the database, not the Mongoose model name
localField: "tags",
foreignField: "_id",
as: "tags"
}
},
{
$group: {
_id: {


@ -3,7 +3,7 @@ import crypto from "crypto";
import { Types } from "mongoose";
import picomatch from "picomatch";
import { client, getRootEncryptionKey } from "../config";
import Webhook, { IWebhook } from "../models/webhooks";
import { IWebhook, Webhook } from "../models";
export const triggerWebhookRequest = async (
{ url, encryptedSecretKey, iv, tag }: IWebhook,


@ -540,20 +540,26 @@ export const backfillIntegration = async () => {
};
export const backfillServiceTokenMultiScope = async () => {
await ServiceTokenData.updateMany(
{
scopes: {
$exists: false
}
},
[
{
$set: {
scopes: [{ environment: "$environment", secretPath: "$secretPath" }]
const documentsToUpdate = await ServiceTokenData.find({ scopes: { $exists: false } });
for (const doc of documentsToUpdate) {
// Cast doc to any to bypass TypeScript's type checks
const anyDoc = doc as any;
const environment = anyDoc.environment;
const secretPath = anyDoc.secretPath;
if (environment && secretPath) {
const updatedScopes = [
{
environment: environment,
secretPath: secretPath
}
}
]
);
];
await ServiceTokenData.updateOne({ _id: doc._id }, { $set: { scopes: updatedScopes } });
}
}
console.log("Migration: Service token migration v2 complete");
};
@ -649,24 +655,25 @@ export const backfillUserAuthMethods = async () => {
}
);
await User.updateMany(
{
authProvider: {
$exists: true
},
authMethods: {
$exists: false
}
},
[
{
$set: {
authMethods: ["$authProvider"]
const documentsToUpdate = await User.find({
authProvider: { $exists: true },
authMethods: { $exists: false }
});
for (const doc of documentsToUpdate) {
// Cast doc to any to bypass TypeScript's type checks
const anyDoc = doc as any;
const authProvider = anyDoc.authProvider;
const authMethods = [authProvider];
await User.updateOne(
{ _id: doc._id },
{
$set: { authMethods: authMethods },
$unset: { authProvider: 1, authId: 1 }
}
},
{
$unset: ["authProvider", "authId"]
}
]
);
);
}
}
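Both backfills above move from a single aggregation-pipeline `updateMany` to a per-document read-and-`$set` loop, presumably for compatibility with MongoDB-compatible stores that reject update pipelines. The data transformation itself is pure and can be sketched on its own (the `buildScopes` helper and `LegacyServiceToken` type are my own names):

```typescript
// Hedged sketch of the per-document backfill shape: derive the new `scopes`
// array from the legacy flat fields in application code, then write it back
// with a plain $set rather than a server-side update pipeline.
type LegacyServiceToken = { environment?: string; secretPath?: string };
type Scope = { environment: string; secretPath: string };

function buildScopes(doc: LegacyServiceToken): Scope[] {
  // Only tokens carrying both legacy fields get a backfilled scope.
  if (!doc.environment || !doc.secretPath) return [];
  return [{ environment: doc.environment, secretPath: doc.secretPath }];
}
```

Each returned array would then be persisted with one `updateOne` per document, as the migration above does.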


@ -19,7 +19,7 @@ import {
backfillTrustedIps,
backfillUserAuthMethods
} from "./backfillData";
import {
import {
reencryptBotOrgKeys,
reencryptBotPrivateKeys,
reencryptSecretBlindIndexDataSalts
@ -27,6 +27,7 @@ import {
import {
getMongoURL,
getNodeEnv,
getRedisUrl,
getSentryDSN
} from "../../config";
import { initializePassport } from "../auth";
@ -42,6 +43,10 @@ import { initializePassport } from "../auth";
* - Re-encrypting data
*/
export const setup = async () => {
if (await getRedisUrl() === undefined || await getRedisUrl() === "") {
console.error("WARNING: Redis is not yet configured. Infisical may not function as expected without it.")
}
await validateEncryptionKeysConfig();
await TelemetryService.logTelemetryMessage();


@ -2,4 +2,6 @@ export enum AuthMode {
JWT = "jwt",
SERVICE_TOKEN = "serviceToken",
API_KEY = "apiKey"
}
}
export const K8_USER_AGENT_NAME = "k8-operator"


@ -1,17 +1,19 @@
import {
getClientIdAzure,
getClientIdBitBucket,
getClientIdGCPSecretManager,
getClientIdGitHub,
getClientIdGitLab,
getClientIdHeroku,
getClientIdNetlify,
getClientSlugVercel,
getClientSlugVercel
} from "../config";
// integrations
export const INTEGRATION_AZURE_KEY_VAULT = "azure-key-vault";
export const INTEGRATION_AWS_PARAMETER_STORE = "aws-parameter-store";
export const INTEGRATION_AWS_SECRET_MANAGER = "aws-secret-manager";
export const INTEGRATION_GCP_SECRET_MANAGER = "gcp-secret-manager";
export const INTEGRATION_HEROKU = "heroku";
export const INTEGRATION_VERCEL = "vercel";
export const INTEGRATION_NETLIFY = "netlify";
@ -36,35 +38,37 @@ export const INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM = "digital-ocean-app-platfor
export const INTEGRATION_CLOUD_66 = "cloud-66";
export const INTEGRATION_NORTHFLANK = "northflank";
export const INTEGRATION_SET = new Set([
INTEGRATION_GCP_SECRET_MANAGER,
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_TEAMCITY,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
INTEGRATION_HEROKU,
INTEGRATION_VERCEL,
INTEGRATION_NETLIFY,
INTEGRATION_GITHUB,
INTEGRATION_GITLAB,
INTEGRATION_RENDER,
INTEGRATION_FLYIO,
INTEGRATION_CIRCLECI,
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_TEAMCITY,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
]);
// integration types
export const INTEGRATION_OAUTH2 = "oauth2";
// integration oauth endpoints
export const INTEGRATION_GCP_TOKEN_URL = "https://accounts.google.com/o/oauth2/token";
export const INTEGRATION_AZURE_TOKEN_URL = "https://login.microsoftonline.com/common/oauth2/v2.0/token";
export const INTEGRATION_HEROKU_TOKEN_URL = "https://id.heroku.com/oauth/token";
export const INTEGRATION_VERCEL_TOKEN_URL =
@ -76,6 +80,7 @@ export const INTEGRATION_GITLAB_TOKEN_URL = "https://gitlab.com/oauth/token";
export const INTEGRATION_BITBUCKET_TOKEN_URL = "https://bitbucket.org/site/oauth2/access_token"
// integration apps endpoints
export const INTEGRATION_GCP_API_URL = "https://cloudresourcemanager.googleapis.com";
export const INTEGRATION_HEROKU_API_URL = "https://api.heroku.com";
export const INTEGRATION_GITLAB_API_URL = "https://gitlab.com/api";
export const INTEGRATION_VERCEL_API_URL = "https://api.vercel.com";
@ -97,6 +102,10 @@ export const INTEGRATION_DIGITAL_OCEAN_API_URL = "https://api.digitalocean.com";
export const INTEGRATION_CLOUD_66_API_URL = "https://app.cloud66.com/api";
export const INTEGRATION_NORTHFLANK_API_URL = "https://api.northflank.com";
export const INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME = "secretmanager.googleapis.com"
export const INTEGRATION_GCP_SECRET_MANAGER_URL = `https://${INTEGRATION_GCP_SECRET_MANAGER_SERVICE_NAME}`;
export const INTEGRATION_GCP_SERVICE_USAGE_URL = "https://serviceusage.googleapis.com";
export const getIntegrationOptions = async () => {
const INTEGRATION_OPTIONS = [
{
@ -272,12 +281,12 @@ export const getIntegrationOptions = async () => {
docsLink: "",
},
{
name: "Google Cloud Platform",
slug: "gcp",
name: "GCP Secret Manager",
slug: "gcp-secret-manager",
image: "Google Cloud Platform.png",
isAvailable: false,
type: "",
clientId: "",
isAvailable: true,
type: "oauth",
clientId: await getClientIdGCPSecretManager(),
docsLink: ""
},
{


@ -2,6 +2,8 @@ import { Server } from "http";
import main from "../src";
import { afterAll, beforeAll, describe, expect, it } from "@jest/globals";
import request from "supertest";
import { githubPushEventSecretScan } from "../src/queues/secret-scanning/githubScanPushEvent";
import { syncSecretsToThirdPartyServices } from "../src/queues/integrations/syncSecretsToThirdPartyServices";
let server: Server;
@ -11,6 +13,8 @@ beforeAll(async () => {
afterAll(async () => {
server.close();
githubPushEventSecretScan.close()
syncSecretsToThirdPartyServices.close()
});
describe("Healthcheck endpoint", () => {


@@ -57,14 +57,14 @@ func CallLogin1V2(httpClient *resty.Client, request GetLoginOneV2Request) (GetLo
 		SetResult(&loginOneV2Response).
 		SetHeader("User-Agent", USER_AGENT).
 		SetBody(request).
-		Post(fmt.Sprintf("%v/v2/auth/login1", config.INFISICAL_URL))
+		Post(fmt.Sprintf("%v/v3/auth/login1", config.INFISICAL_URL))

 	if err != nil {
-		return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V2: Unable to complete api request [err=%s]", err)
+		return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V3: Unable to complete api request [err=%s]", err)
 	}

 	if response.IsError() {
-		return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V2: Unsuccessful response: [response=%s]", response)
+		return GetLoginOneV2Response{}, fmt.Errorf("CallLogin1V3: Unsuccessful response: [response=%s]", response)
 	}

 	return loginOneV2Response, nil
@@ -115,7 +115,7 @@ func CallLogin2V2(httpClient *resty.Client, request GetLoginTwoV2Request) (GetLo
 		SetResult(&loginTwoV2Response).
 		SetHeader("User-Agent", USER_AGENT).
 		SetBody(request).
-		Post(fmt.Sprintf("%v/v2/auth/login2", config.INFISICAL_URL))
+		Post(fmt.Sprintf("%v/v3/auth/login2", config.INFISICAL_URL))

 	cookies := response.Cookies()
 	// Find a cookie by name
@@ -134,11 +134,11 @@ func CallLogin2V2(httpClient *resty.Client, request GetLoginTwoV2Request) (GetLo
 	}

 	if err != nil {
-		return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V2: Unable to complete api request [err=%s]", err)
+		return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V3: Unable to complete api request [err=%s]", err)
 	}

 	if response.IsError() {
-		return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V2: Unsuccessful response: [response=%s]", response)
+		return GetLoginTwoV2Response{}, fmt.Errorf("CallLogin2V3: Unsuccessful response: [response=%s]", response)
 	}

 	return loginTwoV2Response, nil

@@ -64,11 +64,22 @@ var secretsCmd = &cobra.Command{
 			util.HandleError(err, "Unable to parse flag")
 		}

+		secretOverriding, err := cmd.Flags().GetBool("secret-overriding")
+		if err != nil {
+			util.HandleError(err, "Unable to parse flag")
+		}
+
 		secrets, err := util.GetAllEnvironmentVariables(models.GetAllSecretsParameters{Environment: environmentName, InfisicalToken: infisicalToken, TagSlugs: tagSlugs, SecretsPath: secretsPath, IncludeImport: includeImports})
 		if err != nil {
 			util.HandleError(err)
 		}

+		if secretOverriding {
+			secrets = util.OverrideSecrets(secrets, util.SECRET_TYPE_PERSONAL)
+		} else {
+			secrets = util.OverrideSecrets(secrets, util.SECRET_TYPE_SHARED)
+		}
+
 		if shouldExpandSecrets {
 			secrets = util.ExpandSecrets(secrets, infisicalToken)
 		}
@@ -641,6 +652,7 @@ func init() {
 	secretsGetCmd.Flags().String("token", "", "Fetch secrets using the Infisical Token")
 	secretsCmd.AddCommand(secretsGetCmd)
+	secretsCmd.Flags().Bool("secret-overriding", true, "Prioritizes personal secrets, if any, with the same name over shared secrets")
 	secretsCmd.AddCommand(secretsSetCmd)
 	secretsSetCmd.Flags().String("path", "/", "get secrets within a folder path")

@@ -22,7 +22,7 @@ Resources:
   DocumentDBCluster:
     Type: "AWS::DocDB::DBCluster"
     Properties:
-      EngineVersion: 4.0.0
+      EngineVersion: 5.0.0
       StorageEncrypted: true
       MasterUsername: !Ref DocumentDBUsername
       MasterUserPassword: !Ref DocumentDBPassword
@@ -38,7 +38,7 @@ Resources:
     Type: "AWS::DocDB::DBClusterParameterGroup"
     Properties:
       Description: "description"
-      Family: "docdb4.0"
+      Family: "docdb5.0"
       Parameters:
         tls: "disabled"
         ttl_monitor: "disabled"
@@ -97,6 +97,7 @@ Resources:
             echo "JWT_SERVICE_SECRET=${!JWT_SERVICE_SECRET}" >> .env
             echo "MONGO_URL=${!DOCUMENT_DB_CONNECTION_URL}" >> .env
             echo "HTTPS_ENABLED=false" >> .env
+            echo "REDIS_URL=redis://redis:6379" >> .env
             docker-compose up -d
@@ -174,4 +175,4 @@ Metadata:
       x: 270
       "y": 90
       z: 1
-    embeds: []
\ No newline at end of file
+    embeds: []

@@ -21,6 +21,7 @@ services:
     depends_on:
       - mongo
       - smtp-server
+      - redis
     build:
       context: ./backend
       dockerfile: Dockerfile
@@ -99,9 +100,36 @@ services:
     networks:
       - infisical-dev

+  redis:
+    image: redis
+    container_name: infisical-dev-redis
+    environment:
+      - ALLOW_EMPTY_PASSWORD=yes
+    ports:
+      - 6379:6379
+    volumes:
+      - redis_data:/data
+    networks:
+      - infisical-dev
+
+  redis-commander:
+    container_name: infisical-dev-redis-commander
+    image: rediscommander/redis-commander
+    restart: always
+    depends_on:
+      - redis
+    environment:
+      - REDIS_HOSTS=local:redis:6379
+    ports:
+      - "8085:8081"
+    networks:
+      - infisical-dev
+
 volumes:
   mongo-data:
     driver: local
+  redis_data:
+    driver: local

 networks:
   infisical-dev:

@@ -41,19 +41,17 @@ services:
     networks:
       - infisical

-  # secret-scanning-git-app:
-  #   container_name: infisical-secret-scanning-git-app
-  #   restart: unless-stopped
-  #   depends_on:
-  #     - backend
-  #     - frontend
-  #     - mongo
-  #   ports:
-  #     - "3000:3001"
-  #   image: infisical/staging_deployment_secret-scanning-git-app
-  #   env_file: .env
-  #   networks:
-  #     - infisical
+  redis:
+    image: redis
+    container_name: infisical-dev-redis
+    environment:
+      - ALLOW_EMPTY_PASSWORD=yes
+    ports:
+      - 6379:6379
+    networks:
+      - infisical
+    volumes:
+      - redis_data:/data

   mongo:
     container_name: infisical-mongo
@@ -71,6 +69,8 @@ services:
 volumes:
   mongo-data:
     driver: local
+  redis_data:
+    driver: local

 networks:
   infisical:

@@ -0,0 +1,4 @@
+---
+title: "Create"
+openapi: "POST /api/v2/workspace/{workspaceId}/environments"
+---

@@ -0,0 +1,4 @@
+---
+title: "Delete"
+openapi: "DELETE /api/v2/workspace/{workspaceId}/environments"
+---

@@ -0,0 +1,4 @@
+---
+title: "List"
+openapi: "GET /api/v2/workspace/{workspaceId}/environments"
+---

@@ -0,0 +1,4 @@
+---
+title: "Update"
+openapi: "PUT /api/v2/workspace/{workspaceId}/environments"
+---

@@ -1,6 +1,6 @@
 ---
-title: "Retrieve"
-openapi: "GET /api/v3/secrets/{secretName}"
+title: "List"
+openapi: "GET /api/v3/secrets/"
 ---
 <Tip>

@@ -1,6 +1,6 @@
 ---
-title: "Retrieve All"
-openapi: "GET /api/v3/secrets/"
+title: "Retrieve"
+openapi: "GET /api/v3/secrets/{secretName}"
 ---
 <Tip>

Some files were not shown because too many files have changed in this diff.