Compare commits

..

117 Commits

Author SHA1 Message Date
b78150e78d Updated membership logic for SAML auth 2023-07-31 17:36:42 +07:00
59ebe0c22e Merge pull request #805 from Infisical/jumpcloud-saml
Add JumpCloud SAML Support
2023-07-30 14:36:10 +07:00
6729caeb75 Add JumpCloud SAML 2023-07-30 14:29:47 +07:00
26867f7328 Update overview.mdx 2023-07-29 15:16:11 -07:00
233459d063 Merge pull request #804 from Infisical/jumpcloud-saml
Optimize SAML SSO configuration flow and add documentation for Azure AD SAML
2023-07-29 15:01:41 +07:00
ba6355e4d2 Fix lint errors 2023-07-29 14:58:13 +07:00
e961a30937 Optimize SAML SSO configuration flow, add docs for Azure AD SAML 2023-07-29 14:39:06 +07:00
53ff420304 Merge pull request #802 from akhilmhdh/feat/org-overview-loading
feat: added loading state for org overview page
2023-07-28 15:13:51 -04:00
196a613f16 Merge pull request #790 from subh-cs/subh-cs/better-logs-k8-operator
Better log for k8-operator
2023-07-28 15:11:40 -04:00
cc4b749ce8 Revise SAML flow, update Okta SAML docs 2023-07-29 01:57:07 +07:00
8cc5f2ef43 typo in cli usage.mdx 2023-07-28 13:26:35 -04:00
06bc02c392 feat: added loading state and show empty state only when loading for org overview page 2023-07-28 21:33:31 +05:30
3682c4d044 Merge pull request #800 from akhilmhdh/fix/style-overview-fixes
feat: fixed padding, added progress bar for routing, added sticky hea…
2023-07-28 09:51:33 -04:00
52892c26e5 feat: fixed padding, added progress bar for routing, added sticky header for overview 2023-07-28 16:55:22 +05:30
ed2cf68935 Merge remote-tracking branch 'origin' into jumpcloud-saml 2023-07-28 14:52:28 +07:00
386bc09d49 Update Okta SSO image convention 2023-07-28 14:52:07 +07:00
353c6e9166 Merge pull request #798 from Infisical/windmill-docs
Add docs for Windmill integration
2023-07-28 13:26:20 +07:00
1f69467207 Add docs for Windmill integration 2023-07-28 13:24:33 +07:00
5ab218f1f8 fixed parsing .env with : 2023-07-27 19:40:39 -07:00
e1b25aaa54 fixed the padding issue for the secret raw 2023-07-27 19:03:02 -07:00
9193e7ef58 fix styling issues with secret rows 2023-07-27 15:50:42 -07:00
3f998296fe ip table fix 2023-07-27 13:33:31 -07:00
6f7601f2c4 Merge pull request #793 from akhilmhdh/feat/new-overview-page
Feat/new overview page
2023-07-27 15:42:34 -04:00
b7c7544baf minor style changes 2023-07-27 12:37:18 -07:00
4b7ae2477a Merge pull request #765 from sunilk4u/feat/windmill-integration
Feature: Windmill.dev cloud Integeration
2023-07-28 02:11:27 +07:00
e548883bba Fix lint errors, merge conflicts 2023-07-28 02:02:26 +07:00
a7ece1830e Revise Windmill integration 2023-07-28 01:30:28 +07:00
6502d232c9 Start Azure AD SAML docs 2023-07-27 23:48:53 +07:00
f31e8ddfe9 feat: added width for expandable table and secret missing count ui fix 2023-07-27 20:57:38 +05:30
7bbbdcc58b feat: implemented new overview page with improvement in dashboard 2023-07-27 16:36:34 +05:30
bca14dd5c4 feat: added new secret input component and updated toolbar key special prop to innerKey 2023-07-27 16:36:34 +05:30
b6b3c8a736 fix: resolved v2 secret update bug and object returning in import secret empty 2023-07-27 16:29:43 +05:30
d458bd7948 Merge branch 'feat/northflank-integration' 2023-07-27 15:18:28 +07:00
239989ceab Update contributors README 2023-07-27 15:17:15 +07:00
7ff13242c0 Add docs for Northflank 2023-07-27 15:16:11 +07:00
7db8555b65 Merge pull request #788 from ChukwunonsoFrank/feat/northflank-integration
Feature: Northflank integration
2023-07-27 15:15:36 +07:00
980a578bd5 Revise Northflank integration 2023-07-27 14:52:52 +07:00
adb27bb729 fix: allow apps which have write access 2023-07-27 13:11:48 +05:30
d89d360880 Merge pull request #792 from Infisical/fix-ip-whitelisting
Update IP allowlist implementation
2023-07-27 11:47:56 +07:00
8ed5dbb26a Add default IPV6 CIDR for creating workspace 2023-07-27 11:23:57 +07:00
221a43e8a4 Update IP allowlist implementation 2023-07-27 11:18:36 +07:00
e8a2575f7e logging workspaceId, tokenName from k8-operator 2023-07-27 09:10:08 +05:30
41c1828324 roll forward: disable IP white listing 2023-07-26 20:50:53 -04:00
c2c8cf90b7 Merge branch 'main' of https://github.com/Infisical/infisical 2023-07-26 14:03:47 -07:00
00b4d6bd45 changed the icon 2023-07-26 14:03:37 -07:00
f5a6270d2a add workspace auth for multi env/glob request 2023-07-26 16:50:35 -04:00
bc9d6253be change isDisabled criteria for Create Integration button 2023-07-26 21:19:02 +01:00
a5b37c80ad chore: resolve merge conflicts 2023-07-26 20:39:51 +01:00
7b1a4fa8e4 change regexp to accept deeper level paths 2023-07-27 00:48:17 +05:30
7457f573e9 add dash and underscores for secret pattern test 2023-07-26 23:43:44 +05:30
d67e96507a fix:unauthorized response for app name 2023-07-26 23:14:42 +05:30
46545c1462 add secretGroup to integrationController.ts 2023-07-26 18:19:54 +01:00
8331cd4de8 Merge pull request #761 from atimapreandrew/terraform-cloud-integration
Terraform cloud integration
2023-07-26 23:16:51 +07:00
3447074eb5 Fix merge conflicts 2023-07-26 23:13:33 +07:00
5a708ee931 Optimize Terraform Cloud sync function 2023-07-26 23:10:38 +07:00
9913b2fb6c Initialize TrustedIP upon creating a new workspace 2023-07-26 22:20:51 +07:00
2c021f852f Update filter for trusted IPs backfill 2023-07-26 21:15:07 +07:00
8dbc894ce9 Replace insertMany operation with upsert for backfilling trusted ips 2023-07-26 20:45:58 +07:00
511904605f Merge pull request #786 from Infisical/debug-integrations
Fix PATCH IP whitelist behavior and breaking integrations due to incorrect project id in local storage
2023-07-26 17:55:24 +07:00
7ae6d1610f Fix IP whitelist PATCH endpoint, update localStorage project id to reflect navigated to project 2023-07-26 17:46:33 +07:00
7da6d72f13 Remove save call from backfilling trusted ips 2023-07-26 16:18:13 +07:00
ad33356994 Remove required comment for trusted IP schema 2023-07-26 15:29:43 +07:00
cfa2461479 Merge pull request #785 from Infisical/network-access
Add support for IP allowlisting / trusted IPs
2023-07-26 15:11:04 +07:00
bf08bfacb5 Fix lint errors 2023-07-26 15:06:18 +07:00
cf77820059 Merge remote-tracking branch 'origin' into network-access 2023-07-26 14:55:12 +07:00
1ca90f56b8 Add docs for IP allowlisting 2023-07-26 14:51:25 +07:00
5899d7aee9 Complete trusted IPs feature 2023-07-26 13:34:56 +07:00
b565194c43 create versions for brew releases 2023-07-25 15:39:29 -04:00
86e04577c9 print exec error messages as is 2023-07-25 14:17:31 -04:00
f4b3cafc5b Added Terraform Cloud integration docs 2023-07-25 16:51:53 +01:00
18aad7d520 Terraform Cloud integration 2023-07-25 15:25:11 +01:00
54c79012db fix the org-members link 2023-07-25 07:01:27 -07:00
4b720bf940 Update kubernetes.mdx 2023-07-24 18:13:35 -04:00
993866bb8b Update secret-reference.mdx 2023-07-24 17:18:07 -04:00
8c39fa2438 add conditional imports to raw api 2023-07-24 15:01:31 -04:00
7bccfaefac Merge pull request #784 from akhilmhdh/fix/import-delete
fix: resolved secret import delete and include_import response control
2023-07-24 11:47:02 -04:00
e2b666345b fix: resolved secret import delete and include_import response control 2023-07-24 20:53:59 +05:30
90910819a3 Merge pull request #778 from afrieirham/docs/running-docs-locally
docs: add running infisical docs locally guide
2023-07-24 09:00:46 -04:00
8b070484dd docs: add running infisical docs locally 2023-07-24 20:36:42 +08:00
a764087c83 Merge pull request #782 from Infisical/improve-security-docs
Add section on service token best practices
2023-07-24 18:14:26 +07:00
27d5fa5aa0 Add section on service token best practices 2023-07-24 18:10:37 +07:00
2e7705999c Updated changelog and contributors in README 2023-07-24 12:56:02 +07:00
428bf8e252 Merge branch 'main' of https://github.com/Infisical/infisical 2023-07-23 13:59:42 -07:00
264740d84d style updates 2023-07-23 13:59:25 -07:00
723bcd4d83 Update react.mdx 2023-07-23 15:44:54 -04:00
9ed516ccb6 Uncomment Google SSO for signup 2023-07-24 02:35:51 +07:00
c2be6674b1 chore: resolve merge conflicts 2023-07-22 11:29:40 +01:00
c62504d658 correct codefresh image file name 2023-07-20 19:21:04 +05:30
ce08512ab5 Merge remote-tracking branch 'upstream/main' into feat/windmill-integration 2023-07-20 19:20:38 +05:30
8abe7c7f99 add secretGroup attribute to model definition 2023-07-20 12:58:07 +01:00
b3baaac5c8 map secret comments to windmill api description 2023-07-20 12:57:16 +05:30
aa019e1501 add pattern match for windmill stored secrets 2023-07-20 02:12:36 +05:30
0f8b505c78 change label for windmill workspace form 2023-07-20 01:45:16 +05:30
5b7e23cdc5 add authorization of user for each app 2023-07-20 01:44:21 +05:30
ec1e842202 change windmill workspace label 2023-07-19 19:04:59 +05:30
83d5291998 add interface for windmill request body 2023-07-19 15:00:42 +05:30
638e011cc0 add windmill logo to integration variable 2023-07-19 14:47:37 +05:30
d2d23a7aba add windmill logo 2023-07-19 14:47:15 +05:30
a52c2f03bf add integration slug name mapping for windmill 2023-07-19 14:12:05 +05:30
51c12e0202 Merge branch 'Infisical:main' into feat/windmill-integration 2023-07-19 13:15:21 +05:30
4db7b0c05e add function for windmill secret sync 2023-07-19 13:13:14 +05:30
edef22d28e Terraform Cloud integration 2023-07-18 23:14:41 +01:00
76f43ab6b4 Terraform Cloud integration 2023-07-18 21:08:30 +01:00
6ee7081640 add secret groups field functionality 2023-07-17 22:00:48 +01:00
04611d980b create windmill get all workspaces list function 2023-07-17 16:50:27 +05:30
6125246794 add integration authorize redirect url 2023-07-17 16:35:11 +05:30
52e26fc6fa create integration pages for windmill 2023-07-17 16:34:39 +05:30
06bd98bf56 add windmill variables to model schema 2023-07-17 15:12:12 +05:30
7c24e0181a add windmill variables to integration 2023-07-17 15:09:15 +05:30
ceeebc24fa Terraform Cloud integration 2023-07-16 21:12:35 +01:00
112d4ec9c0 refactor: modify Northflank integration sync logic 2023-07-12 12:25:44 +01:00
a3836b970a Terraform Cloud integration 2023-07-10 23:44:55 +01:00
5e2b31cb6c add window redirect for the Northflank integration 2023-07-10 12:57:16 +01:00
3c45941474 chore: resolve merge conflicts 2023-07-09 17:38:45 +01:00
91e172fd79 add Northflank specific create.tsx file 2023-07-09 16:18:58 +01:00
3e975dc4f0 Terraform Cloud integration 2023-07-08 00:07:38 +01:00
d9ab38c590 chore: resolve merge conflicts 2023-07-04 22:52:23 +01:00
232 changed files with 8687 additions and 4195 deletions

View File

@ -108,6 +108,22 @@ brews:
zsh_completion.install "completions/infisical.zsh" => "_infisical"
fish_completion.install "completions/infisical.fish"
man1.install "manpages/infisical.1.gz"
- name: 'infisical@{{.Version}}'
tap:
owner: Infisical
name: homebrew-get-cli
commit_author:
name: "Infisical"
email: ai@infisical.com
folder: Formula
homepage: "https://infisical.com"
description: "The official Infisical CLI"
install: |-
bin.install "infisical"
bash_completion.install "completions/infisical.bash" => "infisical"
zsh_completion.install "completions/infisical.zsh" => "_infisical"
fish_completion.install "completions/infisical.fish"
man1.install "manpages/infisical.1.gz"
nfpms:
- id: infisical

View File

@ -1,4 +1,3 @@
#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"

File diff suppressed because one or more lines are too long

View File

@ -8,6 +8,7 @@ import {
ALGORITHM_AES_256_GCM,
ENCODING_SCHEME_UTF8,
INTEGRATION_BITBUCKET_API_URL,
INTEGRATION_NORTHFLANK_API_URL,
INTEGRATION_RAILWAY_API_URL,
INTEGRATION_SET,
INTEGRATION_VERCEL_API_URL,
@ -445,6 +446,79 @@ export const getIntegrationAuthBitBucketWorkspaces = async (req: Request, res: R
});
};
/**
* Return list of secret groups for Northflank project with id [appId]
* @param req
* @param res
* @returns
*/
export const getIntegrationAuthNorthflankSecretGroups = async (req: Request, res: Response) => {
const appId = req.query.appId as string;
interface NorthflankSecretGroup {
id: string;
name: string;
description: string;
priority: number;
projectId: string;
}
interface SecretGroup {
name: string;
groupId: string;
}
const secretGroups: SecretGroup[] = [];
if (appId && appId !== "") {
let page = 1;
const perPage = 10;
let hasMorePages = true;
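// page through the project's secret groups, perPage at a time; a page with fewer than perPage entries is the last one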
while(hasMorePages) {
const params = new URLSearchParams({
page: String(page),
per_page: String(perPage),
filter: "all",
});
const {
data: {
data: {
secrets
}
}
} = await standardRequest.get<{ data: { secrets: NorthflankSecretGroup[] }}>(
`${INTEGRATION_NORTHFLANK_API_URL}/v1/projects/${appId}/secrets`,
{
params,
headers: {
Authorization: `Bearer ${req.accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
secrets.forEach((a: any) => {
secretGroups.push({
name: a.name,
groupId: a.id
});
});
if (secrets.length < perPage) {
hasMorePages = false;
}
page++;
}
}
return res.status(200).send({
secretGroups
});
}
/**
* Delete integration authorization with id [integrationAuthId]
* @param req
@ -461,3 +535,4 @@ export const deleteIntegrationAuth = async (req: Request, res: Response) => {
integrationAuth
});
};

View File

@ -30,7 +30,6 @@ export const createSecretImport = async (req: Request, res: Response) => {
if (doesImportExist) {
throw BadRequestError({ message: "Secret import already exist" });
}
importSecDoc.imports.push({
environment: secretImport.environment,
secretPath: secretImport.secretPath
@ -109,7 +108,7 @@ export const getAllSecretsFromImport = async (req: Request, res: Response) => {
});
if (!importSecDoc) {
return res.status(200).json({ secrets: {} });
return res.status(200).json({ secrets: [] });
}
const secrets = await getAllImportedSecrets(workspaceId, environment, folderId);

View File

@ -830,7 +830,7 @@ export const getSecrets = async (req: Request, res: Response) => {
// TODO(akhilmhdh) - secret-imp change this to org type
let importedSecrets: any[] = [];
if (include_imports) {
if (include_imports === "true") {
importedSecrets = await getAllImportedSecrets(workspaceId, environment, folderId as string);
}

View File

@ -106,7 +106,6 @@ export const login1 = async (req: Request, res: Response) => {
*/
export const login2 = async (req: Request, res: Response) => {
try {
if (!req.headers["user-agent"]) throw InternalServerError({ message: "User-Agent header is required" });
const { email, clientProof, providerAuthToken } = req.body;
@ -189,7 +188,7 @@ export const login2 = async (req: Request, res: Response) => {
ip: req.realIP,
userAgent: req.headers["user-agent"] ?? "",
});
// store (refresh) token in httpOnly cookie
res.cookie("jid", tokens.refreshToken, {
httpOnly: true,

View File

@ -25,26 +25,25 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
let secretPath = req.query.secretPath as string;
const includeImports = req.query.include_imports as string;
// if the service token has single scope, it will get all secrets for that scope by default
const serviceTokenDetails: IServiceTokenData = req?.serviceTokenData
if (serviceTokenDetails) {
if (serviceTokenDetails.scopes.length == 1 && !containsGlobPatterns(serviceTokenDetails.scopes[0].secretPath)) {
const scope = serviceTokenDetails.scopes[0]
secretPath = scope.secretPath
environment = scope.environment
workspaceId = serviceTokenDetails.workspace.toString()
} else {
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "query",
locationEnvironment: "query",
requiredPermissions: [PERMISSION_READ_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: true
})
}
// if the service token has single scope, it will get all secrets for that scope by default
const serviceTokenDetails: IServiceTokenData = req?.serviceTokenData;
if (serviceTokenDetails && serviceTokenDetails.scopes.length == 1 && !containsGlobPatterns(serviceTokenDetails.scopes[0].secretPath)) {
const scope = serviceTokenDetails.scopes[0];
secretPath = scope.secretPath;
environment = scope.environment;
workspaceId = serviceTokenDetails.workspace.toString();
} else {
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "query",
locationEnvironment: "query",
requiredPermissions: [PERMISSION_READ_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: true
});
}
const secrets = await SecretService.getSecrets({
workspaceId: new Types.ObjectId(workspaceId),
environment,
@ -56,7 +55,7 @@ export const getSecretsRaw = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(workspaceId)
});
if (includeImports) {
if (includeImports === "true") {
const folders = await Folder.findOne({ workspace: workspaceId, environment });
let folderId = "root";
// if folder exist get it and replace folderid with new one
@ -294,7 +293,7 @@ export const getSecrets = async (req: Request, res: Response) => {
authData: req.authData
});
if (includeImports) {
if (includeImports === "true") {
const folders = await Folder.findOne({ workspace: workspaceId, environment });
let folderId = "root";
// if folder exist get it and replace folderid with new one

View File

@ -2,6 +2,7 @@ import * as secretController from "./secretController";
import * as secretSnapshotController from "./secretSnapshotController";
import * as organizationsController from "./organizationsController";
import * as ssoController from "./ssoController";
import * as usersController from "./usersController";
import * as workspaceController from "./workspaceController";
import * as actionController from "./actionController";
import * as membershipController from "./membershipController";
@ -12,6 +13,7 @@ export {
secretSnapshotController,
organizationsController,
ssoController,
usersController,
workspaceController,
actionController,
membershipController,

View File

@ -21,7 +21,7 @@ import { EELicenseService } from "../../services";
*/
export const redirectSSO = async (req: Request, res: Response) => {
if (req.isUserCompleted) {
return res.redirect(`${await getSiteURL()}/login/sso?token=${encodeURIComponent(req.providerAuthToken)}`);
return res.redirect(`${await getSiteURL()}/login/sso?token=${encodeURIComponent(req.providerAuthToken)}`);
}
return res.redirect(`${await getSiteURL()}/signup/sso?token=${encodeURIComponent(req.providerAuthToken)}`);
@ -57,7 +57,6 @@ export const updateSSOConfig = async (req: Request, res: Response) => {
entryPoint,
issuer,
cert,
audience
} = req.body;
const plan = await EELicenseService.getPlan(organizationId);
@ -78,9 +77,6 @@ export const updateSSOConfig = async (req: Request, res: Response) => {
encryptedCert?: string;
certIV?: string;
certTag?: string;
encryptedAudience?: string;
audienceIV?: string;
audienceTag?: string;
}
const update: PatchUpdate = {};
@ -132,18 +128,6 @@ export const updateSSOConfig = async (req: Request, res: Response) => {
update.certIV = certIV;
update.certTag = certTag;
}
if (audience) {
const {
ciphertext: encryptedAudience,
iv: audienceIV,
tag: audienceTag
} = client.encryptSymmetric(audience, key);
update.encryptedAudience = encryptedAudience;
update.audienceIV = audienceIV;
update.audienceTag = audienceTag;
}
const ssoConfig = await SSOConfig.findOneAndUpdate(
{
@ -207,8 +191,7 @@ export const createSSOConfig = async (req: Request, res: Response) => {
isActive,
entryPoint,
issuer,
cert,
audience
cert
} = req.body;
const plan = await EELicenseService.getPlan(organizationId);
@ -238,12 +221,6 @@ export const createSSOConfig = async (req: Request, res: Response) => {
iv: certIV,
tag: certTag
} = client.encryptSymmetric(cert, key);
const {
ciphertext: encryptedAudience,
iv: audienceIV,
tag: audienceTag
} = client.encryptSymmetric(audience, key);
const ssoConfig = await new SSOConfig({
organization: new Types.ObjectId(organizationId),
@ -257,10 +234,7 @@ export const createSSOConfig = async (req: Request, res: Response) => {
issuerTag,
encryptedCert,
certIV,
certTag,
encryptedAudience,
audienceIV,
audienceTag
certTag
}).save();
return res.status(200).send(ssoConfig);

View File

@ -0,0 +1,13 @@
import { Request, Response } from "express";
/**
* Return the ip address of the current user
* @param req
* @param res
* @returns
*/
export const getMyIp = (req: Request, res: Response) => {
return res.status(200).send({
ip: req.authData.authIP
});
}

View File

@ -3,16 +3,20 @@ import { PipelineStage, Types } from "mongoose";
import { Secret } from "../../../models";
import {
FolderVersion,
IPType,
ISecretVersion,
Log,
SecretSnapshot,
SecretVersion,
TFolderRootVersionSchema,
TrustedIP
} from "../../models";
import { EESecretService } from "../../services";
import { getLatestSecretVersionIds } from "../../helpers/secretVersion";
import Folder, { TFolderSchema } from "../../../models/folder";
import { searchByFolderId } from "../../../services/FolderService";
import { EELicenseService } from "../../services";
import { extractIPDetails, isValidIpOrCidr } from "../../../utils/ip";
/**
* Return secret snapshots for workspace with id [workspaceId]
@ -588,3 +592,147 @@ export const getWorkspaceLogs = async (req: Request, res: Response) => {
logs,
});
};
/**
* Return trusted ips for workspace with id [workspaceId]
* @param req
* @param res
*/
export const getWorkspaceTrustedIps = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const trustedIps = await TrustedIP.find({
workspace: new Types.ObjectId(workspaceId)
});
return res.status(200).send({
trustedIps
});
}
/**
* Add a trusted ip to workspace with id [workspaceId]
* @param req
* @param res
*/
export const addWorkspaceTrustedIp = async (req: Request, res: Response) => {
const { workspaceId } = req.params;
const {
ipAddress: ip,
comment,
isActive
} = req.body;
const plan = await EELicenseService.getPlan(req.workspace.organization.toString());
if (!plan.ipAllowlisting) return res.status(400).send({
message: "Failed to add IP access range due to plan restriction. Upgrade plan to add IP access range."
});
const isValidIPOrCidr = isValidIpOrCidr(ip);
if (!isValidIPOrCidr) return res.status(400).send({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
const { ipAddress, type, prefix } = extractIPDetails(ip);
const trustedIp = await new TrustedIP({
workspace: new Types.ObjectId(workspaceId),
ipAddress,
type,
prefix,
isActive,
comment,
}).save();
return res.status(200).send({
trustedIp
});
}
/**
* Update trusted ip with id [trustedIpId] workspace with id [workspaceId]
* @param req
* @param res
*/
export const updateWorkspaceTrustedIp = async (req: Request, res: Response) => {
const { workspaceId, trustedIpId } = req.params;
const {
ipAddress: ip,
comment
} = req.body;
const plan = await EELicenseService.getPlan(req.workspace.organization.toString());
if (!plan.ipAllowlisting) return res.status(400).send({
message: "Failed to update IP access range due to plan restriction. Upgrade plan to update IP access range."
});
const isValidIPOrCidr = isValidIpOrCidr(ip);
if (!isValidIPOrCidr) return res.status(400).send({
message: "The IP is not a valid IPv4, IPv6, or CIDR block"
});
const { ipAddress, type, prefix } = extractIPDetails(ip);
const updateObject: {
ipAddress: string;
type: IPType;
comment: string;
prefix?: number;
$unset?: {
prefix: number;
}
} = {
ipAddress,
type,
comment
};
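// a plain address carries no prefix, so clear any prefix left over from a previous CIDR value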
if (prefix !== undefined) {
updateObject.prefix = prefix;
} else {
updateObject.$unset = { prefix: 1 };
}
const trustedIp = await TrustedIP.findOneAndUpdate(
{
_id: new Types.ObjectId(trustedIpId),
workspace: new Types.ObjectId(workspaceId),
},
updateObject,
{
new: true
}
);
return res.status(200).send({
trustedIp
});
}
/**
* Delete IP access range from workspace with id [workspaceId]
* @param req
* @param res
*/
export const deleteWorkspaceTrustedIp = async (req: Request, res: Response) => {
const { workspaceId, trustedIpId } = req.params;
const plan = await EELicenseService.getPlan(req.workspace.organization.toString());
if (!plan.ipAllowlisting) return res.status(400).send({
message: "Failed to delete IP access range due to plan restriction. Upgrade plan to delete IP access range."
});
const trustedIp = await TrustedIP.findOneAndDelete({
_id: new Types.ObjectId(trustedIpId),
workspace: new Types.ObjectId(workspaceId)
});
return res.status(200).send({
trustedIp
});
}

View File

@ -51,13 +51,6 @@ export const getSSOConfigHelper = async ({
ssoConfig.certIV,
ssoConfig.certTag
);
const audience = client.decryptSymmetric(
ssoConfig.encryptedAudience,
key,
ssoConfig.audienceIV,
ssoConfig.audienceTag
);
return ({
_id: ssoConfig._id,
@ -66,7 +59,6 @@ export const getSSOConfigHelper = async ({
isActive: ssoConfig.isActive,
entryPoint,
issuer,
cert,
audience
cert
});
}

View File

@ -66,6 +66,4 @@ const actionSchema = new Schema<IAction>(
}
);
const Action = model<IAction>("Action", actionSchema);
export default Action;
export const Action = model<IAction>("Action", actionSchema);

View File

@ -52,9 +52,7 @@ const folderRootVersionSchema = new Schema<TFolderRootVersionSchema>(
}
);
const FolderVersion = model<TFolderRootVersionSchema>(
export const FolderVersion = model<TFolderRootVersionSchema>(
"FolderVersion",
folderRootVersionSchema
);
export default FolderVersion;
);

View File

@ -1,21 +1,7 @@
import SecretSnapshot, { ISecretSnapshot } from "./secretSnapshot";
import SecretVersion, { ISecretVersion } from "./secretVersion";
import FolderVersion, { TFolderRootVersionSchema } from "./folderVersion";
import Log, { ILog } from "./log";
import Action, { IAction } from "./action";
import SSOConfig, { ISSOConfig } from "./ssoConfig";
export {
SecretSnapshot,
ISecretSnapshot,
SecretVersion,
ISecretVersion,
FolderVersion,
TFolderRootVersionSchema,
Log,
ILog,
Action,
IAction,
SSOConfig,
ISSOConfig
};
export * from "./secretSnapshot";
export * from "./secretVersion";
export * from "./folderVersion";
export * from "./log";
export * from "./action";
export * from "./ssoConfig";
export * from "./trustedIp";

View File

@ -69,6 +69,4 @@ const logSchema = new Schema<ILog>(
}
);
const Log = model<ILog>("Log", logSchema);
export default Log;
export const Log = model<ILog>("Log", logSchema);

View File

@ -46,9 +46,7 @@ const secretSnapshotSchema = new Schema<ISecretSnapshot>(
}
);
const SecretSnapshot = model<ISecretSnapshot>(
export const SecretSnapshot = model<ISecretSnapshot>(
"SecretSnapshot",
secretSnapshotSchema
);
export default SecretSnapshot;
);

View File

@ -124,9 +124,7 @@ const secretVersionSchema = new Schema<ISecretVersion>(
}
);
const SecretVersion = model<ISecretVersion>(
export const SecretVersion = model<ISecretVersion>(
"SecretVersion",
secretVersionSchema
);
export default SecretVersion;
);

View File

@ -1,8 +1,14 @@
import { Schema, Types, model } from "mongoose";
export enum AuthProvider {
OKTA_SAML = "okta-saml",
AZURE_SAML = "azure-saml",
JUMPCLOUD_SAML = "jumpcloud-saml"
}
export interface ISSOConfig {
organization: Types.ObjectId;
authProvider: "okta-saml"
authProvider: AuthProvider;
isActive: boolean;
encryptedEntryPoint: string;
entryPointIV: string;
@ -13,9 +19,6 @@ export interface ISSOConfig {
encryptedCert: string;
certIV: string;
certTag: string;
encryptedAudience: string;
audienceIV: string;
audienceTag: string;
}
const ssoConfigSchema = new Schema<ISSOConfig>(
@ -26,9 +29,7 @@ const ssoConfigSchema = new Schema<ISSOConfig>(
},
authProvider: {
type: String,
enum: [
"okta-saml"
],
enum: AuthProvider,
required: true
},
isActive: {
@ -61,15 +62,6 @@ const ssoConfigSchema = new Schema<ISSOConfig>(
},
certTag: {
type: String
},
encryptedAudience: {
type: String
},
audienceIV: {
type: String
},
audienceTag: {
type: String
}
},
{
@ -77,6 +69,4 @@ const ssoConfigSchema = new Schema<ISSOConfig>(
}
);
const SSOConfig = model<ISSOConfig>("SSOConfig", ssoConfigSchema);
export default SSOConfig;
export const SSOConfig = model<ISSOConfig>("SSOConfig", ssoConfigSchema);

View File

@ -0,0 +1,54 @@
import { Schema, Types, model } from "mongoose";
export enum IPType {
IPV4 = "ipv4",
IPV6 = "ipv6"
}
export interface ITrustedIP {
_id: Types.ObjectId;
workspace: Types.ObjectId;
ipAddress: string;
type: "ipv4" | "ipv6", // either IPv4/IPv6 address or network IPv4/IPv6 address
isActive: boolean;
comment: string;
prefix?: number; // CIDR
}
const trustedIpSchema = new Schema<ITrustedIP>(
{
workspace: {
type: Schema.Types.ObjectId,
ref: "Workspace",
required: true
},
ipAddress: {
type: String,
required: true
},
type: {
type: String,
enum: [
IPType.IPV4,
IPType.IPV6
],
required: true
},
prefix: {
type: Number,
required: false
},
isActive: {
type: Boolean,
required: true
},
comment: {
type: String
}
},
{
timestamps: true
}
);
export const TrustedIP = model<ITrustedIP>("TrustedIP", trustedIpSchema);

View File

@ -2,6 +2,7 @@ import secret from "./secret";
import secretSnapshot from "./secretSnapshot";
import organizations from "./organizations";
import sso from "./sso";
import users from "./users";
import workspace from "./workspace";
import action from "./action";
import cloudProducts from "./cloudProducts";
@ -11,6 +12,7 @@ export {
secretSnapshot,
organizations,
sso,
users,
workspace,
action,
cloudProducts,

View File

@ -1,6 +1,9 @@
import express from "express";
const router = express.Router();
import passport from "passport";
import {
AuthProvider
} from "../../models";
import {
requireAuth,
requireOrganizationAuth,
@ -87,12 +90,11 @@ router.post(
locationOrganizationId: "body"
}),
body("organizationId").exists().trim(),
body("authProvider").exists().isString(),
body("authProvider").exists().isString().isIn([AuthProvider.OKTA_SAML]),
body("isActive").exists().isBoolean(),
body("entryPoint").exists().isString(),
body("issuer").exists().isString(),
body("cert").exists().isString(),
body("audience").exists().isString(),
validateRequest,
ssoController.createSSOConfig
);
@ -113,7 +115,6 @@ router.patch(
body("entryPoint").optional().isString(),
body("issuer").optional().isString(),
body("cert").optional().isString(),
body("audience").optional().isString(),
validateRequest,
ssoController.updateSSOConfig
);

View File

@ -0,0 +1,17 @@
import express from "express";
const router = express.Router();
import {
requireAuth
} from "../../../middleware";
import { AUTH_MODE_API_KEY, AUTH_MODE_JWT } from "../../../variables";
import { usersController } from "../../controllers/v1";
router.get(
"/me/ip",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY],
}),
usersController.getMyIp
);
export default router;
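A minimal sketch of calling the new endpoint (the router is mounted at /api/v1/users in the app entry point later in this diff), assuming a JWT and a locally running API; the base URL and token handling are placeholders rather than part of this change:
// Hypothetical client call: ask the backend which IP it sees for the current caller,
// e.g. to find the address to add to a workspace IP allowlist.
const baseUrl = "http://localhost:8080"; // placeholder
const token = process.env.INFISICAL_TOKEN ?? ""; // JWT, placeholder
const res = await fetch(`${baseUrl}/api/v1/users/me/ip`, {
  headers: { Authorization: `Bearer ${token}` }
});
const { ip } = (await res.json()) as { ip: string };
console.log(`Requests from this machine reach the API as ${ip}`);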

View File

@ -6,13 +6,18 @@ import {
validateRequest,
} from "../../../middleware";
import { body, param, query } from "express-validator";
import { ADMIN, MEMBER } from "../../../variables";
import {
ADMIN,
AUTH_MODE_API_KEY,
AUTH_MODE_JWT,
MEMBER
} from "../../../variables";
import { workspaceController } from "../../controllers/v1";
router.get(
"/:workspaceId/secret-snapshots",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
@ -30,7 +35,7 @@ router.get(
router.get(
"/:workspaceId/secret-snapshots/count",
requireAuth({
acceptedAuthModes: ["jwt"],
acceptedAuthModes: [AUTH_MODE_JWT],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
@ -46,7 +51,7 @@ router.get(
router.post(
"/:workspaceId/secret-snapshots/rollback",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
@ -63,7 +68,7 @@ router.post(
router.get(
"/:workspaceId/logs",
requireAuth({
acceptedAuthModes: ["jwt", "apiKey"],
acceptedAuthModes: [AUTH_MODE_JWT, AUTH_MODE_API_KEY],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
@ -79,4 +84,66 @@ router.get(
workspaceController.getWorkspaceLogs
);
router.get(
"/:workspaceId/trusted-ips",
param("workspaceId").exists().isString().trim(),
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN, MEMBER],
locationWorkspaceId: "params",
}),
workspaceController.getWorkspaceTrustedIps
);
router.post(
"/:workspaceId/trusted-ips",
param("workspaceId").exists().isString().trim(),
body("ipAddress").exists().isString().trim(),
body("comment").default("").isString().trim(),
body("isActive").exists().isBoolean(),
validateRequest,
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN],
locationWorkspaceId: "params",
}),
workspaceController.addWorkspaceTrustedIp
);
router.patch(
"/:workspaceId/trusted-ips/:trustedIpId",
param("workspaceId").exists().isString().trim(),
param("trustedIpId").exists().isString().trim(),
body("ipAddress").isString().trim().default(""),
body("comment").default("").isString().trim(),
validateRequest,
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN],
locationWorkspaceId: "params",
}),
workspaceController.updateWorkspaceTrustedIp
);
router.delete(
"/:workspaceId/trusted-ips/:trustedIpId",
param("workspaceId").exists().isString().trim(),
param("trustedIpId").exists().isString().trim(),
validateRequest,
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
requireWorkspaceAuth({
acceptedRoles: [ADMIN],
locationWorkspaceId: "params",
}),
workspaceController.deleteWorkspaceTrustedIp
);
export default router;
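A minimal sketch of exercising the new trusted-IP routes (the workspace router is mounted at /api/v1/workspace in the app entry point later in this diff), assuming an admin JWT and a plan with ipAllowlisting enabled; the base URL, token, and workspace id are placeholders:
// Hypothetical admin script: allowlist an office CIDR for a workspace, then read the list back.
const baseUrl = "http://localhost:8080"; // placeholder
const token = process.env.INFISICAL_TOKEN ?? ""; // admin JWT, placeholder
const workspaceId = "<workspace-id>"; // placeholder
await fetch(`${baseUrl}/api/v1/workspace/${workspaceId}/trusted-ips`, {
  method: "POST",
  headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
  body: JSON.stringify({
    ipAddress: "203.0.113.0/24", // a single IPv4/IPv6 address or a CIDR block
    comment: "office network",
    isActive: true
  })
});
const listRes = await fetch(`${baseUrl}/api/v1/workspace/${workspaceId}/trusted-ips`, {
  headers: { Authorization: `Bearer ${token}` }
});
const { trustedIps } = await listRes.json();
console.log(trustedIps);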

View File

@ -26,6 +26,7 @@ interface FeatureSet {
environmentsUsed: number;
secretVersioning: boolean;
pitRecovery: boolean;
ipAllowlisting: boolean;
rbac: boolean;
customRateLimits: boolean;
customAlerts: boolean;
@ -60,6 +61,7 @@ class EELicenseService {
environmentsUsed: 0,
secretVersioning: true,
pitRecovery: false,
ipAllowlisting: false,
rbac: true,
customRateLimits: true,
customAlerts: true,

View File

@ -275,3 +275,70 @@ export const decryptSymmetricHelper = async ({
return plaintext;
};
/**
* Return decrypted comments for workspace secrets with id [workspaceId]
* and [environment] using bot
* @param {Object} obj
* @param {String} obj.workspaceId - id of workspace
* @param {String} obj.environment - environment
*/
export const getSecretsCommentBotHelper = async ({
workspaceId,
environment,
secretPath
} : {
workspaceId: Types.ObjectId;
environment: string;
secretPath: string;
}) => {
const content = {} as any;
const key = await getKey({ workspaceId: workspaceId });
let folderId = "root";
const folders = await Folder.findOne({
workspace: workspaceId,
environment,
});
if (!folders && secretPath !== "/") {
throw InternalServerError({ message: "Folder not found" });
}
if (folders) {
const folder = getFolderByPath(folders.nodes, secretPath);
if (!folder) {
throw InternalServerError({ message: "Folder not found" });
}
folderId = folder.id;
}
const secrets = await Secret.find({
workspace: workspaceId,
environment,
type: SECRET_SHARED,
folder: folderId,
});
secrets.forEach((secret: ISecret) => {
if(secret.secretCommentCiphertext && secret.secretCommentIV && secret.secretCommentTag) {
const secretKey = decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretKeyCiphertext,
iv: secret.secretKeyIV,
tag: secret.secretKeyTag,
key,
});
const commentValue = decryptSymmetric128BitHexKeyUTF8({
ciphertext: secret.secretCommentCiphertext,
iv: secret.secretCommentIV,
tag: secret.secretCommentTag,
key,
});
content[secretKey] = commentValue;
}
});
return content;
}

View File

@ -123,7 +123,7 @@ export const syncIntegrationsHelper = async ({
? {
environment,
}
: {}),
: {}),
isActive: true,
app: { $ne: null },
});
@ -133,17 +133,24 @@ export const syncIntegrationsHelper = async ({
for await (const integration of integrations) {
// get workspace, environment (shared) secrets
const secrets = await BotService.getSecrets({
// issue here?
workspaceId: integration.workspace,
environment: integration.environment,
secretPath: integration.secretPath,
});
// get workspace, environment (shared) secrets comments
const secretComments = await BotService.getSecretComments({
workspaceId: integration.workspace,
environment: integration.environment,
secretPath: integration.secretPath,
})
const integrationAuth = await IntegrationAuth.findById(
integration.integrationAuth
);
if (!integrationAuth) throw new Error("Failed to find integration auth");
if (!integrationAuth) throw new Error("Failed to find integration auth");
// get integration auth access token
const access = await getIntegrationAuthAccessHelper({
integrationAuthId: integration.integrationAuth,
@ -156,6 +163,7 @@ export const syncIntegrationsHelper = async ({
secrets,
accessId: access.accessId === undefined ? null : access.accessId,
accessToken: access.accessToken,
secretComments
});
}
} catch (err) {

View File

@ -346,6 +346,7 @@ export const createSecretHelper = async ({
workspace: new Types.ObjectId(workspaceId),
folder: folderId,
type,
environment,
...(type === SECRET_PERSONAL ? getAuthDataPayloadUserObj(authData) : {})
});
@ -362,6 +363,7 @@ export const createSecretHelper = async ({
secretBlindIndex,
folder: folderId,
workspace: new Types.ObjectId(workspaceId),
environment,
type: SECRET_SHARED
});

View File

@ -5,6 +5,10 @@ import {
Secret,
Workspace,
} from "../models";
import {
IPType,
TrustedIP
} from "../ee/models";
import { createBot } from "../helpers/bot";
import { EELicenseService } from "../ee/services";
import { SecretService } from "../services";
@ -40,6 +44,26 @@ export const createWorkspace = async ({
await SecretService.createSecretBlindIndexData({
workspaceId: workspace._id,
});
// initialize default trusted IPv4 CIDR - 0.0.0.0/0
await new TrustedIP({
workspace: workspace._id,
ipAddress: "0.0.0.0",
type: IPType.IPV4,
prefix: 0,
isActive: true,
comment: ""
}).save()
// initialize default trusted IPv6 CIDR - ::/0
await new TrustedIP({
workspace: workspace._id,
ipAddress: "::",
type: IPType.IPV6,
prefix: 0,
isActive: true,
comment: ""
}).save()
await EELicenseService.refreshPlan(organizationId);

View File

@ -22,7 +22,8 @@ import {
sso as eeSSORouter,
secret as eeSecretRouter,
secretSnapshot as eeSecretSnapshotRouter,
workspace as eeWorkspaceRouter
users as eeUsersRouter,
workspace as eeWorkspaceRouter,
} from "./ee/routes/v1";
import {
auth as v1AuthRouter,
@ -129,6 +130,7 @@ const main = async () => {
// (EE) routes
app.use("/api/v1/secret", eeSecretRouter);
app.use("/api/v1/secret-snapshot", eeSecretSnapshotRouter);
app.use("/api/v1/users", eeUsersRouter);
app.use("/api/v1/workspace", eeWorkspaceRouter);
app.use("/api/v1/action", eeActionRouter);
app.use("/api/v1/organizations", eeOrganizationsRouter);

View File

@ -27,16 +27,22 @@ import {
INTEGRATION_LARAVELFORGE_API_URL,
INTEGRATION_NETLIFY,
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_NORTHFLANK,
INTEGRATION_NORTHFLANK_API_URL,
INTEGRATION_RAILWAY,
INTEGRATION_RAILWAY_API_URL,
INTEGRATION_RENDER,
INTEGRATION_RENDER_API_URL,
INTEGRATION_SUPABASE,
INTEGRATION_SUPABASE_API_URL,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_TERRAFORM_CLOUD_API_URL,
INTEGRATION_TRAVISCI,
INTEGRATION_TRAVISCI_API_URL,
INTEGRATION_VERCEL,
INTEGRATION_VERCEL_API_URL
INTEGRATION_VERCEL_API_URL,
INTEGRATION_WINDMILL,
INTEGRATION_WINDMILL_API_URL,
} from "../variables";
import { IIntegrationAuth } from "../models";
import { Octokit } from "@octokit/rest";
@ -134,6 +140,12 @@ const getApps = async ({
serverId: accessId
});
break;
case INTEGRATION_TERRAFORM_CLOUD:
apps = await getAppsTerraformCloud({
accessToken,
workspacesId: accessId,
});
break;
case INTEGRATION_TRAVISCI:
apps = await getAppsTravisCI({
accessToken,
@ -153,7 +165,12 @@ const getApps = async ({
apps = await getAppsCloudflarePages({
accessToken,
accountId: accessId
})
});
break;
case INTEGRATION_NORTHFLANK:
apps = await getAppsNorthflank({
accessToken,
});
break;
case INTEGRATION_BITBUCKET:
apps = await getAppsBitBucket({
@ -166,6 +183,11 @@ const getApps = async ({
accessToken,
});
break;
case INTEGRATION_WINDMILL:
apps = await getAppsWindmill({
accessToken
});
break;
case INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM:
apps = await getAppsDigitalOceanAppPlatform({
accessToken
@ -563,6 +585,43 @@ const getAppsTravisCI = async ({ accessToken }: { accessToken: string }) => {
return apps;
};
/**
* Return list of projects for Terraform Cloud integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Terraform Cloud API
* @param {String} obj.workspacesId - workspace id of Terraform Cloud projects
* @returns {Object[]} apps - names and ids of Terraform Cloud projects
* @returns {String} apps.name - name of Terraform Cloud projects
*/
const getAppsTerraformCloud = async ({
accessToken,
workspacesId
}: {
accessToken: string;
workspacesId?: string;
}) => {
const res = (
await standardRequest.get(`${INTEGRATION_TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${workspacesId}`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
},
})
).data.data;
const apps = []
const appsObj = {
name: res?.attributes.name,
appId: res?.id,
};
apps.push(appsObj)
return apps;
};
/**
* Return list of repositories for GitLab integration
* @param {Object} obj
@ -826,6 +885,39 @@ const getAppsBitBucket = async ({
return apps;
}
/** Return list of projects for Northflank integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Northflank API
* @returns {Object[]} apps - names of Northflank apps
* @returns {String} apps.name - name of Northflank app
*/
const getAppsNorthflank = async ({ accessToken }: { accessToken: string }) => {
const {
data: {
data: {
projects
}
}
} = await standardRequest.get(
`${INTEGRATION_NORTHFLANK_API_URL}/v1/projects`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
const apps = projects.map((a: any) => {
return {
name: a.name,
appId: a.id
};
});
return apps;
};
/**
* Return list of projects for Supabase integration
* @param {Object} obj
@ -856,6 +948,106 @@ const getAppsCodefresh = async ({
};
/**
* Return list of projects for Windmill integration
* @param {Object} obj
* @param {String} obj.accessToken - access token for Windmill API
* @returns {Object[]} apps - names of Windmill workspaces
* @returns {String} apps.name - name of Windmill workspace
*/
const getAppsWindmill = async ({ accessToken }: { accessToken: string }) => {
const { data } = await standardRequest.get(
`${INTEGRATION_WINDMILL_API_URL}/workspaces/list`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
// check for write access of secrets in windmill workspaces
const writeAccessCheck = data.map(async (app: any) => {
try {
const userPath = "u/user/variable";
const folderPath = "f/folder/variable";
const { data: writeUser } = await standardRequest.post(
`${INTEGRATION_WINDMILL_API_URL}/w/${app.id}/variables/create`,
{
path: userPath,
value: "variable",
is_secret: true,
description: "variable description"
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
const { data: writeFolder } = await standardRequest.post(
`${INTEGRATION_WINDMILL_API_URL}/w/${app.id}/variables/create`,
{
path: folderPath,
value: "variable",
is_secret: true,
description: "variable description"
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
// if write access is allowed then delete the created secrets from the workspace
if (writeUser && writeFolder) {
await standardRequest.delete(
`${INTEGRATION_WINDMILL_API_URL}/w/${app.id}/variables/delete/${userPath}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
await standardRequest.delete(
`${INTEGRATION_WINDMILL_API_URL}/w/${app.id}/variables/delete/${folderPath}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
return app;
} else {
return { error: "cannot write secret" };
}
} catch (err: any) {
return { error: err.message };
}
});
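// resolve all probes in parallel, then keep only the workspaces where both test writes succeeded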
const appsWriteResponses = await Promise.all(writeAccessCheck);
const appsWithWriteAccess = appsWriteResponses.filter((appRes: any) => !appRes.error);
const apps = appsWithWriteAccess.map((a: any) => {
return {
name: a.name,
appId: a.id,
};
});
return apps;
}
/**
* Return list of applications for DigitalOcean App Platform integration
* @param {Object} obj

View File

@ -36,16 +36,22 @@ import {
INTEGRATION_LARAVELFORGE_API_URL,
INTEGRATION_NETLIFY,
INTEGRATION_NETLIFY_API_URL,
INTEGRATION_NORTHFLANK,
INTEGRATION_NORTHFLANK_API_URL,
INTEGRATION_RAILWAY,
INTEGRATION_RAILWAY_API_URL,
INTEGRATION_RENDER,
INTEGRATION_RENDER_API_URL,
INTEGRATION_SUPABASE,
INTEGRATION_SUPABASE_API_URL,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_TERRAFORM_CLOUD_API_URL,
INTEGRATION_TRAVISCI,
INTEGRATION_TRAVISCI_API_URL,
INTEGRATION_VERCEL,
INTEGRATION_VERCEL_API_URL
INTEGRATION_VERCEL_API_URL,
INTEGRATION_WINDMILL,
INTEGRATION_WINDMILL_API_URL,
} from "../variables";
import AWS from "aws-sdk";
import { Octokit } from "@octokit/rest";
@ -61,6 +67,7 @@ import { standardRequest } from "../config/request";
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessId - access id for integration
* @param {String} obj.accessToken - access token for integration
* @param {Object} obj.secretComments - secret comments to push to integration (object where keys are secret keys and values are comment values)
*/
const syncSecrets = async ({
integration,
@ -68,12 +75,14 @@ const syncSecrets = async ({
secrets,
accessId,
accessToken,
secretComments
}: {
integration: IIntegration;
integrationAuth: IIntegrationAuth;
secrets: any;
accessId: string | null;
accessToken: string;
secretComments: any;
}) => {
switch (integration.integration) {
case INTEGRATION_AZURE_KEY_VAULT:
@ -193,6 +202,13 @@ const syncSecrets = async ({
accessToken,
});
break;
case INTEGRATION_TERRAFORM_CLOUD:
await syncSecretsTerraformCloud({
integration,
secrets,
accessToken,
});
break;
case INTEGRATION_HASHICORP_VAULT:
await syncSecretsHashiCorpVault({
integration,
@ -238,7 +254,22 @@ const syncSecrets = async ({
accessToken
});
break;
}
case INTEGRATION_NORTHFLANK:
await syncSecretsNorthflank({
integration,
secrets,
accessToken
});
break;
case INTEGRATION_WINDMILL:
await syncSecretsWindmill({
integration,
secrets,
accessToken,
secretComments
});
break;
}
};
/**
@ -1840,6 +1871,106 @@ const syncSecretsCheckly = async ({
}
};
/**
* Sync/push [secrets] to Terraform Cloud project with id [integration.appId]
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for Terraform Cloud API
*/
const syncSecretsTerraformCloud = async ({
integration,
secrets,
accessToken,
}: {
integration: IIntegration;
secrets: any;
accessToken: string;
}) => {
// get secrets from Terraform Cloud
const getSecretsRes = (
await standardRequest.get(`${INTEGRATION_TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
},
}
))
.data
.data
.reduce((obj: any, secret: any) => ({
...obj,
[secret.attributes.key]: secret
}), {});
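// getSecretsRes now maps each existing Terraform Cloud variable key to its full variable object,
// which the loops below use to decide between create, update, and delete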
// create or update secrets on Terraform Cloud
for await (const key of Object.keys(secrets)) {
if (!(key in getSecretsRes)) {
// case: secret does not exist in Terraform Cloud
// -> add secret
await standardRequest.post(
`${INTEGRATION_TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars`,
{
data: {
type: "vars",
attributes: {
key,
value: secrets[key],
category: integration.targetService,
},
},
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/vnd.api+json",
Accept: "application/vnd.api+json",
},
}
);
} else {
// case: secret exists in Terraform Cloud
if (secrets[key] !== getSecretsRes[key].attributes.value) {
// -> update secret
await standardRequest.patch(
`${INTEGRATION_TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars/${getSecretsRes[key].id}`,
{
data: {
type: "vars",
id: getSecretsRes[key].id,
attributes: {
...getSecretsRes[key],
value: secrets[key]
},
},
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/vnd.api+json",
Accept: "application/vnd.api+json",
},
}
);
}
}
}
for await (const key of Object.keys(getSecretsRes)) {
if (!(key in secrets)) {
// case: delete secret
await standardRequest.delete(`${INTEGRATION_TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars/${getSecretsRes[key].id}`, {
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/vnd.api+json",
Accept: "application/vnd.api+json",
},
})
}
}
};
/**
* Sync/push [secrets] to HashiCorp Vault path
* @param {Object} obj
@ -2126,7 +2257,7 @@ const syncSecretsCodefresh = async ({
* @param {IIntegration} obj.integration - integration details
* @param {IIntegrationAuth} obj.integrationAuth - integration auth details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - personal access token for DigitalOcean
* @param {String} obj.accessToken - access token for integration
*/
const syncSecretsDigitalOceanAppPlatform = async ({
integration,
@ -2154,6 +2285,114 @@ const syncSecretsDigitalOceanAppPlatform = async ({
);
}
/**
* Sync/push [secrets] to Windmill with name [integration.app]
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {IIntegrationAuth} obj.integrationAuth - integration auth details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for windmill integration
* @param {Object} obj.secretComments - secret comments to push to integration (object where keys are secret keys and values are comment values)
*/
const syncSecretsWindmill = async ({
integration,
secrets,
accessToken,
secretComments
}: {
integration: IIntegration;
secrets: any;
accessToken: string;
secretComments: any;
}) => {
interface WindmillSecret {
path: string;
value: string;
is_secret: boolean;
description?: string;
}
// get secrets stored in windmill workspace
const res = (await standardRequest.get(
`${INTEGRATION_WINDMILL_API_URL}/w/${integration.appId}/variables/list`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
))
.data
.reduce(
(obj: any, secret: WindmillSecret) => ({
...obj,
[secret.path]: secret
}),
{}
);
// eslint-disable-next-line no-useless-escape
const pattern = new RegExp("^(u\/|f\/)[a-zA-Z0-9_-]+\/([a-zA-Z0-9_-]+\/)*[a-zA-Z0-9_-]*[^\/]$");
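// e.g. "u/alice/db_password" and "f/team/ci/api_key" match; keys without a "u/" or "f/" prefix, or ending in "/", are skipped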
for await (const key of Object.keys(secrets)) {
if((key.startsWith("u/") || key.startsWith("f/")) && pattern.test(key)) {
if(!(key in res)) {
// case: secret does not exist in windmill
// -> create secret
await standardRequest.post(
`${INTEGRATION_WINDMILL_API_URL}/w/${integration.appId}/variables/create`,
{
path: key,
value: secrets[key],
is_secret: true,
description: secretComments[key] || ""
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
} else {
// -> update secret
await standardRequest.post(
`${INTEGRATION_WINDMILL_API_URL}/w/${integration.appId}/variables/update/${res[key].path}`,
{
path: key,
value: secrets[key],
is_secret: true,
description: secretComments[key] || ""
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
},
}
);
}
}
}
for await (const key of Object.keys(res)) {
if (!(key in secrets)) {
// -> delete secret
await standardRequest.delete(
`${INTEGRATION_WINDMILL_API_URL}/w/${integration.appId}/variables/delete/${res[key].path}`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/json",
"Accept-Encoding": "application/json",
}
}
);
}
}
}
/**
* Sync/push [secrets] to Cloud66 application with name [integration.app]
* @param {Object} obj
@ -2257,4 +2496,35 @@ const syncSecretsCloud66 = async ({
}
};
/** Sync/push [secrets] to Northflank
* @param {Object} obj
* @param {IIntegration} obj.integration - integration details
* @param {Object} obj.secrets - secrets to push to integration (object where keys are secret keys and values are secret values)
* @param {String} obj.accessToken - access token for Northflank integration
*/
const syncSecretsNorthflank = async ({
integration,
secrets,
accessToken
}: {
integration: IIntegration;
secrets: any;
accessToken: string;
}) => {
await standardRequest.patch(
`${INTEGRATION_NORTHFLANK_API_URL}/v1/projects/${integration.appId}/secrets/${integration.targetServiceId}`,
{
secrets: {
variables: secrets
}
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
};
export { syncSecrets };

View File

@ -18,6 +18,7 @@ const requireWorkspaceAuth = ({
requiredPermissions = [],
requireBlindIndicesEnabled = false,
requireE2EEOff = false,
checkIPAllowlist = false
}: {
acceptedRoles: Array<"admin" | "member">;
locationWorkspaceId: req;
@ -25,6 +26,7 @@ const requireWorkspaceAuth = ({
requiredPermissions?: string[];
requireBlindIndicesEnabled?: boolean;
requireE2EEOff?: boolean;
checkIPAllowlist?: boolean;
}) => {
return async (req: Request, res: Response, next: NextFunction) => {
const workspaceId = req[locationWorkspaceId]?.workspaceId;
@ -39,6 +41,7 @@ const requireWorkspaceAuth = ({
requiredPermissions,
requireBlindIndicesEnabled,
requireE2EEOff,
checkIPAllowlist
});
if (membership) {

View File

@ -16,11 +16,14 @@ import {
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
INTEGRATION_NORTHFLANK,
INTEGRATION_RAILWAY,
INTEGRATION_RENDER,
INTEGRATION_SUPABASE,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL
INTEGRATION_VERCEL,
INTEGRATION_WINDMILL
} from "../variables";
import { Schema, Types, model } from "mongoose";
@ -57,12 +60,15 @@ export interface IIntegration {
| "travisci"
| "supabase"
| "checkly"
| "terraform-cloud"
| "hashicorp-vault"
| "cloudflare-pages"
| "bitbucket"
| "codefresh"
| "digital-ocean-app-platform"
| "cloud-66"
| "northflank"
| "windmill";
integrationAuth: Types.ObjectId;
}
@ -150,12 +156,15 @@ const integrationSchema = new Schema<IIntegration>(
INTEGRATION_TRAVISCI,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CODEFRESH,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
],
required: true,
},
@ -168,7 +177,7 @@ const integrationSchema = new Schema<IIntegration>(
type: String,
required: true,
default: "/",
},
}
},
{
timestamps: true,

View File

@ -18,11 +18,14 @@ import {
INTEGRATION_HEROKU,
INTEGRATION_LARAVELFORGE,
INTEGRATION_NETLIFY,
INTEGRATION_NORTHFLANK,
INTEGRATION_RAILWAY,
INTEGRATION_RENDER,
INTEGRATION_SUPABASE,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_TRAVISCI,
INTEGRATION_VERCEL
INTEGRATION_VERCEL,
INTEGRATION_WINDMILL
} from "../variables";
import { Document, Schema, Types, model } from "mongoose";
@ -50,7 +53,10 @@ export interface IIntegrationAuth extends Document {
| "codefresh"
| "digital-ocean-app-platform"
| "bitbucket"
| "cloud-66";
| "cloud-66"
| "terraform-cloud"
| "northflank"
| "windmill";
teamId: string;
accountId: string;
url: string;
@ -94,12 +100,15 @@ const integrationAuthSchema = new Schema<IIntegrationAuth>(
INTEGRATION_LARAVELFORGE,
INTEGRATION_TRAVISCI,
INTEGRATION_SUPABASE,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CODEFRESH,
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
],
required: true,
},

View File

@ -3,7 +3,9 @@ import { Document, Schema, Types, model } from "mongoose";
export enum AuthProvider {
EMAIL = "email",
GOOGLE = "google",
OKTA_SAML = "okta-saml"
OKTA_SAML = "okta-saml",
AZURE_SAML = "azure-saml",
JUMPCLOUD_SAML = "jumpcloud-saml",
}
export interface IUser extends Document {

View File

@ -155,6 +155,20 @@ router.get(
integrationAuthController.getIntegrationAuthBitBucketWorkspaces
);
router.get(
"/:integrationAuthId/northflank/secret-groups",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
}),
requireIntegrationAuthorizationAuth({
acceptedRoles: [ADMIN, MEMBER],
}),
param("integrationAuthId").exists().isString(),
query("appId").exists().isString(),
validateRequest,
integrationAuthController.getIntegrationAuthNorthflankSecretGroups
);
router.delete(
"/:integrationAuthId",
requireAuth({

View File

@ -46,7 +46,7 @@ router.delete(
body("secretImportPath").isString().exists().trim(),
body("secretImportEnv").isString().exists().trim(),
validateRequest,
secretImportController.updateSecretImport
secretImportController.deleteSecretImport
);
router.get(

View File

@ -93,7 +93,7 @@ router.delete(
usersController.deleteAPIKey
);
router.get( // new
router.get(
"/me/sessions",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],
@ -101,7 +101,7 @@ router.get( // new
usersController.getMySessions
);
router.delete( // new
router.delete(
"/me/sessions",
requireAuth({
acceptedAuthModes: [AUTH_MODE_JWT],

View File

@ -56,7 +56,8 @@ router.get(
locationEnvironment: "query",
requiredPermissions: [PERMISSION_READ_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: true
requireE2EEOff: true,
checkIPAllowlist: false
}),
secretsController.getSecretByNameRaw
);
@ -84,7 +85,8 @@ router.post(
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: true
requireE2EEOff: true,
checkIPAllowlist: false
}),
secretsController.createSecretRaw
);
@ -112,7 +114,8 @@ router.patch(
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: true
requireE2EEOff: true,
checkIPAllowlist: false
}),
secretsController.updateSecretByNameRaw
);
@ -139,7 +142,8 @@ router.delete(
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: true
requireE2EEOff: true,
checkIPAllowlist: false
}),
secretsController.deleteSecretByNameRaw
);
@ -164,7 +168,8 @@ router.get(
locationEnvironment: "query",
requiredPermissions: [PERMISSION_READ_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: false
requireE2EEOff: false,
checkIPAllowlist: false
}),
secretsController.getSecrets
);
@ -199,7 +204,8 @@ router.post(
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: false
requireE2EEOff: false,
checkIPAllowlist: false
}),
secretsController.createSecret
);
@ -225,7 +231,8 @@ router.get(
locationWorkspaceId: "query",
locationEnvironment: "query",
requiredPermissions: [PERMISSION_READ_SECRETS],
requireBlindIndicesEnabled: true
requireBlindIndicesEnabled: true,
checkIPAllowlist: false
}),
secretsController.getSecretByName
);
@ -255,7 +262,8 @@ router.patch(
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: false
requireE2EEOff: false,
checkIPAllowlist: false
}),
secretsController.updateSecretByName
);
@ -282,7 +290,8 @@ router.delete(
locationEnvironment: "body",
requiredPermissions: [PERMISSION_WRITE_SECRETS],
requireBlindIndicesEnabled: true,
requireE2EEOff: false
requireE2EEOff: false,
checkIPAllowlist: false
}),
secretsController.deleteSecretByName
);
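
For readers skimming these route changes: `checkIPAllowlist` is just one more boolean on the options object passed to `requireWorkspaceAuth`. A minimal sketch of that options shape, with field names mirrored from the hunks above (the interface name and the permission string are placeholders, not the repo's actual identifiers):

```ts
// Sketch only: field names mirror the route options above; the middleware
// implementation itself is assumed to live elsewhere in the backend.
interface RequireWorkspaceAuthOptions {
  acceptedRoles: string[];
  locationWorkspaceId: "body" | "query" | "params";
  locationEnvironment?: "body" | "query" | "params";
  requiredPermissions?: string[];
  requireBlindIndicesEnabled: boolean;
  requireE2EEOff?: boolean;
  // When false, the trusted-IP allowlist check is skipped for the route.
  checkIPAllowlist: boolean;
}

const rawSecretReadOptions: RequireWorkspaceAuthOptions = {
  acceptedRoles: ["admin", "member"],
  locationWorkspaceId: "query",
  locationEnvironment: "query",
  requiredPermissions: ["read-secrets"], // stand-in for PERMISSION_READ_SECRETS
  requireBlindIndicesEnabled: true,
  requireE2EEOff: true,
  checkIPAllowlist: false
};
```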

View File

@ -5,6 +5,7 @@ import {
getIsWorkspaceE2EEHelper,
getKey,
getSecretsBotHelper,
getSecretsCommentBotHelper,
} from "../helpers/bot";
/**
@ -107,6 +108,30 @@ class BotService {
tag,
});
}
/**
* Return decrypted secret comments for workspace with id [workspaceId] and
* environment [environment] shared to bot.
* @param {Object} obj
* @param {String} obj.workspaceId - id of workspace of secrets
* @param {String} obj.environment - environment for secrets
* @returns {Object} secretObj - object where keys are secret keys and values are comments
*/
static async getSecretComments({
workspaceId,
environment,
secretPath
}: {
workspaceId: Types.ObjectId;
environment: string;
secretPath: string;
}) {
return await getSecretsCommentBotHelper({
workspaceId,
environment,
secretPath
});
}
}
export default BotService;
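
A hypothetical call site for the new `getSecretComments` helper; the workspace id below is a placeholder, and the call assumes the workspace's secrets are shared with the Infisical bot so the comments can be decrypted server-side:

```ts
import { Types } from "mongoose";
import { BotService } from "../services"; // import path assumed

const logSecretComments = async () => {
  const comments = await BotService.getSecretComments({
    workspaceId: new Types.ObjectId("0123456789abcdef01234567"), // placeholder id
    environment: "dev",
    secretPath: "/"
  });
  // keys are secret keys, values are the decrypted comments
  console.log(comments);
};
```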

View File

@ -19,7 +19,7 @@ import {
} from "../config";
import { getSSOConfigHelper } from "../ee/helpers/organizations";
import { InternalServerError, OrganizationNotFoundError } from "./errors";
import { INVITED, MEMBER } from "../variables";
import { ACCEPTED, INVITED, MEMBER } from "../variables";
import { getSiteURL } from "../config";
// eslint-disable-next-line @typescript-eslint/no-var-requires
@ -135,24 +135,38 @@ const initializePassport = async () => {
{
passReqToCallback: true,
getSamlOptions: async (req: any, done: any) => {
const { ssoIdentifier } = req.params;
const ssoConfig = await getSSOConfigHelper({
ssoConfigId: new Types.ObjectId(ssoIdentifier)
});
const samlConfig = ({
path: "/api/v1/auth/callback/saml",
callbackURL: `${await getSiteURL()}/api/v1/auth/callback/saml`,
entryPoint: ssoConfig.entryPoint,
issuer: ssoConfig.issuer,
cert: ssoConfig.cert,
audience: ssoConfig.audience
});
req.ssoConfig = ssoConfig;
done(null, samlConfig);
const { ssoIdentifier } = req.params;
const ssoConfig = await getSSOConfigHelper({
ssoConfigId: new Types.ObjectId(ssoIdentifier)
});
interface ISAMLConfig {
path: string;
callbackURL: string;
entryPoint: string;
issuer: string;
cert: string;
audience: string;
wantAuthnResponseSigned?: boolean;
}
const samlConfig: ISAMLConfig = ({
path: `${await getSiteURL()}/api/v1/sso/saml2/${ssoIdentifier}`,
callbackURL: `${await getSiteURL()}/api/v1/sso/saml2/${ssoIdentifier}`,
entryPoint: ssoConfig.entryPoint,
issuer: ssoConfig.issuer,
cert: ssoConfig.cert,
audience: await getSiteURL()
});
if (ssoConfig.authProvider === AuthProvider.JUMPCLOUD_SAML) {
samlConfig.wantAuthnResponseSigned = false;
}
req.ssoConfig = ssoConfig;
done(null, samlConfig);
},
},
async (req: any, profile: any, done: any) => {
@ -161,7 +175,7 @@ const initializePassport = async () => {
const organization = await Organization.findById(req.ssoConfig.organization);
if (!organization) return done(OrganizationNotFoundError());
const email = profile.email;
const firstName = profile.firstName;
const lastName = profile.lastName;
@ -170,15 +184,44 @@ const initializePassport = async () => {
email
}).select("+publicKey");
if (user && user.authProvider !== AuthProvider.OKTA_SAML) {
done(InternalServerError());
}
if (!user) {
if (user) {
if (!user.authProvider || user.authProvider === AuthProvider.EMAIL || user.authProvider === AuthProvider.GOOGLE) {
await User.findByIdAndUpdate(
user._id,
{
authProvider: req.ssoConfig.authProvider
},
{
new: true
}
);
}
let membershipOrg = await MembershipOrg.findOne(
{
user: user._id,
organization: organization._id
}
);
if (!membershipOrg) {
membershipOrg = await new MembershipOrg({
inviteEmail: email,
user: user._id,
organization: organization._id,
role: MEMBER,
status: ACCEPTED
}).save();
}
if (membershipOrg.status === INVITED) {
membershipOrg.status = ACCEPTED;
await membershipOrg.save();
}
} else {
user = await new User({
email,
authProvider: AuthProvider.OKTA_SAML,
authId: profile.id,
authProvider: req.ssoConfig.authProvider,
firstName,
lastName
}).save();
@ -186,7 +229,7 @@ const initializePassport = async () => {
await new MembershipOrg({
inviteEmail: email,
user: user._id,
organization: organization?._id,
organization: organization._id,
role: MEMBER,
status: INVITED
}).save();
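
To make the SAML changes easier to follow, here is the per-provider option assembly distilled into a hypothetical helper; the helper name and parameters are ours, while the config fields and the JumpCloud special case come from the diff above:

```ts
interface SamlOptions {
  path: string;
  callbackURL: string;
  entryPoint: string;
  issuer: string;
  cert: string;
  audience: string;
  wantAuthnResponseSigned?: boolean;
}

// Hypothetical helper mirroring the getSamlOptions callback above.
const buildSamlOptions = (
  ssoConfig: { entryPoint: string; issuer: string; cert: string; authProvider: string },
  siteUrl: string,
  ssoIdentifier: string
): SamlOptions => {
  const options: SamlOptions = {
    path: `${siteUrl}/api/v1/sso/saml2/${ssoIdentifier}`,
    callbackURL: `${siteUrl}/api/v1/sso/saml2/${ssoIdentifier}`,
    entryPoint: ssoConfig.entryPoint,
    issuer: ssoConfig.issuer,
    cert: ssoConfig.cert,
    audience: siteUrl
  };
  // The JumpCloud setup signs the assertion rather than the full response,
  // so response-signature verification is relaxed for that provider.
  if (ssoConfig.authProvider === "jumpcloud-saml") {
    options.wantAuthnResponseSigned = false;
  }
  return options;
};
```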

View File

@ -0,0 +1 @@
export * from "./ip";

101
backend/src/utils/ip/ip.ts Normal file
View File

@ -0,0 +1,101 @@
import net from "net";
import { IPType } from "../../ee/models";
import { InternalServerError } from "../errors";
/**
* Return details of IP [ip]:
* - If [ip] is a specific IP address then return the IPv4/IPv6 address
* - If [ip] is a subnet then return the network IPv4/IPv6 address and prefix
* @param {String} ip - ip whose details to return
* @returns
*/
export const extractIPDetails = (ip: string) => {
if (net.isIPv4(ip)) return ({
ipAddress: ip,
type: IPType.IPV4
});
if (net.isIPv6(ip)) return ({
ipAddress: ip,
type: IPType.IPV6
});
const [ipNet, prefix] = ip.split("/");
let type;
switch (net.isIP(ipNet)) {
case 4:
type = IPType.IPV4;
break;
case 6:
type = IPType.IPV6;
break;
default:
throw InternalServerError({
message: "Failed to extract IP details"
});
}
return ({
ipAddress: ipNet,
type,
prefix: parseInt(prefix, 10)
});
}
/**
* Checks if a given string is a valid CIDR block.
*
* The function checks if the input string is a valid IPv4 or IPv6 address in CIDR notation.
*
* CIDR notation includes a network address followed by a slash ('/') and a prefix length.
* For IPv4, the prefix length must be between 0 and 32. For IPv6, it must be between 0 and 128.
* If the input string is not a valid CIDR block, the function returns `false`.
*
* @param {string} cidr - string in CIDR notation
* @returns {boolean} Returns `true` if the string is a valid CIDR block, `false` otherwise.
*
*/
export const isValidCidr = (cidr: string): boolean => {
const [ip, prefix] = cidr.split("/");
const prefixNum = parseInt(prefix, 10);
// ensure prefix exists and is a number within the appropriate range for each IP version
if (!prefix || isNaN(prefixNum) ||
(net.isIPv4(ip) && (prefixNum < 0 || prefixNum > 32)) ||
(net.isIPv6(ip) && (prefixNum < 0 || prefixNum > 128))) {
return false;
}
// ensure the IP portion of the CIDR block is a valid IPv4 or IPv6 address
if (!net.isIPv4(ip) && !net.isIPv6(ip)) {
return false;
}
return true;
}
/**
* Checks if a given string is a valid IPv4/IPv6 address or a valid CIDR block.
*
* If the string contains a slash ('/'), it treats the input as a CIDR block and checks its validity.
* Otherwise, it treats the string as a standalone IP address (either IPv4 or IPv6) and checks its validity.
*
* @param {string} ip - The string to be checked. It could be an IP address or a CIDR block.
* @returns {boolean} Returns `true` if the string is a valid IP address (either IPv4 or IPv6) or a valid CIDR block, `false` otherwise.
*
*/
export const isValidIpOrCidr = (ip: string): boolean => {
// if the string contains a slash, treat it as a CIDR block
if (ip.includes("/")) {
return isValidCidr(ip);
}
// otherwise, treat it as a standalone IP address
if (net.isIPv4(ip) || net.isIPv6(ip)) {
return true;
}
return false;
}
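
A few illustrative calls to the new helpers; the results in the comments follow directly from the implementation above (with `IPType` values written symbolically):

```ts
import { extractIPDetails, isValidCidr, isValidIpOrCidr } from "./ip"; // path as in the new util

// Plain addresses come back without a prefix.
extractIPDetails("192.0.2.1");     // { ipAddress: "192.0.2.1", type: IPType.IPV4 }

// CIDR blocks are split into a network address and a prefix length.
extractIPDetails("10.0.0.0/8");    // { ipAddress: "10.0.0.0", type: IPType.IPV4, prefix: 8 }
extractIPDetails("2001:db8::/32"); // { ipAddress: "2001:db8::", type: IPType.IPV6, prefix: 32 }

// Validation helpers accept standalone IPs and CIDR notation.
isValidCidr("192.0.2.0/24");  // true
isValidCidr("192.0.2.0/33");  // false (IPv4 prefix out of range)
isValidIpOrCidr("::1");       // true
isValidIpOrCidr("not-an-ip"); // false
```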

View File

@ -3,7 +3,13 @@ import crypto from "crypto";
import { Types } from "mongoose";
import { encryptSymmetric128BitHexKeyUTF8 } from "../crypto";
import { EESecretService } from "../../ee/services";
import { ISecretVersion, SecretSnapshot, SecretVersion } from "../../ee/models";
import {
IPType,
ISecretVersion,
SecretSnapshot,
SecretVersion,
TrustedIP
} from "../../ee/models";
import {
BackupPrivateKey,
Bot,
@ -549,3 +555,79 @@ export const backfillServiceTokenMultiScope = async () => {
console.log("Migration: Service token migration v2 complete");
};
/**
* Backfill each workspace without any registered trusted IPs to
* have default trusted ip of 0.0.0.0/0
*/
export const backfillTrustedIps = async () => {
const workspaceIdsWithTrustedIps = await TrustedIP.distinct("workspace");
const workspaceIdsToAddTrustedIp = await Workspace.distinct("_id", {
_id: {
$nin: workspaceIdsWithTrustedIps
}
});
if (workspaceIdsToAddTrustedIp.length > 0) {
const operations: {
updateOne: {
filter: {
workspace: Types.ObjectId;
ipAddress: string;
},
update: {
workspace: Types.ObjectId;
ipAddress: string;
type: string;
prefix: number;
isActive: boolean;
comment: string;
},
upsert: boolean;
}
}[] = [];
workspaceIdsToAddTrustedIp.forEach((workspaceId) => {
// default IPv4 trusted CIDR
operations.push({
updateOne: {
filter: {
workspace: workspaceId,
ipAddress: "0.0.0.0"
},
update: {
workspace: workspaceId,
ipAddress: "0.0.0.0",
type: IPType.IPV4.toString(),
prefix: 0,
isActive: true,
comment: ""
},
upsert: true
}
});
// default IPv6 trusted CIDR
operations.push({
updateOne: {
filter: {
workspace: workspaceId,
ipAddress: "::"
},
update: {
workspace: workspaceId,
ipAddress: "::",
type: IPType.IPV6.toString(),
prefix: 0,
isActive: true,
comment: ""
},
upsert: true
}
});
});
await TrustedIP.bulkWrite(operations);
console.log("Backfill: Trusted IPs complete");
}
}
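
After the backfill, a workspace that previously had no trusted IPs ends up with two allow-all entries. A sketch of those documents (ids and timestamps omitted, and assuming the `IPType` enum members serialize to `"ipv4"`/`"ipv6"`):

```ts
// Default entries upserted per workspace by backfillTrustedIps; together they
// permit all IPv4 and IPv6 traffic until an admin replaces them.
const defaultTrustedIps = [
  { ipAddress: "0.0.0.0", type: "ipv4", prefix: 0, isActive: true, comment: "" },
  { ipAddress: "::", type: "ipv6", prefix: 0, isActive: true, comment: "" }
];
```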

View File

@ -15,7 +15,8 @@ import {
backfillSecretFolders,
backfillSecretVersions,
backfillServiceToken,
backfillServiceTokenMultiScope
backfillServiceTokenMultiScope,
backfillTrustedIps
} from "./backfillData";
import {
reencryptBotOrgKeys,
@ -84,6 +85,7 @@ export const setup = async () => {
await backfillServiceToken();
await backfillIntegration();
await backfillServiceTokenMultiScope();
await backfillTrustedIps();
// re-encrypt any data previously encrypted under server hex 128-bit ENCRYPTION_KEY
// to base64 256-bit ROOT_ENCRYPTION_KEY

View File

@ -1,14 +1,15 @@
import net from "net";
import { Types } from "mongoose";
import {
IServiceAccount,
IServiceTokenData,
IUser,
SecretBlindIndexData,
ServiceAccount,
ServiceTokenData,
User,
Workspace,
} from "../models";
import {
TrustedIP
} from "../ee/models";
import { validateServiceAccountClientForWorkspace } from "./serviceAccount";
import { validateUserClientForWorkspace } from "./user";
import { validateServiceTokenDataClientForWorkspace } from "./serviceTokenData";
@ -24,6 +25,8 @@ import {
AUTH_MODE_SERVICE_TOKEN,
} from "../variables";
import { BotService } from "../services";
import { AuthData } from "../interfaces/middleware";
import { extractIPDetails } from "../utils/ip";
/**
* Validate authenticated clients for workspace with id [workspaceId] based
@ -43,17 +46,16 @@ export const validateClientForWorkspace = async ({
requiredPermissions,
requireBlindIndicesEnabled,
requireE2EEOff,
checkIPAllowlist
}: {
authData: {
authMode: string;
authPayload: IUser | IServiceAccount | IServiceTokenData;
};
authData: AuthData;
workspaceId: Types.ObjectId;
environment?: string;
acceptedRoles: Array<"admin" | "member">;
requiredPermissions?: string[];
requireBlindIndicesEnabled: boolean;
requireE2EEOff: boolean;
checkIPAllowlist: boolean;
}) => {
const workspace = await Workspace.findById(workspaceId);
@ -82,6 +84,8 @@ export const validateClientForWorkspace = async ({
message: "Failed workspace authorization due to end-to-end encryption not being disabled",
});
}
if (authData.authMode === AUTH_MODE_JWT && authData.authPayload instanceof User) {
const membership = await validateUserClientForWorkspace({
@ -107,6 +111,40 @@ export const validateClientForWorkspace = async ({
}
if (authData.authMode === AUTH_MODE_SERVICE_TOKEN && authData.authPayload instanceof ServiceTokenData) {
if (checkIPAllowlist) {
const trustedIps = await TrustedIP.find({
workspace: workspaceId
});
if (trustedIps.length > 0) {
// case: check the IP address of the inbound request against trusted IPs
const blockList = new net.BlockList();
for (const trustedIp of trustedIps) {
if (trustedIp.prefix !== undefined) {
blockList.addSubnet(
trustedIp.ipAddress,
trustedIp.prefix,
trustedIp.type
);
} else {
blockList.addAddress(
trustedIp.ipAddress,
trustedIp.type
);
}
}
const { type } = extractIPDetails(authData.authIP);
const check = blockList.check(authData.authIP, type);
if (!check) throw UnauthorizedRequestError({
message: "Failed workspace authorization"
});
}
}
await validateServiceTokenDataClientForWorkspace({
serviceTokenData: authData.authPayload,
workspaceId,
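
The allowlist check above is built on Node's `net.BlockList`. A self-contained sketch of that mechanism outside the middleware, with illustrative addresses:

```ts
import net from "net";

// Treat the BlockList as an allowlist: single addresses via addAddress,
// CIDR ranges via addSubnet.
const allowList = new net.BlockList();
allowList.addAddress("203.0.113.7", "ipv4");
allowList.addSubnet("10.0.0.0", 8, "ipv4");
allowList.addSubnet("2001:db8::", 32, "ipv6");

// check() returns true when the address falls inside any entry; a false
// result means the requesting IP is untrusted and the request is rejected.
allowList.check("10.42.0.1", "ipv4");    // true
allowList.check("198.51.100.4", "ipv4"); // false
allowList.check("2001:db8::1", "ipv6");  // true
```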

View File

@ -25,12 +25,15 @@ export const INTEGRATION_CIRCLECI = "circleci";
export const INTEGRATION_TRAVISCI = "travisci";
export const INTEGRATION_SUPABASE = "supabase";
export const INTEGRATION_CHECKLY = "checkly";
export const INTEGRATION_TERRAFORM_CLOUD = "terraform-cloud";
export const INTEGRATION_HASHICORP_VAULT = "hashicorp-vault";
export const INTEGRATION_CLOUDFLARE_PAGES = "cloudflare-pages";
export const INTEGRATION_BITBUCKET = "bitbucket";
export const INTEGRATION_CODEFRESH = "codefresh";
export const INTEGRATION_WINDMILL = "windmill";
export const INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM = "digital-ocean-app-platform";
export const INTEGRATION_CLOUD_66 = "cloud-66";
export const INTEGRATION_NORTHFLANK = "northflank";
export const INTEGRATION_SET = new Set([
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
@ -45,12 +48,15 @@ export const INTEGRATION_SET = new Set([
INTEGRATION_TRAVISCI,
INTEGRATION_SUPABASE,
INTEGRATION_CHECKLY,
INTEGRATION_TERRAFORM_CLOUD,
INTEGRATION_HASHICORP_VAULT,
INTEGRATION_CLOUDFLARE_PAGES,
INTEGRATION_CODEFRESH,
INTEGRATION_WINDMILL,
INTEGRATION_BITBUCKET,
INTEGRATION_DIGITAL_OCEAN_APP_PLATFORM,
INTEGRATION_CODEFRESH,
INTEGRATION_CLOUD_66
INTEGRATION_CLOUD_66,
INTEGRATION_NORTHFLANK
]);
// integration types
@ -80,11 +86,14 @@ export const INTEGRATION_TRAVISCI_API_URL = "https://api.travis-ci.com";
export const INTEGRATION_SUPABASE_API_URL = "https://api.supabase.com";
export const INTEGRATION_LARAVELFORGE_API_URL = "https://forge.laravel.com";
export const INTEGRATION_CHECKLY_API_URL = "https://api.checklyhq.com";
export const INTEGRATION_TERRAFORM_CLOUD_API_URL = "https://app.terraform.io";
export const INTEGRATION_CLOUDFLARE_PAGES_API_URL = "https://api.cloudflare.com";
export const INTEGRATION_BITBUCKET_API_URL = "https://api.bitbucket.org";
export const INTEGRATION_CODEFRESH_API_URL = "https://g.codefresh.io/api";
export const INTEGRATION_WINDMILL_API_URL = "https://app.windmill.dev/api";
export const INTEGRATION_DIGITAL_OCEAN_API_URL = "https://api.digitalocean.com";
export const INTEGRATION_CLOUD_66_API_URL = "https://app.cloud66.com/api";
export const INTEGRATION_NORTHFLANK_API_URL = "https://api.northflank.com";
export const getIntegrationOptions = async () => {
const INTEGRATION_OPTIONS = [
@ -206,6 +215,15 @@ export const getIntegrationOptions = async () => {
clientId: await getClientIdGitLab(),
docsLink: "",
},
{
name: "Terraform Cloud",
slug: "terraform-cloud",
image: "Terraform Cloud.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Travis CI",
slug: "travisci",
@ -278,6 +296,15 @@ export const getIntegrationOptions = async () => {
clientId: "",
docsLink: "",
},
{
name: "Windmill",
slug: "windmill",
image: "Windmill.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: "",
},
{
name: "Digital Ocean App Platform",
slug: "digital-ocean-app-platform",
@ -296,6 +323,15 @@ export const getIntegrationOptions = async () => {
clientId: "",
docsLink: "",
},
{
name: "Northflank",
slug: "northflank",
image: "Northflank.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
]
return INTEGRATION_OPTIONS;
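
Each of the new entries (Terraform Cloud, Windmill, Northflank) follows the same shape as the existing options. A sketch of that shape as a TypeScript interface; the interface name is ours, while the fields mirror the objects above:

```ts
interface IntegrationOption {
  name: string;        // display name, e.g. "Northflank"
  slug: string;        // matches the INTEGRATION_* constant value, e.g. "northflank"
  image: string;       // logo filename rendered by the frontend
  isAvailable: boolean;
  type: string;        // "oauth" or "pat"; the new entries all use personal access tokens
  clientId: string;    // only populated for OAuth-based integrations
  docsLink: string;
}

const northflankOption: IntegrationOption = {
  name: "Northflank",
  slug: "northflank",
  image: "Northflank.png",
  isAvailable: true,
  type: "pat",
  clientId: "",
  docsLink: ""
};
```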

View File

@ -143,13 +143,13 @@ var runCmd = &cobra.Command{
err = executeMultipleCommandWithEnvs(command, len(secretsByKey), env)
if err != nil {
util.HandleError(err, "Unable to execute your chained command")
fmt.Println(err)
}
} else {
err = executeSingleCommandWithEnvs(args, len(secretsByKey), env)
if err != nil {
util.HandleError(err, "Unable to execute your single command")
fmt.Println(err)
}
}
},

View File

@ -2,23 +2,31 @@
title: "Changelog"
---
The changelog below reflects new product developments and updates on a monthly basis; it will be updated later this quarter to include issues-addressed on a weekly basis.
The changelog below reflects new product developments and updates on a monthly basis.
## July 2023
- Released [secret referencing and importing](https://infisical.com/docs/documentation/platform/secret-reference) across folders and environments.
- Added the [intergation with Laravel Forge](https://infisical.com/docs/integrations/cloud/laravel-forge).
- Released [secret referencing and importing](https://infisical.com/docs/documentation/platform/secret-reference) across folders and environments.
- Redesigned the project/organization experience.
- Updated the secrets overview page; users are now able to edit secrets directly from it.
- Added native [Laravel Forge integration](https://infisical.com/docs/integrations/cloud/laravel-forge).
- Added native [Codefresh integration](https://infisical.com/docs/integrations/cicd/codefresh)
- Added native [Bitbucket integration](https://infisical.com/docs/integrations/cicd/bitbucket)
- Added native [DigitalOcean App Platform integration](https://infisical.com/docs/integrations/cloud/digital-ocean-app-platform)
- Added native [Cloud66 integration](https://infisical.com/docs/integrations/cloud/cloud-66)
- Added support for Google SSO.
- Added support for [Okta](https://infisical.com/docs/documentation/platform/sso/okta) and [Azure AD](https://infisical.com/docs/documentation/platform/sso/azure) [SAML 2.0](https://infisical.com/docs/documentation/platform/saml) authentication
- Released [folders / path-based secret storage](https://infisical.com/docs/documentation/platform/folder)
- Released [webhooks](https://infisical.com/docs/documentation/platform/webhooks)
## June 2023
- Released the [Terraform Provider](https://infisical.com/docs/integrations/frameworks/terraform#5-run-terraform).
- Updated the usage and billing page. Added the free trial for the professional tier.
- Added the intergation with [Checkly](https://infisical.com/docs/integrations/cloud/checkly), [Hashicorp Vault](https://infisical.com/docs/integrations/cloud/hashicorp-vault), and [Cloudflare Pages](https://infisical.com/docs/integrations/cloud/cloudflare-pages).
- Comleted a penetration test with a `very good` result.
- Added native integrations with [Checkly](https://infisical.com/docs/integrations/cloud/checkly), [Hashicorp Vault](https://infisical.com/docs/integrations/cloud/hashicorp-vault), and [Cloudflare Pages](https://infisical.com/docs/integrations/cloud/cloudflare-pages).
- Completed a penetration test with a `very good` result.
- Added support for multi-line secrets.
## May 2023
- Released secret scanning capability for the CLI.
@ -26,7 +34,7 @@ The changelog below reflects new product developments and updates on a monthly b
- Completed penetration test.
- Released new landing page.
- Started SOC 2 (Type II) compliance certification preparation.
- Released new deployment options for Fly.io, Digital Ocean and Render.
- Released new deployment options for Fly.io, Digital Ocean and Render.
## April 2023

View File

@ -8,7 +8,7 @@ The distinguishing factor, however, is the authentication method used.
<Tabs>
<Tab title="Local development">
To use the Infisical CLI in your development environment, simply run the command below and follow the interactive guide.
To use the Infisical CLI in your local development environment, simply run the command below and follow the interactive guide.
```bash
infisical login

View File

@ -82,4 +82,28 @@ Password: `testInfisical1`
```bash
# To stop environment use Control+C (on Mac) CTRL+C (on Win) or
docker-compose -f docker-compose.dev.yml down
```
## Starting Infisical docs locally
We use [Mintlify](https://mintlify.com/) for our docs.
#### Install Mintlify CLI.
```bash
npm i -g mintlify
```
or
```bash
yarn global add mintlify
```
#### Running the docs
Go to the `docs` directory and run `mintlify dev`. This will start the docs on `localhost:3000`.
```bash
# From the root directory
cd docs; mintlify dev;
```

View File

@ -0,0 +1,24 @@
---
title: "IP Allowlisting"
description: "Restrict access to your secrets in Infisical using trusted IPs"
---
Projects in Infisical can be configured to restrict client access to specific IP addresses or CIDR ranges. This applies to any client using service tokens and
can be useful, for example, for limiting access to traffic coming from corporate networks.
By default, each project is initialized with the `0.0.0.0/0` entry, representing all possible IPv4 addresses.
For enhanced security, we strongly recommend replacing the default entry with your client IPs to tighten access to your secrets.
<Note>
You must be a project `admin` to manage your project's IP whitelist.
</Note>
![IP whitelist](../../images/project-ip-whitelist.png)
## Creating a trusted IP entry
To create a trusted IP entry, head over to the **IP Whitelist** tab in your project. When creating an entry,
you can specify either a specific IP address like `192.0.2.1` or a CIDR range like `2001:db8::/32`; both IPv4 and IPv6
formats are accepted.
![IP whitelist add](../../images/project-ip-whitelist-add.png)

View File

@ -1,100 +0,0 @@
---
title: "SSO"
description: "Log in to Infisical via SSO protocols"
---
<Warning>
Infisical currently only supports SAML SSO authentication with [Okta as the
identity provider (IDP)](https://www.okta.com/). We're expanding support for
other IDPs in the coming months, so stay tuned with this issue
[here](https://github.com/Infisical/infisical/issues/442).
</Warning>
You can configure your organization in Infisical to have members authenticate with the platform via protocols like [SAML 2.0](https://en.wikipedia.org/wiki/SAML_2.0).
To note, configuring SSO retains the end-to-end encrypted architecture of Infisical because we decouple the **authentication** and **decryption** steps. In all login with SSO implementations,
your IDP cannot and will not have access to the decryption key needed to decrypt your secrets.
## Configuration
Head over to your organization Settings > Authentication > SAML SSO Configuration.
Next, press "Set up SAML SSO" in the SAML SSO and follow the instructions
below to configure SSO for your identity provider:
<Note>
Note that only members with the `owner` or `admin` roles in an organization
can configure SSO for it.
</Note>
<AccordionGroup>
<Accordion title="Okta SAML 2.0">
1. In the Okta Admin Portal, select Applications > Applications from the
navigation. On the Applications screen, select the Create App Integration
button.
![SAML Okta create app integration](../../images/saml-okta-1.png)
2. In the Create a New Application Integration dialog, select the SAML 2.0 radio button:
![SAML Okta create SAML 2.0 integration](../../images/saml-okta-2.png)
3. On the General Settings screen, give the application a unique, Infisical-specific name and select Next.
4. On the Configure SAML screen, configure the following fields:
- Single sign on URL: `https://app.infisical.com/api/v1/sso/saml2/:identifier`; we'll update the `:identifier` part later in step 6.
- Audience URI (SP Entity ID): `https://app.infisical.com`
![SAML Okta configure IDP fields](../../images/saml-okta-3.png)
<Note>
If you're self-hosting Infisical, then you will want to replace `https://app.infisical.com` with your own domain.
</Note>
4. Also on the Configure SAML screen, configure the Attribute Statements to map:
- `id -> user.id`,
- `email -> user.email`,
- `firstName -> user.firstName`
- `lastName -> user.lastName`
![SAML Okta attribute statements](../../images/saml-okta-4.png)
Once configured, select the Next button to proceed to the Feedback screen and select Finish.
5. Get IDP values
Once your application is created, select the Sign On tab for the app and select the View Setup Instructions button located on the right side of the screen:
Copy the Identity Provider Single Sign-On URL, the Identity Provider Issuer, and the X.509 Certificate to be pasted into your Infisical SAML SSO configuration details with the following map:
- `Audience -> Okta Audience URI (SP Entity ID)`
- `Entrypoint -> Okta Identity Provider Single Sign-On URL`
- `Issuer -> Identity Provider Issuer`
- `Certificate -> X.509 Certificate`.
![SAML Okta IDP values](../../images/saml-okta-5.png)
![SAML Okta paste values into Infisical](../../images/saml-okta-6.png)
6. Create the SSO configuration and copy your SSO identifier in Infisical; update `:identifier` from step 4 earlier to be this value.
![SAML Okta assignments](../../images/saml-okta-7.png)
7. Assignments
Finally, Navigate to the Assignments tab and select the Assign button:
You can assign access to the application on a user-by-user basis using the Assign to People option, or in-bulk using the Assign to Groups option.
![SAML Okta assignment](../../images/saml-okta-8.png)
At this point, you have configured everything you need within the context of the Okta Admin Portal.
8. Return to Infisical and enable SAML SSO.
Enabling SAML SSO enforces all members in your organization to only be able to log into Infisical via Okta.
</Accordion>
</AccordionGroup>

View File

@ -45,8 +45,8 @@ To add an import, simply click on the `Add import` button and provide the enviro
![secret import change order](../../images/secret-import-add.png)
The hierarchy of importing secrets is governed by a "last-one-wins" rule. This means the sequence in which you import matters - the final folder imported will override secrets from any prior folders.
Moreover, any secrets you define directly in your environment will take precedence over secrets from any imported folders.
Additionally, any secrets you define directly in your environment will override any secrets that are imported with the same name.
You can modify this sequence by dragging and rearranging the folders using the `Change Order` drag handle.
You can modify the order of folders to control overrides using the `Change Order` drag handle.
![secret import change order](../../images/secret-import-change-order.png)

View File

@ -0,0 +1,84 @@
---
title: "Azure SAML"
description: "Configure Azure SAML for Infisical SSO"
---
1. In Infisical, head over to your organization Settings > Authentication > SAML SSO Configuration and select **Set up SAML SSO**.
Next, copy the **Reply URL (Assertion Consumer Service URL)** and **Identifier (Entity ID)** to use when configuring the Azure SAML application.
![Azure SAML initial configuration](../../../images/sso/azure/init-config.png)
2. In the Azure Portal, navigate to the Azure Active Directory and select **Enterprise applications**. On this screen, select
**+ New application**.
![Azure SAML enterprise applications](../../../images/sso/azure/enterprise-applications.png)
![Azure SAML new application](../../../images/sso/azure/new-application.png)
3. On the next screen, press the **+ Create your own application** button.
Give the application a unique name like Infisical; choose the "Integrate any other application you don't find in the gallery (Non-gallery)"
option and hit the **Create** button.
![Azure SAML create own application](../../../images/sso/azure/create-own-application.png)
4. On the application overview screen, select **Single sign-on** from the left sidebar. From there,
select the **SAML** single sign-on method.
![Azure SAML sign on method](../../../images/sso/azure/sso-method.png)
5. Next, select **Edit** in the **Basic SAML Configuration** section and add/set the **Identifier (Entity ID)**
to **Entity ID** and add/set the **Reply URL (Assertion Consumer Service URL)** to **ACS URL** from step 1.
![Azure SAML edit basic configuration](../../../images/sso/azure/edit-basic-config.png)
![Azure SAML edit basic configuration 2](../../../images/sso/azure/edit-basic-config-2.png)
<Note>
If you're self-hosting Infisical, then you will want to replace
`https://app.infisical.com` with your own domain.
</Note>
6. Back in the **Set up Single Sign-On with SAML** screen, select **Edit** in the **Attributes & Claims** section and configure the following map:
- `email -> user.userprincipalname`
- `firstName -> user.firstName`
- `lastName -> user.lastName`
![Azure SAML edit attributes and claims](../../../images/sso/azure/edit-attributes-claims.png)
![Azure SAML edit attributes and claims 2](../../../images/sso/azure/edit-attributes-claims-2.png)
7. Back in the **Set up Single Sign-On with SAML** screen, select **Edit** in the **SAML Certificates** section and set the **Signing Option** field to **Sign SAML response and assertion**.
![Azure SAML edit certificate](../../../images/sso/azure/edit-saml-certificate.png)
![Azure SAML edit certificate signing option](../../../images/sso/azure/edit-saml-certificate-2.png)
8. Get IdP values:
Back in the **Set up Single Sign-On with SAML** screen, copy the **Login URL**, **Azure AD Identifier** and **SAML Certificate** to use when finishing configuring Azure SAML in Infisical.
Back in Infisical, set **Login URL** and **Azure AD Identifier** from above. Once you've done that, press **Update** to complete the required configuration.
![Azure SAML identity provider values](../../../images/sso/azure/idp-values.png)
![Azure SAML paste identity provider values](../../../images/sso/azure/idp-values-2.png)
<Note>
When pasting the certificate into Infisical, you'll want to retain `-----BEGIN
CERTIFICATE-----` and `-----END CERTIFICATE-----` at the first and last line
of the text area respectively.
Having trouble? Try copying the X.509 certificate information from the Federation Metadata XML file in Azure.
</Note>
9. Assignments
Back in Azure, navigate to the **Users and groups** tab and select **+ Add user/group** to assign access to the login with SSO application on a user or group-level.
![Azure SAML assignment](../../../images/sso/azure/assignment.png)
10. Return to Infisical and enable SAML SSO.
Enabling SAML SSO enforces all members in your organization to only be able to log into Infisical via Azure.
![Azure SAML assignment](../../../images/sso/azure/enable-saml.png)

View File

@ -0,0 +1,67 @@
---
title: "JumpCloud SAML"
description: "Configure JumpCloud SAML for Infisical SSO"
---
1. In Infisical, head over to your organization Settings > Authentication > SAML SSO Configuration and select **Set up SAML SSO**.
Next, copy the **ACS URL** and **SP Entity ID** to use when configuring the JumpCloud SAML application.
![JumpCloud SAML initial configuration](../../../images/sso/jumpcloud/init-config.png)
2. In the JumpCloud Admin Portal, navigate to User Authentication > SSO and create an application. If this is your first application, select **Get Started**;
if not, select **+Add New Application**
![JumpCloud SAML new application](../../../images/sso/jumpcloud/new-application.png)
3. Next, select **Custom SAML App** to open up the **New SSO** dialog.
![JumpCloud custom SAML app](../../../images/sso/jumpcloud/custom-saml-app.png)
4. In the **General Info** tab, give the application a unique name like Infisical.
![JumpCloud general info](../../../images/sso/jumpcloud/general-info.png)
5. In the **SSO** tab, set the **SP Entity ID** and **ACS URL** from step 1; set the **IdP Entity ID** to the same value as the **SP Entity ID**.
![JumpCloud edit basic config](../../../images/sso/jumpcloud/edit-basic-config.png)
6. On the same tab, check the **Sign Assertion** checkbox and set the **IDP URL** to something unique.
Copy the **IDP URL** to use when finishing configuring the JumpCloud SAML in Infisical.
![JumpCloud edit basic config 2](../../../images/sso/jumpcloud/edit-basic-config-2.png)
7. On the same tab, in the **Attributes** section, configure the following map:
- `email -> email`
- `firstName -> firstname`
- `lastName -> lastname`
![JumpCloud attribute statements](../../../images/sso/jumpcloud/attribute-statements.png)
Finally, press **activate** to create the SAML application.
8. Next, select the newly created SAML application and select **Download certificate** under the **IDP Certificate Valid** dropdown
![JumpCloud download certificate](../../../images/sso/jumpcloud/download-saml-certificate.png)
9. Back in Infisical, set the **IDP URL** from step 6 and the **IdP Entity ID** from step 5. Also, paste the certificate from the previous step.
![JumpCloud IdP values](../../../images/sso/jumpcloud/idp-values.png)
<Note>
When pasting the certificate into Infisical, you'll want to retain `-----BEGIN
CERTIFICATE-----` and `-----END CERTIFICATE-----` at the first and last line
of the text area respectively.
</Note>
10. Assignments
Back in JumpCloud, navigate to the **User Groups** tab and assign users to the newly created application.
![JumpCloud SAML assignment](../../../images/sso/jumpcloud/assignment.png)
11. Return to Infisical and enable SAML SSO.
Enabling SAML SSO enforces all members in your organization to only be able to log into Infisical via JumpCloud.
![JumpCloud SAML assignment](../../../images/sso/jumpcloud/enable-saml.png)

View File

@ -0,0 +1,72 @@
---
title: "Okta SAML"
description: "Configure Okta SAML 2.0 for Infisical SSO"
---
1. In Infisical, head over to your organization Settings > Authentication > SAML SSO Configuration and select **Set up SAML SSO**.
Next, copy the **Single sign-on URL** and **Audience URI (SP Entity ID)** to use when configuring the Okta SAML 2.0 application.
![Okta SAML initial configuration](../../../images/sso/okta/init-config.png)
2. In the Okta Admin Portal, select Applications > Applications from the
navigation. On the Applications screen, select the **Create App Integration**
button.
![SAML Okta create app integration](../../../images/sso/okta/create-app-integration.png)
3. In the Create a New Application Integration dialog, select the **SAML 2.0** radio button:
![SAML Okta create SAML 2.0 integration](../../../images/sso/okta/create-saml-app.png)
4. On the General Settings screen, give the application a unique name like Infisical and select **Next**.
![SAML Okta create SAML 2.0 integration](../../../images/sso/okta/general-settings.png)
5. On the Configure SAML screen, set the **Single sign-on URL** and **Audience URI (SP Entity ID)** from step 1.
![SAML Okta configure IdP fields](../../../images/sso/okta/configure-saml.png)
<Note>
If you're self-hosting Infisical, then you will want to replace
`https://app.infisical.com` with your own domain.
</Note>
6. Also on the Configure SAML screen, configure the **Attribute Statements** to map:
- `id -> user.id`,
- `email -> user.email`,
- `firstName -> user.firstName`
- `lastName -> user.lastName`
![SAML Okta attribute statements](../../../images/sso/okta/attribute-statements.png)
Once configured, select **Next** to proceed to the Feedback screen and select **Finish**.
7. Get IdP values
Once your application is created, select the **Sign On** tab for the app and select the **View Setup Instructions** button located on the right side of the screen:
![SAML Okta view setup instructions](../../../images/sso/okta/view-setup-instructions.png)
Copy the **Identity Provider Single Sign-On URL**, the **Identity Provider Issuer**, and the **X.509 Certificate** to use when finishing configuring Okta SAML in Infisical.
![SAML Okta IdP values](../../../images/sso/okta/idp-values.png)
Back in Infisical, set **Identity Provider Single Sign-On URL**, **Identity Provider Issuer**,
and **Certificate** to **X.509 Certificate** from above. Once you've done that, press **Update** to complete the required configuration.
![SAML Okta paste values into Infisical](../../../images/sso/okta/idp-values-2.png)
8. Finally, navigate to the **Assignments** tab and select **Assign**
You can assign access to the application on a user-by-user basis using the Assign to People option, or in-bulk using the Assign to Groups option.
![SAML Okta assignment](../../../images/sso/okta/assignment.png)
At this point, you have configured everything you need within the context of the Okta Admin Portal.
9. Return to Infisical and enable SAML SSO.
Enabling SAML SSO enforces all members in your organization to only be able to log into Infisical via Okta.
![SAML Okta assignment](../../../images/sso/okta/enable-saml.png)

View File

@ -0,0 +1,20 @@
---
title: "SSO Overview"
description: "Log in to Infisical via SSO protocols"
---
<Warning>
Infisical currently has confirmed support for SAML SSO authentication with
Okta, Azure AD, and JumpCloud. We're expanding support for other IdPs in the
coming months, so stay tuned and feel free to request an IdP at this
[issue](https://github.com/Infisical/infisical/issues/442).
</Warning>
You can configure your organization in Infisical to have members authenticate with the platform via protocols like [SAML 2.0](https://en.wikipedia.org/wiki/SAML_2.0).
To note, configuring SSO retains the end-to-end encrypted architecture of Infisical because we decouple the **authentication** and **decryption** steps. In all login with SSO implementations,
your IdP cannot and will not have access to the decryption key needed to decrypt your secrets.
- [Okta SAML](/documentation/platform/sso/okta)
- [Azure SAML](/documentation/platform/sso/azure)
- [JumpCloud SAML](/documentation/platform/sso/jumpcloud)

(Binary image files not shown: this diff adds a number of new images and replaces or removes a few existing ones.)

Some files were not shown because too many files have changed in this diff.