Compare commits


31 Commits

Author SHA1 Message Date
d0d6419d4d add pre commit install command to README.md 2023-05-23 20:20:10 -04:00
8b05ce11f7 add pre commit to husky 2023-05-23 20:15:39 -04:00
a7fb0786f9 improve pre commit docs 2023-05-23 19:45:10 -04:00
f2de1778cb catch case when hook path is default 2023-05-23 19:34:31 -04:00
952cf47b9a Merge branch 'main' of https://github.com/Infisical/infisical 2023-05-23 15:41:43 -07:00
1d17596af1 added boolean flag for the plan in posthog logging 2023-05-23 15:41:34 -07:00
01385687e0 make posthog failed calls level=debug 2023-05-23 18:16:13 -04:00
d2e3aa15b0 patch standalone docker image 2023-05-23 17:16:32 -04:00
96607153dc Modularize getOrganizationPlan function 2023-05-23 23:54:54 +03:00
a8502377c7 Add endpoint for updating organization plan 2023-05-23 23:14:20 +03:00
5aa99001cc Merge pull request #597 from Infisical/connect-to-license-server
Added add/remove/get organization payment methods and get cloud plans…
2023-05-23 22:39:35 +03:00
83dd35299c Added add/remove/get organization payment methods and get cloud plans from license server 2023-05-23 22:28:41 +03:00
b5b2f402ad add missing required envrs 2023-05-23 14:09:45 -04:00
ec34572087 patch invite only 2023-05-23 13:18:21 -04:00
7f7d120c2f Merge pull request #595 from Infisical/connect-to-license-server
Add support for fetching plan details from license server
2023-05-23 17:02:20 +03:00
899d46514c Add forwarding usedSeats and subscription quantity to license server on org member add/delete 2023-05-23 16:59:13 +03:00
658df21189 Add auto install pre commit 2023-05-23 00:09:00 -04:00
8341faddc5 Add support for pulling plan details from license server with LICENSE_KEY, LICENSE_SERVER_KEY 2023-05-22 15:43:33 +03:00
8e3a23e6d8 fix prod node img for standalone 2023-05-22 08:18:50 -04:00
1c89474159 hello 2023-05-19 17:23:15 -04:00
2f765600b1 add pre-commit hook 2023-05-19 17:20:27 -04:00
d9057216b5 remove keyring access during telemetry 2023-05-19 16:10:59 -04:00
6aab90590f add version to cli run telemtry 2023-05-19 12:24:49 -04:00
f7466d4855 update cli telemetry 2023-05-19 12:20:37 -04:00
ea2565ed35 Merge pull request #591 from Infisical/cli-telemetry
Cli telemetry
2023-05-19 10:55:27 -04:00
7722231656 Merge pull request #590 from Infisical/infisical-scan-docs
Infisical scan docs
2023-05-18 15:59:51 -04:00
845a476974 add secret scanning to README.md 2023-05-18 15:57:48 -04:00
fc19a17f4b update readme with scaning feature 2023-05-18 15:42:25 -04:00
0890b1912f Merge pull request #589 from Infisical/infisical-scan-docs
add docs for infisical scan
2023-05-18 15:20:26 -04:00
82ecc2d7dc add secret scanning to resources 2023-05-18 15:18:29 -04:00
446a63a917 add docs for infisical scan 2023-05-18 14:55:39 -04:00
48 changed files with 1527 additions and 152 deletions

2
.dockerignore Normal file
View File

@ -0,0 +1,2 @@
backend/node_modules
frontend/node_modules

View File

@ -15,13 +15,12 @@ monorepo:
tag_prefix: infisical-cli/
dir: cli
env:
- POSTHOG_API_KEY_FOR_CLI={{ .Env.POSTHOG_API_KEY_FOR_CLI }}
builds:
- id: darwin-build
binary: infisical
ldflags: -X github.com/Infisical/infisical-merge/packages/util.CLI_VERSION={{ .Version }}
ldflags:
- -X github.com/Infisical/infisical-merge/packages/util.CLI_VERSION={{ .Version }}
- -X github.com/Infisical/infisical-merge/packages/telemetry.POSTHOG_API_KEY_FOR_CLI={{ .Env.POSTHOG_API_KEY_FOR_CLI }}
flags:
- -trimpath
env:
@ -39,7 +38,9 @@ builds:
env:
- CGO_ENABLED=0
binary: infisical
ldflags: -X github.com/Infisical/infisical-merge/packages/util.CLI_VERSION={{ .Version }}
ldflags:
- -X github.com/Infisical/infisical-merge/packages/util.CLI_VERSION={{ .Version }}
- -X github.com/Infisical/infisical-merge/packages/telemetry.POSTHOG_API_KEY_FOR_CLI={{ .Env.POSTHOG_API_KEY_FOR_CLI }}
flags:
- -trimpath
goos:

View File

@ -3,3 +3,5 @@
. "$(dirname -- "$0")/_/husky.sh"
npx lint-staged
infisical scan git-changes --staged -v

5
.pre-commit-config.yaml Normal file
View File

@ -0,0 +1,5 @@
repos:
- repo: https://github.com/gitleaks/gitleaks
rev: v8.16.3
hooks:
- id: gitleaks

6
.pre-commit-hooks.yaml Normal file
View File

@ -0,0 +1,6 @@
- id: infisical-scan
name: Scan for hardcoded secrets
description: Scans for hardcoded secrets using the Infisical CLI
entry: infisical scan git-changes --verbose --redact --staged
language: golang
pass_filenames: false

View File

@ -77,7 +77,7 @@ RUN npm ci --only-production
COPY --from=backend-build /app .
# Production stage
FROM node:14-alpine AS production
FROM node:16-alpine AS production
WORKDIR /

View File

@ -89,6 +89,30 @@ git clone https://github.com/Infisical/infisical && cd infisical && copy .env.ex
Create an account at `http://localhost:80`
### Scan and prevent secret leaks
On top of managing secrets with Infisical, you can also scan for 140+ secret types in your files, directories, and Git repositories.
To scan your full git history, run:
```
infisical scan --verbose
```
To scan your uncommitted git changes, run:
```
infisical scan git-changes --verbose
```
Install the pre-commit hook to scan each commit before it reaches your repository:
```
infisical scan install --pre-commit-hook
```
Learn about Infisical's code scanning feature [here](https://infisical.com/docs/cli/scanning-overview)
## Open-source vs. paid
This repo is available under the [MIT expat license](https://github.com/Infisical/infisical/blob/main/LICENSE), with the exception of the `ee` directory, which will contain premium enterprise features requiring an Infisical license in the future.

View File

@ -40,6 +40,7 @@
"libsodium-wrappers": "^0.7.10",
"lodash": "^4.17.21",
"mongoose": "^6.10.5",
"node-cache": "^5.1.2",
"nodemailer": "^6.8.0",
"posthog-node": "^2.6.0",
"query-string": "^7.1.3",

View File

@ -31,6 +31,7 @@
"libsodium-wrappers": "^0.7.10",
"lodash": "^4.17.21",
"mongoose": "^6.10.5",
"node-cache": "^5.1.2",
"nodemailer": "^6.8.0",
"posthog-node": "^2.6.0",
"query-string": "^7.1.3",

View File

@ -5,7 +5,7 @@ const client = new InfisicalClient({
});
export const getPort = async () => (await client.getSecret('PORT')).secretValue || 4000;
export const getInviteOnlySignup = async () => (await client.getSecret('INVITE_ONLY_SIGNUP')).secretValue == undefined ? false : (await client.getSecret('INVITE_ONLY_SIGNUP')).secretValue;
export const getInviteOnlySignup = async () => (await client.getSecret('INVITE_ONLY_SIGNUP')).secretValue === 'true';
export const getEncryptionKey = async () => (await client.getSecret('ENCRYPTION_KEY')).secretValue;
export const getSaltRounds = async () => parseInt((await client.getSecret('SALT_ROUNDS')).secretValue) || 10;
export const getJwtAuthLifetime = async () => (await client.getSecret('JWT_AUTH_LIFETIME')).secretValue || '10d';
@ -45,12 +45,19 @@ export const getSmtpUsername = async () => (await client.getSecret('SMTP_USERNAM
export const getSmtpPassword = async () => (await client.getSecret('SMTP_PASSWORD')).secretValue;
export const getSmtpFromAddress = async () => (await client.getSecret('SMTP_FROM_ADDRESS')).secretValue;
export const getSmtpFromName = async () => (await client.getSecret('SMTP_FROM_NAME')).secretValue || 'Infisical';
export const getLicenseKey = async () => (await client.getSecret('LICENSE_KEY')).secretValue;
export const getLicenseServerKey = async () => (await client.getSecret('LICENSE_SERVER_KEY')).secretValue;
export const getLicenseServerUrl = async () => (await client.getSecret('LICENSE_SERVER_URL')).secretValue || 'https://portal.infisical.com';
// TODO: deprecate from here
export const getStripeProductStarter = async () => (await client.getSecret('STRIPE_PRODUCT_STARTER')).secretValue;
export const getStripeProductPro = async () => (await client.getSecret('STRIPE_PRODUCT_PRO')).secretValue;
export const getStripeProductTeam = async () => (await client.getSecret('STRIPE_PRODUCT_TEAM')).secretValue;
export const getStripePublishableKey = async () => (await client.getSecret('STRIPE_PUBLISHABLE_KEY')).secretValue;
export const getStripeSecretKey = async () => (await client.getSecret('STRIPE_SECRET_KEY')).secretValue;
export const getStripeWebhookSecret = async () => (await client.getSecret('STRIPE_WEBHOOK_SECRET')).secretValue;
export const getTelemetryEnabled = async () => (await client.getSecret('TELEMETRY_ENABLED')).secretValue !== 'false' && true;
export const getLoopsApiKey = async () => (await client.getSecret('LOOPS_API_KEY')).secretValue;
export const getSmtpConfigured = async () => (await client.getSecret('SMTP_HOST')).secretValue == '' || (await client.getSecret('SMTP_HOST')).secretValue == undefined ? false : true

View File

@ -1,10 +1,24 @@
import axios from 'axios';
import axiosRetry from 'axios-retry';
import {
getLicenseServerKeyAuthToken,
setLicenseServerKeyAuthToken,
getLicenseKeyAuthToken,
setLicenseKeyAuthToken
} from './storage';
import {
getLicenseKey,
getLicenseServerKey,
getLicenseServerUrl
} from './index';
const axiosInstance = axios.create();
// axios instances that attach a JWT for interacting with the license server
export const licenseServerKeyRequest = axios.create();
export const licenseKeyRequest = axios.create();
export const standardRequest = axios.create();
// add retry functionality to the standard request instance
axiosRetry(axiosInstance, {
axiosRetry(standardRequest, {
retries: 3,
retryDelay: axiosRetry.exponentialDelay, // exponential back-off delay between retries
retryCondition: (error) => {
@ -13,4 +27,98 @@ axiosRetry(axiosInstance, {
},
});
export default axiosInstance;
export const refreshLicenseServerKeyToken = async () => {
const licenseServerKey = await getLicenseServerKey();
const licenseServerUrl = await getLicenseServerUrl();
const { data: { token } } = await standardRequest.post(
`${licenseServerUrl}/api/auth/v1/license-server-login`, {},
{
headers: {
'X-API-KEY': licenseServerKey
}
}
);
setLicenseServerKeyAuthToken(token);
return token;
}
export const refreshLicenseKeyToken = async () => {
const licenseKey = await getLicenseKey();
const licenseServerUrl = await getLicenseServerUrl();
const { data: { token } } = await standardRequest.post(
`${licenseServerUrl}/api/auth/v1/license-login`, {},
{
headers: {
'X-API-KEY': licenseKey
}
}
);
setLicenseKeyAuthToken(token);
return token;
}
licenseServerKeyRequest.interceptors.request.use((config) => {
const token = getLicenseServerKeyAuthToken();
if (token && config.headers) {
// eslint-disable-next-line no-param-reassign
config.headers.Authorization = `Bearer ${token}`;
}
return config;
}, (err) => {
return Promise.reject(err);
});
licenseServerKeyRequest.interceptors.response.use((response) => {
return response
}, async function (err) {
const originalRequest = err.config;
if (err.response.status === 401 && !originalRequest._retry) {
originalRequest._retry = true;
// refresh
const token = await refreshLicenseServerKeyToken();
axios.defaults.headers.common['Authorization'] = 'Bearer ' + token;
return licenseServerKeyRequest(originalRequest);
}
return Promise.reject(err);
});
licenseKeyRequest.interceptors.request.use((config) => {
const token = getLicenseKeyAuthToken();
if (token && config.headers) {
// eslint-disable-next-line no-param-reassign
config.headers.Authorization = `Bearer ${token}`;
}
return config;
}, (err) => {
return Promise.reject(err);
});
licenseKeyRequest.interceptors.response.use((response) => {
return response
}, async function (err) {
const originalRequest = err.config;
if (err.response.status === 401 && !originalRequest._retry) {
originalRequest._retry = true;
// refresh
const token = await refreshLicenseKeyToken();
axios.defaults.headers.common['Authorization'] = 'Bearer ' + token;
return licenseKeyRequest(originalRequest);
}
return Promise.reject(err);
});

View File

@ -0,0 +1,30 @@
const MemoryLicenseServerKeyTokenStorage = () => {
let authToken: string;
return {
setToken: (token: string) => {
authToken = token;
},
getToken: () => authToken
};
};
const MemoryLicenseKeyTokenStorage = () => {
let authToken: string;
return {
setToken: (token: string) => {
authToken = token;
},
getToken: () => authToken
};
};
const licenseServerTokenStorage = MemoryLicenseServerKeyTokenStorage();
const licenseTokenStorage = MemoryLicenseKeyTokenStorage();
export const getLicenseServerKeyAuthToken = licenseServerTokenStorage.getToken;
export const setLicenseServerKeyAuthToken = licenseServerTokenStorage.setToken;
export const getLicenseKeyAuthToken = licenseTokenStorage.getToken;
export const setLicenseKeyAuthToken = licenseTokenStorage.setToken;

View File

@ -16,7 +16,7 @@ import {
INTEGRATION_VERCEL_API_URL,
INTEGRATION_RAILWAY_API_URL
} from '../../variables';
import request from '../../config/request';
import { standardRequest } from '../../config/request';
/***
* Return integration authorization with id [integrationAuthId]
@ -229,7 +229,7 @@ export const getIntegrationAuthVercelBranches = async (req: Request, res: Respon
let branches: string[] = [];
if (appId && appId !== '') {
const { data }: { data: VercelBranch[] } = await request.get(
const { data }: { data: VercelBranch[] } = await standardRequest.get(
`${INTEGRATION_VERCEL_API_URL}/v1/integrations/git-branches`,
{
params,
@ -292,7 +292,7 @@ export const getIntegrationAuthRailwayEnvironments = async (req: Request, res: R
projectId: appId
}
const { data: { data: { environments: { edges } } } } = await request.post(INTEGRATION_RAILWAY_API_URL, {
const { data: { data: { environments: { edges } } } } = await standardRequest.post(INTEGRATION_RAILWAY_API_URL, {
query,
variables,
}, {
@ -372,7 +372,7 @@ export const getIntegrationAuthRailwayServices = async (req: Request, res: Respo
id: appId
}
const { data: { data: { project: { services: { edges } } } } } = await request.post(INTEGRATION_RAILWAY_API_URL, {
const { data: { data: { project: { services: { edges } } } } } = await standardRequest.post(INTEGRATION_RAILWAY_API_URL, {
query,
variables
}, {

View File

@ -135,6 +135,7 @@ export const inviteUserToOrganization = async (req: Request, res: Response) => {
}
if (!inviteeMembershipOrg) {
await new MembershipOrg({
user: invitee,
inviteEmail: inviteeEmail,
@ -246,6 +247,10 @@ export const verifyUserToOrganization = async (req: Request, res: Response) => {
// membership can be approved and redirected to login/dashboard
membershipOrg.status = ACCEPTED;
await membershipOrg.save();
await updateSubscriptionOrgQuantity({
organizationId
});
return res.status(200).send({
message: 'Successfully verified email',

View File

@ -21,14 +21,6 @@ export const beginEmailSignup = async (req: Request, res: Response) => {
try {
email = req.body.email;
if (await getInviteOnlySignup()) {
// Only one user can create an account without being invited. The rest need to be invited in order to make an account
const userCount = await User.countDocuments({})
if (userCount != 0) {
throw BadRequestError({ message: "New user sign ups are not allowed at this time. You must be invited to sign up." })
}
}
const user = await User.findOne({ email }).select('+publicKey');
if (user && user?.publicKey) {
// case: user has already completed account
@ -74,6 +66,14 @@ export const verifyEmailSignup = async (req: Request, res: Response) => {
});
}
if (await getInviteOnlySignup()) {
// Only one user can create an account without being invited. The rest need to be invited in order to make an account
const userCount = await User.countDocuments({})
if (userCount != 0) {
throw BadRequestError({ message: "New user sign ups are not allowed at this time. You must be invited to sign up." })
}
}
// verify email
if (await getSmtpConfigured()) {
await checkEmailVerification({

View File

@ -1,7 +1,7 @@
import to from 'await-to-js';
import { Types } from 'mongoose';
import { Request, Response } from 'express';
import { ISecret, Secret } from '../../models';
import { ISecret, Secret, Workspace } from '../../models';
import { IAction, SecretVersion } from '../../ee/models';
import {
SECRET_PERSONAL,
@ -14,7 +14,7 @@ import {
import { UnauthorizedRequestError, ValidationError } from '../../utils/errors';
import { EventService } from '../../services';
import { eventPushSecrets } from '../../events';
import { EESecretService, EELogService } from '../../ee/services';
import { EESecretService, EELogService, EELicenseService } from '../../ee/services';
import { TelemetryService, SecretService } from '../../services';
import { getChannelFromUserAgent } from '../../utils/posthog';
import { PERMISSION_WRITE_SECRETS } from '../../variables';
@ -48,6 +48,14 @@ export const batchSecrets = async (req: Request, res: Response) => {
environment: string;
requests: BatchSecretRequest[];
} = req.body;
const organizationId = (
await Workspace.findOne({
_id: workspaceId
})
)?.organization?.toString();
const orgPlan = await EELicenseService.getOrganizationPlan(organizationId || '');
const isPaid = orgPlan.slug != 'starter';
const createSecrets: BatchSecret[] = [];
const updateSecrets: BatchSecret[] = [];
@ -139,7 +147,8 @@ export const batchSecrets = async (req: Request, res: Response) => {
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
@ -226,7 +235,8 @@ export const batchSecrets = async (req: Request, res: Response) => {
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
@ -261,7 +271,8 @@ export const batchSecrets = async (req: Request, res: Response) => {
environment,
workspaceId,
channel: channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
@ -376,6 +387,14 @@ export const createSecrets = async (req: Request, res: Response) => {
}
}
const organizationId = (
await Workspace.findOne({
_id: workspaceId
})
)?.organization?.toString();
const orgPlan = await EELicenseService.getOrganizationPlan(organizationId || '');
const isPaid = orgPlan.slug != 'starter';
let listOfSecretsToCreate;
if (Array.isArray(req.body.secrets)) {
// case: create multiple secrets
@ -531,7 +550,8 @@ export const createSecrets = async (req: Request, res: Response) => {
environment,
workspaceId,
channel: channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
@ -595,6 +615,14 @@ export const getSecrets = async (req: Request, res: Response) => {
const normalizedPath = normalizePath(secretsPath as string)
const folders = await getFoldersInDirectory(workspaceId as string, environment as string, normalizedPath)
const organizationId = (
await Workspace.findOne({
_id: workspaceId
})
)?.organization?.toString();
const orgPlan = await EELicenseService.getOrganizationPlan(organizationId || '');
const isPaid = orgPlan.slug != 'starter';
// secrets to return
let secrets: ISecret[] = [];
@ -727,7 +755,8 @@ export const getSecrets = async (req: Request, res: Response) => {
environment,
workspaceId,
channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
@ -938,6 +967,14 @@ export const updateSecrets = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(key)
})
const organizationId = (
await Workspace.findOne({
_id: key
})
)?.organization?.toString();
const orgPlan = await EELicenseService.getOrganizationPlan(organizationId || '');
const isPaid = orgPlan.slug != 'starter';
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient) {
postHogClient.capture({
@ -950,7 +987,8 @@ export const updateSecrets = async (req: Request, res: Response) => {
environment: workspaceSecretObj[key][0].environment,
workspaceId: key,
channel: channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
@ -1072,6 +1110,14 @@ export const deleteSecrets = async (req: Request, res: Response) => {
workspaceId: new Types.ObjectId(key)
});
const organizationId = (
await Workspace.findOne({
_id: key
})
)?.organization?.toString();
const orgPlan = await EELicenseService.getOrganizationPlan(organizationId || '');
const isPaid = orgPlan.slug != 'starter';
const postHogClient = await TelemetryService.getPostHogClient();
if (postHogClient) {
postHogClient.capture({
@ -1084,7 +1130,8 @@ export const deleteSecrets = async (req: Request, res: Response) => {
environment: workspaceSecretObj[key][0].environment,
workspaceId: key,
channel: channel,
userAgent: req.headers?.['user-agent']
userAgent: req.headers?.['user-agent'],
isPaid
}
});
}
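
The workspace → organization → plan lookup above is repeated in each handler to derive the `isPaid` telemetry property. A small helper consolidating it could look like the sketch below; `getIsPaidForWorkspace` is illustrative and not part of this diff, and the import paths mirror the ones already used in this controller.

```
import { Workspace } from '../../models';
import { EELicenseService } from '../../ee/services';

// Hypothetical helper: workspace -> organization -> plan slug, matching the
// inline lookups used for the isPaid telemetry property in the handlers above.
export const getIsPaidForWorkspace = async (workspaceId: string) => {
  const organizationId = (
    await Workspace.findOne({ _id: workspaceId })
  )?.organization?.toString();

  const orgPlan = await EELicenseService.getOrganizationPlan(organizationId || '');
  return orgPlan.slug !== 'starter';
};
```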

View File

@ -7,8 +7,9 @@ import {
} from '../../helpers/signup';
import { issueAuthTokens } from '../../helpers/auth';
import { INVITED, ACCEPTED } from '../../variables';
import request from '../../config/request';
import { standardRequest } from '../../config/request';
import { getLoopsApiKey, getHttpsEnabled } from '../../config';
import { updateSubscriptionOrgQuantity } from '../../helpers/organization';
/**
* Complete setting up user by adding their personal and auth information as part of the
@ -87,6 +88,19 @@ export const completeAccountSignup = async (req: Request, res: Response) => {
user
});
// update subscription seat quantities for organizations
// that invited this email
const membershipsToUpdate = await MembershipOrg.find({
inviteEmail: email,
status: INVITED
});
membershipsToUpdate.forEach(async (membership) => {
await updateSubscriptionOrgQuantity({
organizationId: membership.organization.toString()
});
});
// update organization membership statuses that are
// invited to completed with user attached
await MembershipOrg.updateMany(
@ -109,7 +123,7 @@ export const completeAccountSignup = async (req: Request, res: Response) => {
// sending a welcome email to new users
if (await getLoopsApiKey()) {
await request.post("https://app.loops.so/api/v1/events/send", {
await standardRequest.post("https://app.loops.so/api/v1/events/send", {
"email": email,
"eventName": "Sign Up",
"firstName": firstName,
@ -206,9 +220,20 @@ export const completeAccountInvite = async (req: Request, res: Response) => {
if (!user)
throw new Error('Failed to complete account for non-existent user');
// update subscription seat quantities for organizations
// that invited this email
const membershipsToUpdate = await MembershipOrg.find({
inviteEmail: email,
status: INVITED
});
membershipsToUpdate.forEach(async (membership) => {
await updateSubscriptionOrgQuantity({
organizationId: membership.organization.toString()
});
});
await MembershipOrg.updateMany(
{
inviteEmail: email,

View File

@ -0,0 +1,34 @@
import * as Sentry from '@sentry/node';
import { Request, Response } from 'express';
import { EELicenseService } from '../../services';
import { getLicenseServerUrl } from '../../../config';
import { licenseServerKeyRequest } from '../../../config/request';
/**
* Return available cloud product information.
* Note: the response is formatted so that a table can easily be constructed from it
* @param req
* @param res
* @returns
*/
export const getCloudProducts = async (req: Request, res: Response) => {
try {
const billingCycle = req.query['billing-cycle'] as string;
if (EELicenseService.instanceType === 'cloud') {
const { data } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/cloud-products?billing-cycle=${billingCycle}`
);
return res.status(200).send(data);
}
} catch (err) {
Sentry.setUser({ email: req.user.email });
Sentry.captureException(err);
}
return res.status(200).send({
head: [],
rows: []
});
}

View File

@ -1,15 +1,19 @@
import * as stripeController from './stripeController';
import * as secretController from './secretController';
import * as secretSnapshotController from './secretSnapshotController';
import * as organizationsController from './organizationsController';
import * as workspaceController from './workspaceController';
import * as actionController from './actionController';
import * as membershipController from './membershipController';
import * as cloudProductsController from './cloudProductsController';
export {
stripeController,
secretController,
secretSnapshotController,
organizationsController,
workspaceController,
actionController,
membershipController
membershipController,
cloudProductsController
}

View File

@ -0,0 +1,84 @@
import { Request, Response } from 'express';
import { getLicenseServerUrl } from '../../../config';
import { licenseServerKeyRequest } from '../../../config/request';
import { EELicenseService } from '../../services';
/**
* Return the organization's current plan and allowed feature set
*/
export const getOrganizationPlan = async (req: Request, res: Response) => {
const plan = await EELicenseService.getOrganizationPlan(req.organization._id.toString());
// cache fetched plan for organization
EELicenseService.localFeatureSet.set(req.organization._id.toString(), plan);
return res.status(200).send({
plan
});
}
/**
* Update the organization plan to product with id [productId]
* @param req
* @param res
* @returns
*/
export const updateOrganizationPlan = async (req: Request, res: Response) => {
const {
productId
} = req.body;
const { data } = await licenseServerKeyRequest.patch(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/cloud-plan`,
{
productId
}
);
return res.status(200).send(data);
}
/**
* Return the organization's payment methods on file
*/
export const getOrganizationPmtMethods = async (req: Request, res: Response) => {
const { data: { pmtMethods } } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/payment-methods`
);
return res.status(200).send({
pmtMethods
});
}
/**
* Return a Stripe session URL to add payment method for organization
*/
export const addOrganizationPmtMethod = async (req: Request, res: Response) => {
const {
success_url,
cancel_url
} = req.body;
const { data: { url } } = await licenseServerKeyRequest.post(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/payment-methods`,
{
success_url,
cancel_url
}
);
return res.status(200).send({
url
});
}
export const deleteOrganizationPmtMethod = async (req: Request, res: Response) => {
const { pmtMethodId } = req.params;
const { data } = await licenseServerKeyRequest.delete(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${req.organization.customerId}/billing-details/payment-methods/${pmtMethodId}`,
);
return res.status(200).send(data);
}

View File

@ -0,0 +1,20 @@
import express from 'express';
const router = express.Router();
import {
requireAuth,
validateRequest
} from '../../../middleware';
import { query } from 'express-validator';
import { cloudProductsController } from '../../controllers/v1';
router.get(
'/',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
query('billing-cycle').exists().isIn(['monthly', 'yearly']),
validateRequest,
cloudProductsController.getCloudProducts
);
export default router;
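
This router is mounted at `/api/v1/cloud-products` in the app setup further below. A minimal client-side sketch of calling the endpoint follows; the base URL, Bearer-token header, and any response shape beyond the `{ head, rows }` fallback are assumptions.

```
import axios from 'axios';

const INFISICAL_URL = 'https://app.infisical.com'; // assumed base URL

// billing-cycle must be 'monthly' or 'yearly' per the validator on this route;
// non-cloud instances respond with the { head: [], rows: [] } fallback.
export const fetchCloudProducts = async (jwt: string, billingCycle: 'monthly' | 'yearly') => {
  const { data } = await axios.get(`${INFISICAL_URL}/api/v1/cloud-products`, {
    params: { 'billing-cycle': billingCycle },
    headers: { Authorization: `Bearer ${jwt}` }
  });
  return data;
};
```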

View File

@ -1,11 +1,15 @@
import secret from './secret';
import secretSnapshot from './secretSnapshot';
import organizations from './organizations';
import workspace from './workspace';
import action from './action';
import cloudProducts from './cloudProducts';
export {
secret,
secretSnapshot,
organizations,
workspace,
action
action,
cloudProducts
}

View File

@ -0,0 +1,87 @@
import express from 'express';
const router = express.Router();
import {
requireAuth,
requireOrganizationAuth,
validateRequest
} from '../../../middleware';
import { param, body } from 'express-validator';
import { organizationsController } from '../../controllers/v1';
import {
OWNER, ADMIN, MEMBER, ACCEPTED
} from '../../../variables';
router.get(
'/:organizationId/plan',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED]
}),
param('organizationId').exists().trim(),
validateRequest,
organizationsController.getOrganizationPlan
);
router.patch(
'/:organizationId/plan',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED]
}),
param('organizationId').exists().trim(),
body('productId').exists().isString(),
validateRequest,
organizationsController.updateOrganizationPlan
);
router.get(
'/:organizationId/billing-details/payment-methods',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED]
}),
param('organizationId').exists().trim(),
validateRequest,
organizationsController.getOrganizationPmtMethods
);
router.post(
'/:organizationId/billing-details/payment-methods',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED]
}),
param('organizationId').exists().trim(),
body('success_url').exists().isString(),
body('cancel_url').exists().isString(),
validateRequest,
organizationsController.addOrganizationPmtMethod
);
router.delete(
'/:organizationId/billing-details/payment-methods/:pmtMethodId',
requireAuth({
acceptedAuthModes: ['jwt', 'apiKey']
}),
requireOrganizationAuth({
acceptedRoles: [OWNER, ADMIN, MEMBER],
acceptedStatuses: [ACCEPTED]
}),
param('organizationId').exists().trim(),
validateRequest,
organizationsController.deleteOrganizationPmtMethod
);
export default router;
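
Mounted at `/api/v1/organizations` in the app setup below, these routes expose the plan endpoints at `/api/v1/organizations/:organizationId/plan`. The sketch below shows how a client might read and update a plan; the base URL and token handling are assumptions, and `productId` would come from the cloud-products listing.

```
import axios from 'axios';

const INFISICAL_URL = 'https://app.infisical.com'; // assumed base URL

// GET responds with { plan } per the controller above.
export const readOrganizationPlan = async (jwt: string, organizationId: string) => {
  const { data } = await axios.get(
    `${INFISICAL_URL}/api/v1/organizations/${organizationId}/plan`,
    { headers: { Authorization: `Bearer ${jwt}` } }
  );
  return data.plan;
};

// PATCH requires a productId string in the body (validated on the route).
export const changeOrganizationPlan = async (jwt: string, organizationId: string, productId: string) => {
  const { data } = await axios.patch(
    `${INFISICAL_URL}/api/v1/organizations/${organizationId}/plan`,
    { productId },
    { headers: { Authorization: `Bearer ${jwt}` } }
  );
  return data;
};
```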

View File

@ -7,7 +7,7 @@ import {
requireAuth,
validateRequest
} from '../../../middleware';
import { param, body } from 'express-validator';
import { param } from 'express-validator';
import { ADMIN, MEMBER } from '../../../variables';
import { secretSnapshotController } from '../../controllers/v1';

View File

@ -1,12 +1,124 @@
import NodeCache from 'node-cache';
import * as Sentry from '@sentry/node';
import {
getLicenseKey,
getLicenseServerKey,
getLicenseServerUrl
} from '../../config';
import {
licenseKeyRequest,
licenseServerKeyRequest,
refreshLicenseServerKeyToken,
refreshLicenseKeyToken
} from '../../config/request';
import { Organization } from '../../models';
import { OrganizationNotFoundError } from '../../utils/errors';
interface FeatureSet {
_id: string | null;
slug: 'starter' | 'team' | 'pro' | 'enterprise' | null;
tier: number | null;
projectLimit: number | null;
memberLimit: number | null;
secretVersioning: boolean;
pitRecovery: boolean;
rbac: boolean;
customRateLimits: boolean;
customAlerts: boolean;
auditLogs: boolean;
}
/**
* Class to handle Enterprise Edition license actions
* Class to handle license/plan configurations:
* - Infisical Cloud: Fetch and cache customer plans in [localFeatureSet]
* - Self-hosted regular: Use default global feature set
* - Self-hosted enterprise: Fetch and update global feature set
*/
class EELicenseService {
private readonly _isLicenseValid: boolean;
private readonly _isLicenseValid: boolean; // TODO: deprecate
public instanceType: 'self-hosted' | 'enterprise-self-hosted' | 'cloud' = 'self-hosted';
public globalFeatureSet: FeatureSet = {
_id: null,
slug: null,
tier: null,
projectLimit: null,
memberLimit: null,
secretVersioning: true,
pitRecovery: true,
rbac: true,
customRateLimits: true,
customAlerts: true,
auditLogs: false
}
public localFeatureSet: NodeCache;
constructor(licenseKey: string) {
constructor() {
this._isLicenseValid = true;
this.localFeatureSet = new NodeCache({
stdTTL: 300
});
}
public async getOrganizationPlan(organizationId: string) {
try {
if (this.instanceType === 'cloud') {
const cachedPlan = this.localFeatureSet.get(organizationId);
if (cachedPlan) return cachedPlan;
const organization = await Organization.findById(organizationId);
if (!organization) throw OrganizationNotFoundError();
const { data: { currentPlan } } = await licenseServerKeyRequest.get(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${organization.customerId}/cloud-plan`
);
return currentPlan;
}
} catch (err) {
return this.globalFeatureSet;
}
return this.globalFeatureSet;
}
public async initGlobalFeatureSet() {
const licenseServerKey = await getLicenseServerKey();
const licenseKey = await getLicenseKey();
try {
if (licenseServerKey) {
// license server key is present -> validate it
const token = await refreshLicenseServerKeyToken()
if (token) {
this.instanceType = 'cloud';
}
return;
}
if (licenseKey) {
// license key is present -> validate it
const token = await refreshLicenseKeyToken();
if (token) {
const { data: { currentPlan } } = await licenseKeyRequest.get(
`${await getLicenseServerUrl()}/api/license/v1/plan`
);
this.globalFeatureSet = currentPlan;
this.instanceType = 'enterprise-self-hosted';
}
}
} catch (err) {
// case: self-hosted free
Sentry.setUser(null);
Sentry.captureException(err);
}
}
public get isLicenseValid(): boolean {
@ -14,4 +126,4 @@ class EELicenseService {
}
}
export default new EELicenseService('N/A');
export default new EELicenseService();
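
The plan returned by `getOrganizationPlan` is the same data the secret controllers above use to derive the `isPaid` telemetry flag. A minimal sketch of gating a feature on it, assuming the `FeatureSet` fields defined above; `requireAuditLogs` and its import path are illustrative, not part of this diff.

```
import { EELicenseService } from '../../ee/services'; // path depends on the caller's location

// Hypothetical guard: reject the request when the organization's plan
// (cached or fetched from the license server) does not include audit logs.
export const requireAuditLogs = async (organizationId: string) => {
  const plan = await EELicenseService.getOrganizationPlan(organizationId);
  if (!plan.auditLogs) {
    throw new Error('Audit logs are not available on the current plan');
  }
  return plan;
};
```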

View File

@ -29,6 +29,16 @@ import {
} from "../utils/errors";
import { validateUserClientForOrganization } from "../helpers/user";
import { validateServiceAccountClientForOrganization } from "../helpers/serviceAccount";
import {
EELicenseService
} from '../ee/services';
import {
getLicenseServerUrl
} from '../config';
import {
licenseServerKeyRequest,
licenseKeyRequest
} from '../config/request';
/**
* Validate accepted clients for organization with id [organizationId]
@ -228,30 +238,35 @@ const updateSubscriptionOrgQuantity = async ({
});
if (organization && organization.customerId) {
const quantity = await MembershipOrg.countDocuments({
organization: organizationId,
status: ACCEPTED,
});
const stripe = new Stripe(await getStripeSecretKey(), {
apiVersion: "2022-08-01",
});
const subscription = (
await stripe.subscriptions.list({
customer: organization.customerId,
})
).data[0];
stripeSubscription = await stripe.subscriptions.update(subscription.id, {
items: [
if (EELicenseService.instanceType === 'cloud') {
// instance of Infisical is a cloud instance
const quantity = await MembershipOrg.countDocuments({
organization: new Types.ObjectId(organizationId),
status: ACCEPTED,
});
await licenseServerKeyRequest.patch(
`${await getLicenseServerUrl()}/api/license-server/v1/customers/${organization.customerId}/cloud-plan`,
{
id: subscription.items.data[0].id,
price: subscription.items.data[0].price.id,
quantity,
},
],
});
quantity
}
);
}
if (EELicenseService.instanceType === 'enterprise-self-hosted') {
// instance of Infisical is an enterprise self-hosted instance
const usedSeats = await MembershipOrg.countDocuments({
status: ACCEPTED
});
await licenseKeyRequest.patch(
`${await getLicenseServerUrl()}/api/license/v1/license`,
{
usedSeats
}
);
}
}
return stripeSubscription;

View File

@ -89,7 +89,7 @@ const validateClientForWorkspace = async ({
requiredPermissions
});
return ({ membership });
return ({ membership, workspace });
}
if (authData.authMode === AUTH_MODE_SERVICE_ACCOUNT && authData.authPayload instanceof ServiceAccount) {
@ -123,7 +123,7 @@ const validateClientForWorkspace = async ({
requiredPermissions
});
return ({ membership });
return ({ membership, workspace });
}
throw UnauthorizedRequestError({

View File

@ -1,4 +1,3 @@
import mongoose from 'mongoose';
import dotenv from 'dotenv';
dotenv.config();
import express from 'express';
@ -6,6 +5,7 @@ import helmet from 'helmet';
import cors from 'cors';
import * as Sentry from '@sentry/node';
import { DatabaseService } from './services';
import { EELicenseService } from './ee/services';
import { setUpHealthEndpoint } from './services/health';
import { initSmtp } from './services/smtp';
import { TelemetryService } from './services';
@ -25,7 +25,9 @@ import {
workspace as eeWorkspaceRouter,
secret as eeSecretRouter,
secretSnapshot as eeSecretSnapshotRouter,
action as eeActionRouter
action as eeActionRouter,
organizations as eeOrganizationsRouter,
cloudProducts as eeCloudProductsRouter
} from './ee/routes/v1';
import {
signup as v1SignupRouter,
@ -74,14 +76,15 @@ import {
getNodeEnv,
getPort,
getSentryDSN,
getSiteURL,
getSmtpHost
getSiteURL
} from './config';
const main = async () => {
TelemetryService.logTelemetryMessage();
setTransporter(await initSmtp());
await EELicenseService.initGlobalFeatureSet();
await DatabaseService.initDatabase(await getMongoURL());
if ((await getNodeEnv()) !== 'test') {
Sentry.init({
@ -119,6 +122,8 @@ const main = async () => {
app.use('/api/v1/secret-snapshot', eeSecretSnapshotRouter);
app.use('/api/v1/workspace', eeWorkspaceRouter);
app.use('/api/v1/action', eeActionRouter);
app.use('/api/v1/organizations', eeOrganizationsRouter);
app.use('/api/v1/cloud-products', eeCloudProductsRouter);
// v1 routes (default)
app.use('/api/v1/signup', v1SignupRouter);

View File

@ -1,6 +1,6 @@
import { Octokit } from "@octokit/rest";
import { IIntegrationAuth } from "../models";
import request from "../config/request";
import { standardRequest } from "../config/request";
import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_AWS_PARAMETER_STORE,
@ -134,7 +134,7 @@ const getApps = async ({
*/
const getAppsHeroku = async ({ accessToken }: { accessToken: string }) => {
const res = (
await request.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
await standardRequest.get(`${INTEGRATION_HEROKU_API_URL}/apps`, {
headers: {
Accept: "application/vnd.heroku+json; version=3",
Authorization: `Bearer ${accessToken}`,
@ -164,7 +164,7 @@ const getAppsVercel = async ({
accessToken: string;
}) => {
const res = (
await request.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
await standardRequest.get(`${INTEGRATION_VERCEL_API_URL}/v9/projects`, {
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
@ -208,7 +208,7 @@ const getAppsNetlify = async ({ accessToken }: { accessToken: string }) => {
filter: 'all'
});
const { data } = await request.get(
const { data } = await standardRequest.get(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/sites`,
{
params,
@ -310,7 +310,7 @@ const getAppsGithub = async ({ accessToken }: { accessToken: string }) => {
*/
const getAppsRender = async ({ accessToken }: { accessToken: string }) => {
const res = (
await request.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
await standardRequest.get(`${INTEGRATION_RENDER_API_URL}/v1/services`, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
@ -358,7 +358,7 @@ const getAppsRailway = async ({ accessToken }: { accessToken: string }) => {
projects: { edges },
},
},
} = await request.post(
} = await standardRequest.post(
INTEGRATION_RAILWAY_API_URL,
{
query,
@ -402,7 +402,7 @@ const getAppsFlyio = async ({ accessToken }: { accessToken: string }) => {
`;
const res = (
await request.post(
await standardRequest.post(
INTEGRATION_FLYIO_API_URL,
{
query,
@ -436,7 +436,7 @@ const getAppsFlyio = async ({ accessToken }: { accessToken: string }) => {
*/
const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
const res = (
await request.get(`${INTEGRATION_CIRCLECI_API_URL}/v1.1/projects`, {
await standardRequest.get(`${INTEGRATION_CIRCLECI_API_URL}/v1.1/projects`, {
headers: {
"Circle-Token": accessToken,
"Accept-Encoding": "application/json",
@ -455,7 +455,7 @@ const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
const getAppsTravisCI = async ({ accessToken }: { accessToken: string }) => {
const res = (
await request.get(`${INTEGRATION_TRAVISCI_API_URL}/repos`, {
await standardRequest.get(`${INTEGRATION_TRAVISCI_API_URL}/repos`, {
headers: {
Authorization: `token ${accessToken}`,
"Accept-Encoding": "application/json",
@ -502,7 +502,7 @@ const getAppsGitlab = async ({
per_page: String(perPage),
});
const { data } = await request.get(
const { data } = await standardRequest.get(
`${INTEGRATION_GITLAB_API_URL}/v4/groups/${teamId}/projects`,
{
params,
@ -530,7 +530,7 @@ const getAppsGitlab = async ({
// case: fetch projects for individual in GitLab
const { id } = (
await request.get(`${INTEGRATION_GITLAB_API_URL}/v4/user`, {
await standardRequest.get(`${INTEGRATION_GITLAB_API_URL}/v4/user`, {
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json",
@ -544,7 +544,7 @@ const getAppsGitlab = async ({
per_page: String(perPage),
});
const { data } = await request.get(
const { data } = await standardRequest.get(
`${INTEGRATION_GITLAB_API_URL}/v4/users/${id}/projects`,
{
params,
@ -581,7 +581,7 @@ const getAppsGitlab = async ({
* @returns {String} apps.name - name of Supabase app
*/
const getAppsSupabase = async ({ accessToken }: { accessToken: string }) => {
const { data } = await request.get(
const { data } = await standardRequest.get(
`${INTEGRATION_SUPABASE_API_URL}/v1/projects`,
{
headers: {

View File

@ -1,4 +1,4 @@
import request from "../config/request";
import { standardRequest } from "../config/request";
import {
INTEGRATION_AZURE_KEY_VAULT,
INTEGRATION_HEROKU,
@ -142,7 +142,7 @@ const exchangeCodeAzure = async ({ code }: { code: string }) => {
const accessExpiresAt = new Date();
const res: ExchangeCodeAzureResponse = (
await request.post(
await standardRequest.post(
INTEGRATION_AZURE_TOKEN_URL,
new URLSearchParams({
grant_type: "authorization_code",
@ -178,7 +178,7 @@ const exchangeCodeHeroku = async ({ code }: { code: string }) => {
const accessExpiresAt = new Date();
const res: ExchangeCodeHerokuResponse = (
await request.post(
await standardRequest.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: "authorization_code",
@ -209,7 +209,7 @@ const exchangeCodeHeroku = async ({ code }: { code: string }) => {
*/
const exchangeCodeVercel = async ({ code }: { code: string }) => {
const res: ExchangeCodeVercelResponse = (
await request.post(
await standardRequest.post(
INTEGRATION_VERCEL_TOKEN_URL,
new URLSearchParams({
code: code,
@ -240,7 +240,7 @@ const exchangeCodeVercel = async ({ code }: { code: string }) => {
*/
const exchangeCodeNetlify = async ({ code }: { code: string }) => {
const res: ExchangeCodeNetlifyResponse = (
await request.post(
await standardRequest.post(
INTEGRATION_NETLIFY_TOKEN_URL,
new URLSearchParams({
grant_type: "authorization_code",
@ -252,14 +252,14 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
)
).data;
const res2 = await request.get("https://api.netlify.com/api/v1/sites", {
const res2 = await standardRequest.get("https://api.netlify.com/api/v1/sites", {
headers: {
Authorization: `Bearer ${res.access_token}`,
},
});
const res3 = (
await request.get("https://api.netlify.com/api/v1/accounts", {
await standardRequest.get("https://api.netlify.com/api/v1/accounts", {
headers: {
Authorization: `Bearer ${res.access_token}`,
},
@ -287,7 +287,7 @@ const exchangeCodeNetlify = async ({ code }: { code: string }) => {
*/
const exchangeCodeGithub = async ({ code }: { code: string }) => {
const res: ExchangeCodeGithubResponse = (
await request.get(INTEGRATION_GITHUB_TOKEN_URL, {
await standardRequest.get(INTEGRATION_GITHUB_TOKEN_URL, {
params: {
client_id: await getClientIdGitHub(),
client_secret: await getClientSecretGitHub(),
@ -321,7 +321,7 @@ const exchangeCodeGithub = async ({ code }: { code: string }) => {
const exchangeCodeGitlab = async ({ code }: { code: string }) => {
const accessExpiresAt = new Date();
const res: ExchangeCodeGitlabResponse = (
await request.post(
await standardRequest.post(
INTEGRATION_GITLAB_TOKEN_URL,
new URLSearchParams({
grant_type: "authorization_code",

View File

@ -1,4 +1,4 @@
import request from "../config/request";
import { standardRequest } from "../config/request";
import { IIntegrationAuth } from "../models";
import {
INTEGRATION_AZURE_KEY_VAULT,
@ -121,7 +121,7 @@ const exchangeRefreshAzure = async ({
refreshToken: string;
}) => {
const accessExpiresAt = new Date();
const { data }: { data: RefreshTokenAzureResponse } = await request.post(
const { data }: { data: RefreshTokenAzureResponse } = await standardRequest.post(
INTEGRATION_AZURE_TOKEN_URL,
new URLSearchParams({
client_id: await getClientIdAzure(),
@ -158,7 +158,7 @@ const exchangeRefreshHeroku = async ({
data,
}: {
data: RefreshTokenHerokuResponse;
} = await request.post(
} = await standardRequest.post(
INTEGRATION_HEROKU_TOKEN_URL,
new URLSearchParams({
grant_type: "refresh_token",
@ -193,7 +193,7 @@ const exchangeRefreshGitLab = async ({
data,
}: {
data: RefreshTokenGitLabResponse;
} = await request.post(
} = await standardRequest.post(
INTEGRATION_GITLAB_TOKEN_URL,
new URLSearchParams({
grant_type: "refresh_token",

View File

@ -37,8 +37,7 @@ import {
INTEGRATION_TRAVISCI_API_URL,
INTEGRATION_SUPABASE_API_URL
} from "../variables";
import request from '../config/request';
import axios from "axios";
import { standardRequest } from '../config/request';
/**
* Sync/push [secrets] to [app] in integration named [integration]
@ -215,7 +214,7 @@ const syncSecretsAzureKeyVault = async ({
let result: GetAzureKeyVaultSecret[] = [];
try {
while (url) {
const res = await request.get(url, {
const res = await standardRequest.get(url, {
headers: {
Authorization: `Bearer ${accessToken}`
}
@ -242,7 +241,7 @@ const syncSecretsAzureKeyVault = async ({
lastSlashIndex = getAzureKeyVaultSecret.id.lastIndexOf('/');
}
const azureKeyVaultSecret = await request.get(`${getAzureKeyVaultSecret.id}?api-version=7.3`, {
const azureKeyVaultSecret = await standardRequest.get(`${getAzureKeyVaultSecret.id}?api-version=7.3`, {
headers: {
'Authorization': `Bearer ${accessToken}`
}
@ -308,7 +307,7 @@ const syncSecretsAzureKeyVault = async ({
while (!isSecretSet && maxTries > 0) {
// try to set secret
try {
await request.put(
await standardRequest.put(
`${integration.app}/secrets/${key}?api-version=7.3`,
{
value
@ -325,7 +324,7 @@ const syncSecretsAzureKeyVault = async ({
} catch (err) {
const error: any = err;
if (error?.response?.data?.error?.innererror?.code === 'ObjectIsDeletedButRecoverable') {
await request.post(
await standardRequest.post(
`${integration.app}/deletedsecrets/${key}/recover?api-version=7.3`, {},
{
headers: {
@ -355,7 +354,7 @@ const syncSecretsAzureKeyVault = async ({
for await (const deleteSecret of deleteSecrets) {
const { key } = deleteSecret;
await request.delete(`${integration.app}/secrets/${key}?api-version=7.3`, {
await standardRequest.delete(`${integration.app}/secrets/${key}?api-version=7.3`, {
headers: {
'Authorization': `Bearer ${accessToken}`
}
@ -568,7 +567,7 @@ const syncSecretsHeroku = async ({
}) => {
try {
const herokuSecrets = (
await request.get(
await standardRequest.get(
`${INTEGRATION_HEROKU_API_URL}/apps/${integration.app}/config-vars`,
{
headers: {
@ -586,7 +585,7 @@ const syncSecretsHeroku = async ({
}
});
await request.patch(
await standardRequest.patch(
`${INTEGRATION_HEROKU_API_URL}/apps/${integration.app}/config-vars`,
secrets,
{
@ -642,7 +641,7 @@ const syncSecretsVercel = async ({
: {}),
};
const vercelSecrets: VercelSecret[] = (await request.get(
const vercelSecrets: VercelSecret[] = (await standardRequest.get(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env`,
{
params,
@ -675,7 +674,7 @@ const syncSecretsVercel = async ({
for await (const vercelSecret of vercelSecrets) {
if (vercelSecret.type === 'encrypted') {
// case: secret is encrypted -> need to decrypt
const decryptedSecret = (await request.get(
const decryptedSecret = (await standardRequest.get(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${vercelSecret.id}`,
{
params,
@ -747,7 +746,7 @@ const syncSecretsVercel = async ({
// Sync/push new secrets
if (newSecrets.length > 0) {
await request.post(
await standardRequest.post(
`${INTEGRATION_VERCEL_API_URL}/v10/projects/${integration.app}/env`,
newSecrets,
{
@ -763,7 +762,7 @@ const syncSecretsVercel = async ({
for await (const secret of updateSecrets) {
if (secret.type !== 'sensitive') {
const { id, ...updatedSecret } = secret;
await request.patch(
await standardRequest.patch(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
updatedSecret,
{
@ -778,7 +777,7 @@ const syncSecretsVercel = async ({
}
for await (const secret of deleteSecrets) {
await request.delete(
await standardRequest.delete(
`${INTEGRATION_VERCEL_API_URL}/v9/projects/${integration.app}/env/${secret.id}`,
{
params,
@ -837,7 +836,7 @@ const syncSecretsNetlify = async ({
});
const res = (
await request.get(
await standardRequest.get(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env`,
{
params: getParams,
@ -951,7 +950,7 @@ const syncSecretsNetlify = async ({
});
if (newSecrets.length > 0) {
await request.post(
await standardRequest.post(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env`,
newSecrets,
{
@ -966,7 +965,7 @@ const syncSecretsNetlify = async ({
if (updateSecrets.length > 0) {
updateSecrets.forEach(async (secret: NetlifySecret) => {
await request.patch(
await standardRequest.patch(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${secret.key}`,
{
context: secret.values[0].context,
@ -985,7 +984,7 @@ const syncSecretsNetlify = async ({
if (deleteSecrets.length > 0) {
deleteSecrets.forEach(async (key: string) => {
await request.delete(
await standardRequest.delete(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${key}`,
{
params: syncParams,
@ -1000,7 +999,7 @@ const syncSecretsNetlify = async ({
if (deleteSecretValues.length > 0) {
deleteSecretValues.forEach(async (secret: NetlifySecret) => {
await request.delete(
await standardRequest.delete(
`${INTEGRATION_NETLIFY_API_URL}/api/v1/accounts/${integrationAuth.accountId}/env/${secret.key}/value/${secret.values[0].id}`,
{
params: syncParams,
@ -1151,7 +1150,7 @@ const syncSecretsRender = async ({
accessToken: string;
}) => {
try {
await request.put(
await standardRequest.put(
`${INTEGRATION_RENDER_API_URL}/v1/services/${integration.appId}/env-vars`,
Object.keys(secrets).map((key) => ({
key,
@ -1203,7 +1202,7 @@ const syncSecretsRailway = async ({
variables: secrets
};
await request.post(INTEGRATION_RAILWAY_API_URL, {
await standardRequest.post(INTEGRATION_RAILWAY_API_URL, {
query,
variables: {
input,
@ -1261,7 +1260,7 @@ const syncSecretsFlyio = async ({
}
`;
await request.post(INTEGRATION_FLYIO_API_URL, {
await standardRequest.post(INTEGRATION_FLYIO_API_URL, {
query: SetSecrets,
variables: {
input: {
@ -1296,7 +1295,7 @@ const syncSecretsFlyio = async ({
}
}`;
const getSecretsRes = (await request.post(INTEGRATION_FLYIO_API_URL, {
const getSecretsRes = (await standardRequest.post(INTEGRATION_FLYIO_API_URL, {
query: GetSecrets,
variables: {
appName: integration.app,
@ -1332,7 +1331,7 @@ const syncSecretsFlyio = async ({
}
}`;
await request.post(INTEGRATION_FLYIO_API_URL, {
await standardRequest.post(INTEGRATION_FLYIO_API_URL, {
query: DeleteSecrets,
variables: {
input: {
@ -1373,7 +1372,7 @@ const syncSecretsCircleCI = async ({
}) => {
try {
const circleciOrganizationDetail = (
await request.get(`${INTEGRATION_CIRCLECI_API_URL}/v2/me/collaborations`, {
await standardRequest.get(`${INTEGRATION_CIRCLECI_API_URL}/v2/me/collaborations`, {
headers: {
"Circle-Token": accessToken,
"Accept-Encoding": "application/json",
@ -1386,7 +1385,7 @@ const syncSecretsCircleCI = async ({
// sync secrets to CircleCI
Object.keys(secrets).forEach(
async (key) =>
await request.post(
await standardRequest.post(
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar`,
{
name: key,
@ -1403,7 +1402,7 @@ const syncSecretsCircleCI = async ({
// get secrets from CircleCI
const getSecretsRes = (
await request.get(
await standardRequest.get(
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar`,
{
headers: {
@ -1417,7 +1416,7 @@ const syncSecretsCircleCI = async ({
// delete secrets from CircleCI
getSecretsRes.forEach(async (sec: any) => {
if (!(sec.name in secrets)) {
await request.delete(
await standardRequest.delete(
`${INTEGRATION_CIRCLECI_API_URL}/v2/project/${slug}/${integration.app}/envvar/${sec.name}`,
{
headers: {
@ -1454,7 +1453,7 @@ const syncSecretsTravisCI = async ({
try {
// get secrets from travis-ci
const getSecretsRes = (
await request.get(
await standardRequest.get(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars?repository_id=${integration.appId}`,
{
headers: {
@ -1476,7 +1475,7 @@ const syncSecretsTravisCI = async ({
if (!(key in getSecretsRes)) {
// case: secret does not exist in travis ci
// -> add secret
await request.post(
await standardRequest.post(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars?repository_id=${integration.appId}`,
{
env_var: {
@ -1495,7 +1494,7 @@ const syncSecretsTravisCI = async ({
} else {
// case: secret exists in travis ci
// -> update/set secret
await request.patch(
await standardRequest.patch(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars/${getSecretsRes[key].id}?repository_id=${getSecretsRes[key].repository_id}`,
{
env_var: {
@ -1517,7 +1516,7 @@ const syncSecretsTravisCI = async ({
for await (const key of Object.keys(getSecretsRes)) {
if (!(key in secrets)){
// delete secret
await request.delete(
await standardRequest.delete(
`${INTEGRATION_TRAVISCI_API_URL}/settings/env_vars/${getSecretsRes[key].id}?repository_id=${getSecretsRes[key].repository_id}`,
{
headers: {
@ -1562,7 +1561,7 @@ const syncSecretsGitLab = async ({
// get secrets from gitlab
const getSecretsRes: GitLabSecret[] = (
await request.get(
await standardRequest.get(
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables`,
{
headers: {
@ -1580,7 +1579,7 @@ const syncSecretsGitLab = async ({
for await (const key of Object.keys(secrets)) {
const existingSecret = getSecretsRes.find((s: any) => s.key == key);
if (!existingSecret) {
await request.post(
await standardRequest.post(
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables`,
{
key: key,
@ -1601,7 +1600,7 @@ const syncSecretsGitLab = async ({
} else {
// update secret
if (secrets[key] !== existingSecret.value) {
await request.put(
await standardRequest.put(
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables/${existingSecret.key}?filter[environment_scope]=${integration.targetEnvironment}`,
{
...existingSecret,
@ -1622,7 +1621,7 @@ const syncSecretsGitLab = async ({
// delete secrets
for await (const sec of getSecretsRes) {
if (!(sec.key in secrets)) {
await request.delete(
await standardRequest.delete(
`${INTEGRATION_GITLAB_API_URL}/v4/projects/${integration?.appId}/variables/${sec.key}?filter[environment_scope]=${integration.targetEnvironment}`,
{
headers: {
@ -1657,7 +1656,7 @@ const syncSecretsSupabase = async ({
accessToken: string;
}) => {
try {
const { data: getSecretsRes } = await request.get(
const { data: getSecretsRes } = await standardRequest.get(
`${INTEGRATION_SUPABASE_API_URL}/v1/projects/${integration.appId}/secrets`,
{
headers: {
@ -1677,7 +1676,7 @@ const syncSecretsSupabase = async ({
}
);
await request.post(
await standardRequest.post(
`${INTEGRATION_SUPABASE_API_URL}/v1/projects/${integration.appId}/secrets`,
modifiedFormatForSecretInjection,
{
@ -1695,7 +1694,7 @@ const syncSecretsSupabase = async ({
}
});
await request.delete(
await standardRequest.delete(
`${INTEGRATION_SUPABASE_API_URL}/v1/projects/${integration.appId}/secrets`,
{
headers: {

View File

@ -5,7 +5,7 @@ import {
INTEGRATION_GITLAB,
INTEGRATION_GITLAB_API_URL
} from '../variables';
import request from '../config/request';
import { standardRequest } from '../config/request';
interface Team {
name: string;
@ -56,7 +56,7 @@ const getTeamsGitLab = async ({
accessToken: string;
}) => {
let teams: Team[] = [];
const res = (await request.get(
const res = (await standardRequest.get(
`${INTEGRATION_GITLAB_API_URL}/v4/groups`,
{
headers: {

View File

@ -1,8 +1,6 @@
import { Request, Response, NextFunction } from 'express';
import { Types } from 'mongoose';
import { validateMembership } from '../helpers/membership';
import { validateClientForWorkspace } from '../helpers/workspace';
import { UnauthorizedRequestError } from '../utils/errors';
type req = 'params' | 'body' | 'query';
@ -31,7 +29,7 @@ const requireWorkspaceAuth = ({
const environment = locationEnvironment ? req[locationEnvironment]?.environment : undefined;
// validate clients
const { membership } = await validateClientForWorkspace({
const { membership, workspace } = await validateClientForWorkspace({
authData: req.authData,
workspaceId: new Types.ObjectId(workspaceId),
environment,
@ -43,6 +41,10 @@ const requireWorkspaceAuth = ({
if (membership) {
req.membership = membership;
}
if (workspace) {
req.workspace = workspace;
}
return next();
};

View File

@ -21,6 +21,7 @@ export interface IIntegration {
workspace: Types.ObjectId;
environment: string;
isActive: boolean;
url: string;
app: string;
appId: string;
owner: string;
@ -63,6 +64,11 @@ const integrationSchema = new Schema<IIntegration>(
type: Boolean,
required: true,
},
url: {
// for custom self-hosted integrations (e.g. self-hosted GitHub enterprise)
type: String,
default: null
},
app: {
// name of app in provider
type: String,

View File

@ -0,0 +1,20 @@
# MANAGED BY INFISICAL CLI (Do not modify): START
infisicalScanEnabled=$(git config --bool hooks.infisical-scan)
if [ "$infisicalScanEnabled" != "false" ]; then
infisical scan git-changes -v --staged
exitCode=$?
if [ $exitCode -eq 1 ]; then
echo "Commit blocked: Infisical scan has uncovered secrets in your git commit"
echo "To disable the Infisical scan precommit hook run the following command:"
echo ""
echo " git config hooks.infisical-scan false"
echo ""
exit 1
fi
else
echo 'Warning: infisical scan precommit disabled'
fi
# MANAGED BY INFISICAL CLI (Do not modify): END

View File

@ -0,0 +1,20 @@
#!/bin/sh
# MANAGED BY INFISICAL CLI (Do not modify): START
infisicalScanEnabled=$(git config --bool hooks.infisical-scan)
if [ "$infisicalScanEnabled" != "false" ]; then
infisical scan git-changes -v --staged
exitCode=$?
if [ $exitCode -eq 1 ]; then
echo "Commit blocked: Infisical scan has uncovered secrets in your git commit"
echo "To disable the Infisical scan precommit hook run the following command:"
echo ""
echo " git config hooks.infisical-scan false"
echo ""
exit 1
fi
else
echo 'Warning: infisical scan precommit disabled'
fi
# MANAGED BY INFISICAL CLI (Do not modify): END

View File

@ -126,7 +126,7 @@ var runCmd = &cobra.Command{
log.Debug().Msgf("injecting the following environment variables into shell: %v", env)
Telemetry.CaptureEvent("cli-command:run", posthog.NewProperties().Set("secretsCount", len(secrets)).Set("environment", environmentName).Set("isUsingServiceToken", infisicalToken != "").Set("single-command", strings.Join(args, " ")).Set("multi-command", cmd.Flag("command").Value.String()))
Telemetry.CaptureEvent("cli-command:run", posthog.NewProperties().Set("secretsCount", len(secrets)).Set("environment", environmentName).Set("isUsingServiceToken", infisicalToken != "").Set("single-command", strings.Join(args, " ")).Set("multi-command", cmd.Flag("command").Value.String()).Set("version", util.CLI_VERSION))
if cmd.Flags().Changed("command") {
command := cmd.Flag("command").Value.String()

View File

@ -23,7 +23,11 @@
package cmd
import (
_ "embed"
"fmt"
"io/ioutil"
"os"
"os/exec"
"path/filepath"
"strings"
"time"
@ -32,6 +36,7 @@ import (
"github.com/Infisical/infisical-merge/detect"
"github.com/Infisical/infisical-merge/packages/util"
"github.com/Infisical/infisical-merge/report"
"github.com/manifoldco/promptui"
"github.com/posthog/posthog-go"
"github.com/rs/zerolog/log"
"github.com/spf13/cobra"
@ -45,6 +50,17 @@ order of precedence:
3. (--source/-s)/.infisical-scan.toml
If none of the three options are used, then Infisical will use the default scan config`
//go:embed pre-commit-script/pre-commit.sh
var preCommitTemplate []byte
//go:embed pre-commit-script/pre-commit-without-bang.sh
var preCommitTemplateAppend []byte
const (
defaultHooksPath = ".git/hooks/"
preCommitFile = "pre-commit"
)
func init() {
// scan flag for only scan command
scanCmd.Flags().String("log-opts", "", "git log options")
@ -77,6 +93,9 @@ func init() {
// add flags to main
scanCmd.AddCommand(scanGitChangesCmd)
rootCmd.AddCommand(scanCmd)
installCmd.Flags().Bool("pre-commit-hook", false, "installs pre commit hook for Git repository")
scanCmd.AddCommand(installCmd)
}
func initScanConfig(cmd *cobra.Command) {
@ -132,6 +151,50 @@ func initScanConfig(cmd *cobra.Command) {
}
}
var installCmd = &cobra.Command{
Use: "install",
Short: "Install scanning scripts and tools. Use --help flag to see all options",
Args: cobra.ExactArgs(0),
Run: func(cmd *cobra.Command, args []string) {
installPrecommit := cmd.Flags().Changed("pre-commit-hook")
if installPrecommit {
hooksPath, err := getHooksPath()
if err != nil {
fmt.Printf("Error: %s\n", err)
return
}
if hooksPath != ".git/hooks" {
defaultHookOverride, err := overrideDefaultHooksPath(hooksPath)
if err != nil {
fmt.Printf("Error: %s\n", err)
}
if defaultHookOverride {
ConfigureGitHooksPath()
log.Info().Msgf("To switch back previous githooks manager run: git config core.hooksPath %s\n", hooksPath)
return
} else {
log.Warn().Msgf("To automatically configure this hook, you need to switch the path of the Hooks. Alternatively, you can manually configure this hook by setting your pre-commit script to run command [infisical scan git-changes -v --staged].\n")
return
}
}
err = createOrUpdatePreCommitFile(hooksPath)
if err != nil {
fmt.Printf("Error: %s\n", err)
return
}
log.Info().Msgf("Pre-commit hook successfully added. Infisical scan should now run on each commit you make\n")
Telemetry.CaptureEvent("cli-command:install --pre-commit-hook", posthog.NewProperties().Set("version", util.CLI_VERSION))
return
}
}}
var scanCmd = &cobra.Command{
Use: "scan",
Short: "Scan for leaked secrets in git history, directories, and files",
@ -417,3 +480,107 @@ func FormatDuration(d time.Duration) string {
}
return d.Round(scale / 100).String()
}
func overrideDefaultHooksPath(managedHook string) (bool, error) {
YES := "Yes"
NO := "No"
options := []string{YES, NO}
optionsPrompt := promptui.Select{
Label: fmt.Sprintf("Your hooks path is set to [%s] but needs to be [.git/hooks] for automatic configuration. Would you like to switch? ", managedHook),
Items: options,
Size: 2,
}
_, selectedOption, err := optionsPrompt.Run()
if err != nil {
return false, err
}
return selectedOption == YES, err
}
func ConfigureGitHooksPath() {
cmd := exec.Command("git", "config", "core.hooksPath", ".git/hooks")
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
if err := cmd.Run(); err != nil {
log.Fatal().Msgf("Failed to configure git hooks path: %v", err)
}
}
// GetGitRoot returns the root directory of the current Git repository.
func GetGitRoot() (string, error) {
cmd := exec.Command("git", "rev-parse", "--show-toplevel")
output, err := cmd.Output()
if err != nil {
return "", fmt.Errorf("failed to get git root directory: %w", err)
}
gitRoot := strings.TrimSpace(string(output)) // Remove any trailing newline
return gitRoot, nil
}
func getHooksPath() (string, error) {
out, err := exec.Command("git", "config", "core.hooksPath").Output()
if err != nil {
if len(out) == 0 {
out = []byte(".git/hooks") // set the default hook
} else {
log.Error().Msgf("Failed to get Git hooks path: %s\nOutput: %s\n", err, out)
}
}
hooksPath := strings.TrimSpace(string(out))
return hooksPath, nil
}
func createOrUpdatePreCommitFile(hooksPath string) error {
// File doesn't exist, create a new one
rootGitRepoPath, err := GetGitRoot()
if err != nil {
return err
}
filePath := fmt.Sprintf("%s/%s/%s", rootGitRepoPath, hooksPath, preCommitFile)
_, err = os.Stat(filePath)
if err == nil {
// File already exists, check if it contains the managed comments
content, err := ioutil.ReadFile(filePath)
if err != nil {
return fmt.Errorf("failed to read pre-commit file: %s", err)
}
if strings.Contains(string(content), "# MANAGED BY INFISICAL CLI (Do not modify): START") &&
strings.Contains(string(content), "# MANAGED BY INFISICAL CLI (Do not modify): END") {
return nil
}
// File already exists, append the template content
file, err := os.OpenFile(filePath, os.O_APPEND|os.O_WRONLY, 0755)
if err != nil {
return fmt.Errorf("failed to open pre-commit file: %s", err)
}
defer file.Close()
_, err = file.Write(preCommitTemplateAppend)
if err != nil {
return fmt.Errorf("failed to append to pre-commit file: %s", err)
}
} else if os.IsNotExist(err) {
err = os.WriteFile(filePath, preCommitTemplate, 0755)
if err != nil {
return fmt.Errorf("failed to create pre-commit file: %s", err)
}
} else {
// Error occurred while checking file status
return fmt.Errorf("failed to check pre-commit file status: %s", err)
}
return nil
}

View File

@ -1,24 +1,36 @@
package telemetry
import (
"os"
"github.com/Infisical/infisical-merge/packages/util"
"github.com/denisbrodbeck/machineid"
"github.com/posthog/posthog-go"
"github.com/rs/zerolog/log"
)
var POSTHOG_API_KEY_FOR_CLI string
type Telemetry struct {
isEnabled bool
posthogClient posthog.Client
}
type NoOpLogger struct{}
func (NoOpLogger) Logf(format string, args ...interface{}) {
log.Debug().Msgf(format, args...)
}
func (NoOpLogger) Errorf(format string, args ...interface{}) {
log.Debug().Msgf(format, args...)
}
func NewTelemetry(telemetryIsEnabled bool) *Telemetry {
posthogAPIKey := os.Getenv("POSTHOG_API_KEY_FOR_CLI")
if posthogAPIKey != "" {
if POSTHOG_API_KEY_FOR_CLI != "" {
client, _ := posthog.NewWithConfig(
posthogAPIKey,
posthog.Config{},
POSTHOG_API_KEY_FOR_CLI,
posthog.Config{
Logger: NoOpLogger{},
},
)
return &Telemetry{isEnabled: telemetryIsEnabled, posthogClient: client}
@ -53,13 +65,13 @@ func (t *Telemetry) GetDistinctId() (string, error) {
outputErr = err
}
userDetails, err := util.GetCurrentLoggedInUserDetails()
infisicalConfig, err := util.GetConfigFile()
if err != nil {
outputErr = err
}
if userDetails.IsUserLoggedIn && userDetails.UserCredentials.Email != "" {
distinctId = userDetails.UserCredentials.Email
if infisicalConfig.LoggedInUserEmail != "" {
distinctId = infisicalConfig.LoggedInUserEmail
} else if machineId != "" {
distinctId = "anonymous_cli_" + machineId
} else {

View File

@ -37,6 +37,8 @@ services:
- MONGO_URL=mongodb://root:example@mongo:27017/?authSource=admin
networks:
- infisical-dev
extra_hosts:
- "host.docker.internal:host-gateway"
frontend:
container_name: infisical-dev-frontend

View File

@ -0,0 +1,104 @@
---
title: "scan git-changes"
description: "Scan for secrets in your uncommitted code"
---
```bash
infisical scan git-changes
# Display the full secret findings
infisical scan git-changes --verbose
```
## Description
Scanning for secrets before you commit your changes is a great way to prevent leaks. Infisical makes this easy with the sub command `git-changes`.
The `git-changes` sub command scans for uncommitted changes in a Git repository and is designed especially for use on developer machines, aligning with the ['shift left'](https://cloud.google.com/architecture/devops/devops-tech-shifting-left-on-security) security approach.
When `git-changes` is run on a Git repository, Infisical parses the output from a `git diff` command.
To scan changes in commits that have been staged via `git add`, you can add the `--staged` flag to the sub command. This flag is particularly useful when using Infisical CLI as a pre-commit tool.
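For instance, a typical invocation right before committing might look like the sketch below (the staging step is illustrative):
```bash
# Stage your work, then scan only what is about to be committed
git add .
infisical scan git-changes --staged --verbose
```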
### Flags
<Accordion title="--staged">
**Description**
detect secrets in a --staged state
Default value: `false`
</Accordion>
<Accordion title="--log-opts">
**Description**
git log options
</Accordion>
<Accordion title="--baseline-path">
Short hand: `-b`
**Description**
path to baseline with issues that can be ignored
</Accordion>
<Accordion title="--config">
Short hand: `-c`
**Description**
config file path
order of precedence:
1. --config flag
2. env var INFISICAL_SCAN_CONFIG
3. (--source/-s)/.infisical-scan.toml
If none of the three options are used, then Infisical will use the default config
</Accordion>
<Accordion title="--exit-code">
**Description**
exit code when leaks have been encountered (default 1)
</Accordion>
<Accordion title="--max-target-megabytes">
**Description**
files larger than this will be skipped
</Accordion>
<Accordion title="--no-color">
**Description**
turn off color for verbose output
</Accordion>
<Accordion title="--redact">
**Description**
redact secrets from logs and stdout
</Accordion>
<Accordion title="--report-format">
**Description**
output format (json, csv, sarif) (default "json")
</Accordion>
<Accordion title="--report-path">
**Description**
report file
</Accordion>
<Accordion title="--source">
**Description**
path to source (default ".")
</Accordion>
<Accordion title="--verbose">
**Description**
show verbose output from scan
</Accordion>

View File

@ -0,0 +1,23 @@
---
title: "scan install"
description: "Add various scanning tools seamlessly into your development lifecycle"
---
```bash
infisical scan install --pre-commit-hook
```
## Description
The command `infisical scan install` is designed to incorporate various scanning tools seamlessly into your development lifecycle.
Initially, we are offering users the ability to install a pre-commit hook. This hook automatically scans your staged changes for any exposed secrets before the commit is created.
### Flags
<Accordion title="--pre-commit-hook">
```bash
infisical scan install --pre-commit-hook
```
**Description**
Installs a git pre-commit hook that triggers Infisical to scan your staged changes for any exposed secrets before the commit is created.
</Accordion>

126
docs/cli/commands/scan.mdx Normal file
View File

@ -0,0 +1,126 @@
---
title: "scan"
description: "Scan git history, directories, and files for secrets"
---
```bash
infisical scan
# Display the full secret findings
infisical scan --verbose
```
## Description
The `infisical scan` command serves to scan repositories, directories, and files. It's compatible with both individual developer machines and Continuous Integration (CI) environments.
When you run `infisical scan` on a Git repository, Infisical parses the output of a `git log -p` command. This command generates [patches](https://stackoverflow.com/questions/8279602/what-is-a-patch-in-git-version-control) that Infisical uses to identify secrets in your code.
You can configure the range of commits that `git log` will cover using the `--log-opts` flag.
Any options you can use with `git log -p` are valid for `--log-opts`.
For instance, to instruct Infisical to scan a specific range of commits, use the following command: `infisical scan --log-opts="--all commitA..commitB"`. For more details, refer to the [Git log documentation](https://git-scm.com/docs/git-log).
To scan individual files and directories, use the `--no-git` flag.
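For example, a directory scan might look like the sketch below; the directory path and report file name are illustrative placeholders:
```bash
# Scan a plain directory (no git history) and write a JSON report
infisical scan --no-git --source ./my-project --report-path report.json
```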
### Flags
<Accordion title="--log-opts">
**Description**
git log options
</Accordion>
<Accordion title="--no-git">
**Description**
treat git repo as a regular directory and scan those files, --log-opts has no effect on the scan when --no-git is set
Default value: `false`
</Accordion>
<Accordion title="--pipe">
**Description**
scan input from stdin, ex: `cat some_file | infisical scan --pipe`
Default value: `false`
</Accordion>
<Accordion title="--follow-symlinks">
**Description**
scan files that are symlinks to other files
Default value: `false`
</Accordion>
<Accordion title="--baseline-path">
Short hand: `-b`
**Description**
path to baseline with issues that can be ignored
</Accordion>
<Accordion title="--config">
Short hand: `-c`
**Description**
config file path
order of precedence:
1. --config flag
2. env var INFISICAL_SCAN_CONFIG
3. (--source/-s)/.infisical-scan.toml
If none of the three options are used, then Infisical will use the default config
</Accordion>
<Accordion title="--exit-code">
**Description**
exit code when leaks have been encountered (default 1)
</Accordion>
<Accordion title="--max-target-megabytes">
**Description**
files larger than this will be skipped
</Accordion>
<Accordion title="--no-color">
**Description**
turn off color for verbose output
</Accordion>
<Accordion title="--redact">
**Description**
redact secrets from logs and stdout
</Accordion>
<Accordion title="--report-format">
**Description**
output format (json, csv, sarif) (default "json")
</Accordion>
<Accordion title="--report-path">
**Description**
report file
</Accordion>
<Accordion title="--source">
**Description**
path to source (default ".")
</Accordion>
<Accordion title="--verbose">
**Description**
show verbose output from scan
</Accordion>

View File

@ -0,0 +1,244 @@
---
title: 'Secret scanning'
description: "Scan and prevent secret leaks in your code base"
---
Building upon its core functionality of fetching and injecting secrets into your applications, Infisical now takes a significant step forward in bolstering your code security.
We've enhanced our CLI tool to include a powerful scanning feature, capable of identifying more than 140 different types of secret leaks in your codebase.
In addition to scanning for past leaks, this new addition also actively aids in preventing future leaks.
# Scanning
<Tabs>
<Tab title="Scanning files, directories and or git history">
```bash
infisical scan
# Display the full secret findings
infisical scan --verbose
```
The `infisical scan` command serves to scan repositories, directories, and files. It's compatible with both individual developer machines and Continuous Integration (CI) environments.
When you run `infisical scan` on a Git repository, Infisical parses the output of a `git log -p` command. This command generates [patches](https://stackoverflow.com/questions/8279602/what-is-a-patch-in-git-version-control) that Infisical uses to identify secrets in your code.
You can configure the range of commits that `git log` will cover using the `--log-opts` flag.
Any options you can use with `git log -p` are valid for `--log-opts`.
For instance, to instruct Infisical to scan a specific range of commits, use the following command: `infisical scan --log-opts="--all commitA..commitB"`. For more details, refer to the [Git log documentation](https://git-scm.com/docs/git-log).
To scan individual files and directories, use the `--no-git` flag.
**View [full details for this command](./commands/scan)**
</Tab>
<Tab title="Scanning uncommitted files ">
```bash
infisical scan git-changes
# Display the full secret findings
infisical scan git-changes --verbose
```
Scanning for secrets before you commit your changes is a great way to prevent leaks. Infisical makes this easy with the sub command `git-changes`.
The `git-changes` sub command scans for uncommitted changes in a Git repository and is designed especially for use on developer machines, aligning with the ['shift left'](https://cloud.google.com/architecture/devops/devops-tech-shifting-left-on-security) security approach.
When `git-changes` is run on a Git repository, Infisical parses the output from a `git diff` command.
To scan changes in commits that have been staged via `git add`, you can add the `--staged` flag to the sub command. This flag is particularly useful when using Infisical CLI as a pre-commit tool.
**View [full details for this command](./commands/scan-git-changes)**
<Info>
The `git-changes` command works only on Git repositories; using it on plain files or directories will result in an error.
</Info>
</Tab>
</Tabs>
#
#
# Automatically scan changes before you commit
To lower the risk of committing hardcoded secrets to your code repository, we have designed a custom git pre-commit hook.
This hook scans the changes you're about to commit for any exposed secrets. If any hardcoded secrets are detected, it will block your commit.
### Install pre-commit hook
To install this git hook, go into your local git repository and run the following command.
```bash
infisical scan install --pre-commit-hook
```
To disable this hook after installing it, run the command `git config --bool hooks.infisical-scan false`
### Third party hooks management
If you prefer to manage your pre-commit hook outside of the .git/hooks directory, you can easily accomplish this by adding the following command to your pre-commit script
```bash
infisical scan git-changes --staged --verbose
```
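As one illustration, if you manage hooks with a tool such as Husky, the hook file can simply wrap that command. This is a minimal sketch assuming Husky's default `.husky/pre-commit` layout:
```bash
#!/usr/bin/env sh
# .husky/pre-commit: run the Infisical staged-changes scan before each commit
infisical scan git-changes --staged --verbose
```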
#
#
# Creating a baseline
When scanning large repositories or repositories with a long history, it can be helpful to use a baseline.
A baseline allows Infisical to ignore any old findings that are already present in the baseline findings. You can create an Infisical scan report by running `infisical scan` with the `--report-path` flag.
To create an Infisical scan report and save it in a file called leaks-report.json, use the following command:
```
infisical scan --report-path leaks-report.json
```
Once a baseline is created, you can apply it when running the `infisical scan` command again. Use the following command:
```
infisical scan --baseline-path leaks-report.json --report-path findings.json
```
After running the `scan` command with the `--baseline-path` flag, the report output in findings.json will only contain new issues.
#
#
# Configuration file
To customize the scan, such as specifying your own rules or establishing exceptions for certain files or paths that should not be flagged as risks, you can define these specifications in the configuration file.
<Accordion title="Example custom configuration file">
```toml infisical-scan.toml
# Title for the configuration file
title = "Some title"
# This configuration is the foundation that can be expanded. If there are any overlapping rules
# between this base and the expanded configuration, the rules in this base will take priority.
# Another aspect of extending configurations is the ability to link multiple files, up to a depth of 2.
# "Allowlist" arrays get appended and may have repeated elements.
# "useDefault" and "path" cannot be used simultaneously. Please choose one.
[extend]
# useDefault will extend the base configuration with the default config:
# https://raw.githubusercontent.com/Infisical/infisical/main/cli/config/infisical-scan.toml
useDefault = true
# or you can supply a path to a configuration. Path is relative to where infisical cli
# was invoked, not the location of the base config.
path = "common_config.toml"
# An array of tables that contain information that define instructions
# on how to detect secrets
[[rules]]
# Unique identifier for this rule
id = "some-identifier-for-rule"
# Short human readable description of the rule.
description = "awesome rule 1"
# Golang regular expression used to detect secrets. Note Golang's regex engine
# does not support lookaheads.
regex = '''one-go-style-regex-for-this-rule'''
# Golang regular expression used to match paths. This can be used as a standalone rule or it can be used
# in conjunction with a valid `regex` entry.
path = '''a-file-path-regex'''
# Array of strings used for metadata and reporting purposes.
tags = ["tag","another tag"]
# A regex match may have many groups; this specifies which group contains the secret
# so that its entropy can be checked if `entropy` is set.
secretGroup = 3
# Float representing the minimum shannon entropy a regex group must have to be considered a secret.
# Shannon entropy measures how random data is. Since secrets are usually composed of many random characters, they typically have high entropy.
entropy = 3.5
# Keywords are used for pre-regex check filtering.
# If a rule has keywords but the text fragment being scanned doesn't contain at least one of its keywords, it will be skipped from further processing.
# Ideally these values should either be part of the identifier or unique strings specific to the rule's regex
# (introduced in v8.6.0)
keywords = [
"auth",
"password",
"token",
]
# You can include an allowlist table for a single rule to reduce false positives or ignore commits
# with known/rotated secrets
[rules.allowlist]
description = "ignore commit A"
commits = [ "commit-A", "commit-B"]
paths = [
'''go\.mod''',
'''go\.sum'''
]
# note: (rule) regexTarget defaults to check the _Secret_ in the finding.
# if regexTarget is not specified then _Secret_ will be used.
# Acceptable values for regexTarget are "match" and "line"
regexTarget = "match"
regexes = [
'''process''',
'''getenv''',
]
# note: stopwords targets the extracted secret, not the entire regex match
# if the extracted secret is found in the stopwords list, the finding will be skipped (i.e. not included in the report)
stopwords = [
'''client''',
'''endpoint''',
]
# This is a global allowlist which has a higher order of precedence than rule-specific allowlists.
# If a commit listed in the `commits` field below is encountered then that commit will be skipped and no
# secrets will be detected for said commit. The same logic applies for regexes and paths.
[allowlist]
description = "global allow list"
commits = [ "commit-A", "commit-B", "commit-C"]
paths = [
'''gitleaks\.toml''',
'''(.*?)(jpg|gif|doc)'''
]
# note: (global) regexTarget defaults to check the _Secret_ in the finding.
# if regexTarget is not specified then _Secret_ will be used.
# Acceptable values for regexTarget are "match" and "line"
regexTarget = "match"
regexes = [
'''219-09-9999''',
'''078-05-1120''',
'''(9[0-9]{2}|666)-\d{2}-\d{4}''',
]
# note: stopwords targets the extracted secret, not the entire regex match
# if the extracted secret is found in the stopwords list, the finding will be skipped (i.e. not included in the report)
stopwords = [
'''client''',
'''endpoint''',
]
```
</Accordion>
#
#
# Ignoring Known Secrets
If you're intentionally committing a test secret that `infisical scan` might flag, you can instruct Infisical to overlook that secret with the methods listed below.
### infisical-scan:ignore
To ignore a secret contained in a line of code, add `infisical-scan:ignore` at the end of that line as a comment in the given programming language.
```js example.js
function helloWorld() {
console.log("8dyfuiRyq=vVc3RRr_edRk-fK__JItpZ"); // infisical-scan:ignore
}
```
### .infisicalignore
An alternative method to exclude specific findings involves creating a .infisicalignore file at your repository's root.
You can then add the fingerprints of the findings you wish to exclude. The Infisical scan report provides a unique Fingerprint for each secret found.
Once these fingerprints are added to the .infisicalignore file, Infisical will skip the corresponding secret findings in subsequent scans.
```.ignore .infisicalignore
bea0ff6e05a4de73a5db625d4ae181a015b50855:frontend/components/utilities/attemptLogin.js:stripe-access-token:147
bea0ff6e05a4de73a5db625d4ae181a015b50855:backend/src/json/integrations.json:generic-api-key:5
1961b92340e5d2613acae528b886c842427ce5d0:frontend/components/utilities/attemptLogin.js:stripe-access-token:148
```
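If you generate reports in the default JSON format, one way to collect fingerprints is a small command like the sketch below. It assumes each finding in the report exposes a `Fingerprint` field and that the report was saved as `findings.json`; both are assumptions, not guarantees about the report schema:
```bash
# Append every fingerprint from a JSON scan report to .infisicalignore
jq -r '.[].Fingerprint' findings.json >> .infisicalignore
```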

View File

@ -79,6 +79,13 @@ Start syncing environment variables with [Infisical Cloud](https://app.infisical
>
Explore integrations for Next.js, Express, Django, and more
</Card>
<Card
href="/cli/scanning-overview"
title="Secret scanning"
color="#0285c7"
>
Scan for and prevent leaks of 140+ secret types in your codebase
</Card>
<Card
href="https://calendly.com/team-infisical/infisical-demo"
title="Contact Us"

View File

@ -140,7 +140,7 @@
"cli/overview",
"cli/usage",
{
"group": "Commands",
"group": "Core commands",
"pages": [
"cli/commands/login",
"cli/commands/init",
@ -149,9 +149,18 @@
"cli/commands/export",
"cli/commands/vault",
"cli/commands/user",
"cli/commands/reset"
"cli/commands/reset",
{
"group": "infisical scan",
"pages": [
"cli/commands/scan",
"cli/commands/scan-git-changes",
"cli/commands/scan-install"
]
}
]
},
"cli/scanning-overview",
"cli/project-config",
"cli/faq"
]

View File

@ -21,6 +21,11 @@ docker pull infisical/infisical:latest
The Infisical Docker image requires a .env file to manage environment variables.
Create a new file called .env in your preferred location. Add the required environment variables listed below. View [all configurable environment variables](../configuration/envars).
<ParamField query="ENCRYPTION_KEY" type="string" default="none" required>
Must be a random 16 byte hex string. Can be generated with `openssl rand -hex 16`
</ParamField>
<ParamField query="JWT_SIGNUP_SECRET" type="string" default="none" required>
Must be a random 16 byte hex string. Can be generated with `openssl rand -hex 16`
</ParamField>