Compare commits

...

65 Commits

Author SHA1 Message Date
Daniel Hougaard
b46bbea0c5 fix: removed debug data & re-add compression 2024-10-09 01:48:23 +04:00
Daniel Hougaard
6dad24ffde Update build-binaries.yml 2024-10-09 01:39:53 +04:00
Daniel Hougaard
f8759b9801 Update build-binaries.yml 2024-10-09 01:14:24 +04:00
Daniel Hougaard
049c77c902 Update build-binaries.yml 2024-10-09 00:50:32 +04:00
Daniel Hougaard
c8d40c6905 fix for corrupt data 2024-10-09 00:17:48 +04:00
Daniel Hougaard
ff815b5f42 Update build-binaries.yml 2024-10-08 23:38:20 +04:00
Daniel Hougaard
f6c65584bf Update build-binaries.yml 2024-10-08 22:40:33 +04:00
Daniel Hougaard
246020729e Update build-binaries.yml 2024-10-08 22:18:15 +04:00
Daniel Hougaard
63cc4e347d Update build-binaries.yml 2024-10-08 22:17:59 +04:00
Daniel Hougaard
f2a7f164e1 Trigger build 2024-10-08 21:58:49 +04:00
Daniel Hougaard
dfbdc46971 fix: rpm binary 2024-10-08 21:56:58 +04:00
Daniel Hougaard
56798f09bf Merge pull request #2544 from Infisical/daniel/project-env-position-fixes
fix: project environment positions
2024-10-07 21:22:38 +04:00
Sheen
4c1253dc87 Merge pull request #2534 from Infisical/doc/oidc-auth-circle-ci
doc: circle ci oidc auth
2024-10-07 23:26:31 +08:00
Meet Shah
09793979c7 Merge pull request #2547 from Infisical/meet/eng-1577-lots-of-content-header-issues-in-console
fix: add CSP directive to allow posthog
2024-10-07 18:56:12 +05:30
Meet
fa360b8208 fix: add CSP directive to allow posthog 2024-10-07 18:28:14 +05:30
Daniel Hougaard
f94e100c30 Update project-env.spec.ts 2024-10-07 13:30:32 +04:00
Daniel Hougaard
33b54e78f9 fix: project environment positions 2024-10-07 12:52:59 +04:00
Maidul Islam
f50b0876e4 Merge pull request #2441 from Infisical/maidul-sdsafdf
Remove service token notice
2024-10-06 17:43:02 -07:00
Maidul Islam
c30763c98f Merge pull request #2529 from Infisical/databricks-integration
Databricks integration
2024-10-06 17:36:14 -07:00
Maidul Islam
6fc95c3ff8 Merge pull request #2541 from scott-ray-wilson/kms-keys-temp-slug-col
Fix: Mitigate KMS Key Slug to Name Transition Side-Effects
2024-10-06 17:35:48 -07:00
Scott Wilson
eef1f2b6ef remove trigger functions 2024-10-05 18:05:50 -07:00
Scott Wilson
128b1cf856 fix: create separate triggers for insert/update 2024-10-05 11:01:30 -07:00
Maidul Islam
6b9944001e Merge pull request #2537 from akhilmhdh/fix/identity-list
feat: corrected identity pagination in org level
2024-10-05 10:54:09 -07:00
Scott Wilson
1cc22a6195 improvement: minizime kms key slug -> name transition impact 2024-10-05 10:43:57 -07:00
=
af643468fd feat: corrected identity pagination in org level 2024-10-05 10:50:05 +05:30
Maidul Islam
f8358a0807 Merge pull request #2536 from Infisical/maidul-resolve-identity-count
Resolve identity count issue
2024-10-04 19:00:17 -07:00
Maidul Islam
3eefb98f30 resolve identity count 2024-10-04 18:58:12 -07:00
Vladyslav Matsiiako
8f39f953f8 fix PR review comments for databricks integration 2024-10-04 16:04:00 -07:00
Daniel Hougaard
5e4af7e568 Merge pull request #2530 from Infisical/daniel/terraform-imports-prerequsuite
feat: terraform imports prerequisite / api improvements
2024-10-05 02:18:46 +04:00
Maidul Islam
24bd13403a Merge pull request #2531 from scott-ray-wilson/kms-fix-doc-link
Fix: Correct KMS Doc Link
2024-10-04 13:43:59 -07:00
Maidul Islam
4149cbdf07 Merge pull request #2535 from Infisical/meet/fix-handlebars-import
fix handlebars import
2024-10-04 12:52:27 -07:00
Meet
ced3ab97e8 chore: fix handlebars import 2024-10-05 01:18:13 +05:30
Sheen Capadngan
3f7f0a7b0a doc: circle ci oidc auth 2024-10-05 01:56:33 +08:00
Maidul Islam
20bcf8aab8 allow billing page on eu 2024-10-04 07:53:33 -07:00
Daniel Hougaard
0814245ce6 cleanup 2024-10-04 18:43:29 +04:00
Vladyslav Matsiiako
2d2f27ea46 accounted for not scopes in databricks use case 2024-10-04 00:27:17 -07:00
Vladyslav Matsiiako
4aeb2bf65e fix pr review for databricks integration 2024-10-04 00:09:33 -07:00
Meet Shah
24da76db19 Merge pull request #2532 from Infisical/meet/switch-templating-engine
chore: switch templating engine away from mustache
2024-10-04 09:04:47 +05:30
Meet
3c49936eee chore: lint fix 2024-10-04 08:57:55 +05:30
Meet
b416e79d63 chore: switch templating engine away from mustache 2024-10-04 08:08:36 +05:30
Scott Wilson
92c529587b fix: correct doc link 2024-10-03 18:55:58 -07:00
Daniel Hougaard
3b74c232dc Update pull_request_template.md 2024-10-04 04:04:00 +04:00
Daniel Hougaard
6164dc32d7 chore: api docs 2024-10-04 04:00:43 +04:00
Daniel Hougaard
37e7040eea feat: include path and environment on secret folder 2024-10-04 03:59:28 +04:00
Daniel Hougaard
a7ebb4b241 feat: get secret import by ID 2024-10-04 03:58:39 +04:00
Vladyslav Matsiiako
2fc562ff2d update image for databricks integartion 2024-10-03 16:36:07 -07:00
Vladyslav Matsiiako
b5c83fea4d fixed databricks integration docs 2024-10-03 16:28:19 -07:00
Vladyslav Matsiiako
b586f98926 fixed databricks integration docs 2024-10-03 16:26:38 -07:00
Vladyslav Matsiiako
e6205c086f fix license changes 2024-10-03 16:23:39 -07:00
Vladyslav Matsiiako
2ca34099ed added custom instance URLs to databricks 2024-10-03 16:21:47 -07:00
Scott Wilson
5da6c12941 Merge pull request #2497 from scott-ray-wilson/kms-feature
Feature: KMS MVP
2024-10-03 15:15:08 -07:00
Scott Wilson
e2612b75fc chore: move migration file to latest 2024-10-03 15:04:00 -07:00
Scott Wilson
ca5edb95f1 fix: revert mint api url 2024-10-03 14:46:06 -07:00
Tuan Dang
724e2b3692 Update docs for Infisical KMS 2024-10-03 14:29:26 -07:00
Scott Wilson
2c93561a3b improvement: format docs and change wording 2024-10-03 13:31:53 -07:00
Scott Wilson
0b24cc8631 fix: address missing slug -> name ref 2024-10-03 13:05:10 -07:00
Daniel Hougaard
6c6e932899 Merge pull request #2514 from Infisical/daniel/create-multiple-project-envs
fix: allow creation of multiple project envs
2024-10-04 00:04:10 +04:00
Scott Wilson
c66a711890 improvements: address requested changes 2024-10-03 12:55:53 -07:00
Vladyslav Matsiiako
25b55087cf added databricks integration 2024-10-02 22:49:02 -07:00
Scott Wilson
7cd85cf84a fix: correct order of drop sequence 2024-10-02 16:57:24 -07:00
Scott Wilson
cf5c886b6f chore: revert prem permission 2024-10-02 16:38:02 -07:00
Scott Wilson
e667c7c988 improvement: finish address changes 2024-10-02 16:35:53 -07:00
Maidul Islam
0df80c5b2d Merge pull request #2444 from Infisical/maidul-dhqduqyw
add trip on identityId for identity logins
2024-09-17 12:31:09 -04:00
Maidul Islam
c577f51c19 add trip on identityId for identity logins 2024-09-17 12:15:34 -04:00
Maidul Islam
24d121ab59 Remove service token notice 2024-09-16 21:25:53 -04:00
149 changed files with 3850 additions and 495 deletions

View File

@@ -6,6 +6,7 @@
- [ ] Bug fix
- [ ] New feature
- [ ] Improvement
- [ ] Breaking change
- [ ] Documentation

View File

@@ -7,7 +7,6 @@ on:
description: "Version number"
required: true
type: string
defaults:
run:
working-directory: ./backend
@@ -49,9 +48,9 @@ jobs:
- name: Package into node binary
run: |
if [ "${{ matrix.os }}" != "linux" ]; then
pkg --no-bytecode --public-packages "*" --public --target ${{ matrix.target }}-${{ matrix.arch }} --output ./binary/infisical-core-${{ matrix.os }}-${{ matrix.arch }} .
pkg --no-bytecode --public-packages "*" --public --compress GZip --target ${{ matrix.target }}-${{ matrix.arch }} --output ./binary/infisical-core-${{ matrix.os }}-${{ matrix.arch }} .
else
pkg --no-bytecode --public-packages "*" --public --target ${{ matrix.target }}-${{ matrix.arch }} --output ./binary/infisical-core .
pkg --no-bytecode --public-packages "*" --public --compress GZip --target ${{ matrix.target }}-${{ matrix.arch }} --output ./binary/infisical-core .
fi
# Set up .deb package structure (Debian/Ubuntu only)
@@ -83,6 +82,86 @@ jobs:
dpkg-deb --build infisical-core
mv infisical-core.deb ./binary/infisical-core-${{matrix.arch}}.deb
### RPM
# Set up .rpm package structure
- name: Set up .rpm package structure
if: matrix.os == 'linux'
run: |
mkdir -p infisical-core-rpm/usr/local/bin
cp ./binary/infisical-core infisical-core-rpm/usr/local/bin/
chmod +x infisical-core-rpm/usr/local/bin/infisical-core
# Install RPM build tools
- name: Install RPM build tools
if: matrix.os == 'linux'
run: sudo apt-get update && sudo apt-get install -y rpm
# Create .spec file for RPM
- name: Create .spec file for RPM
if: matrix.os == 'linux'
run: |
cat <<EOF > infisical-core.spec
%global _enable_debug_package 0
%global debug_package %{nil}
%global __os_install_post /usr/lib/rpm/brp-compress %{nil}
Name: infisical-core
Version: ${{ github.event.inputs.version }}
Release: 1%{?dist}
Summary: Infisical Core standalone executable
License: Proprietary
URL: https://app.infisical.com
%description
Infisical Core standalone executable (app.infisical.com)
%install
mkdir -p %{buildroot}/usr/local/bin
cp %{_sourcedir}/infisical-core %{buildroot}/usr/local/bin/
%files
/usr/local/bin/infisical-core
%pre
%post
%preun
%postun
EOF
# Build .rpm file
- name: Build .rpm package
if: matrix.os == 'linux'
run: |
# Create necessary directories
mkdir -p rpmbuild/{BUILD,RPMS,SOURCES,SPECS,SRPMS}
# Copy the binary directly to SOURCES
cp ./binary/infisical-core rpmbuild/SOURCES/
# Run rpmbuild with verbose output
rpmbuild -vv -bb \
--define "_topdir $(pwd)/rpmbuild" \
--define "_sourcedir $(pwd)/rpmbuild/SOURCES" \
--define "_rpmdir $(pwd)/rpmbuild/RPMS" \
--target ${{ matrix.arch == 'x64' && 'x86_64' || 'aarch64' }} \
infisical-core.spec
# Try to find the RPM file
find rpmbuild -name "*.rpm"
# Move the RPM file if found
if [ -n "$(find rpmbuild -name '*.rpm')" ]; then
mv $(find rpmbuild -name '*.rpm') ./binary/infisical-core-${{matrix.arch}}.rpm
else
echo "RPM file not found!"
exit 1
fi
- uses: actions/setup-python@v4
with:
python-version: "3.x" # Specify the Python version you need
@@ -97,6 +176,12 @@ jobs:
working-directory: ./backend
run: cloudsmith push deb --republish --no-wait-for-sync --api-key=${{ secrets.CLOUDSMITH_API_KEY }} infisical/infisical-core/any-distro/any-version ./binary/infisical-core-${{ matrix.arch }}.deb
# Publish .rpm file to Cloudsmith (Red Hat-based systems only)
- name: Publish .rpm to Cloudsmith
if: matrix.os == 'linux'
working-directory: ./backend
run: cloudsmith push rpm --republish --no-wait-for-sync --api-key=${{ secrets.CLOUDSMITH_API_KEY }} infisical/infisical-core/any-distro/any-version ./binary/infisical-core-${{ matrix.arch }}.rpm
# Publish .exe file to Cloudsmith (Windows only)
- name: Publish to Cloudsmith (Windows)
if: matrix.os == 'win'

View File

@@ -73,6 +73,11 @@ We're on a mission to make security tooling more accessible to everyone, not jus
- **[Infisical PKI Issuer for Kubernetes](https://infisical.com/docs/documentation/platform/pki/pki-issuer)**: Deliver TLS certificates to your Kubernetes workloads with automatic renewal.
- **[Enrollment over Secure Transport](https://infisical.com/docs/documentation/platform/pki/est)**: Enroll and manage certificates via EST protocol.
### Key Management (KMS):
- **[Cryptographic Keys](https://infisical.com/docs/documentation/platform/kms)**: Centrally manage keys across projects through a user-friendly interface or via the API.
- **[Encrypt and Decrypt Data](https://infisical.com/docs/documentation/platform/kms#guide-to-encrypting-data)**: Use symmetric keys to encrypt and decrypt data.
### General Platform:
- **Authentication Methods**: Authenticate machine identities with Infisical using a cloud-native or platform-agnostic authentication method ([Kubernetes Auth](https://infisical.com/docs/documentation/platform/identities/kubernetes-auth), [GCP Auth](https://infisical.com/docs/documentation/platform/identities/gcp-auth), [Azure Auth](https://infisical.com/docs/documentation/platform/identities/azure-auth), [AWS Auth](https://infisical.com/docs/documentation/platform/identities/aws-auth), [OIDC Auth](https://infisical.com/docs/documentation/platform/identities/oidc-auth/general), [Universal Auth](https://infisical.com/docs/documentation/platform/identities/universal-auth)).
- **[Access Controls](https://infisical.com/docs/documentation/platform/access-controls/overview)**: Define advanced authorization controls for users and machine identities with [RBAC](https://infisical.com/docs/documentation/platform/access-controls/role-based-access-controls), [additional privileges](https://infisical.com/docs/documentation/platform/access-controls/additional-privileges), [temporary access](https://infisical.com/docs/documentation/platform/access-controls/temporary-access), [access requests](https://infisical.com/docs/documentation/platform/access-controls/access-requests), [approval workflows](https://infisical.com/docs/documentation/platform/pr-workflows), and more.
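The "Encrypt and Decrypt Data" capability above boils down to a symmetric round trip: encrypt with a managed key, keep the IV and auth tag alongside the ciphertext, and decrypt later with the same key. A minimal sketch of that pattern using Node's built-in `crypto` module (for illustration only — this is not Infisical's KMS API, and the key here is generated locally rather than managed by the platform):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Encrypt with AES-256-GCM; the IV and auth tag must be stored with the
// ciphertext, since both are required for decryption.
const encrypt = (key: Buffer, plaintext: string) => {
  const iv = randomBytes(12); // 96-bit IV, the recommended size for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() };
};

// Decrypt and authenticate; final() throws if the tag does not verify.
const decrypt = (key: Buffer, iv: Buffer, ciphertext: Buffer, tag: Buffer) => {
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
};

const key = randomBytes(32); // 256-bit symmetric key
const { iv, ciphertext, tag } = encrypt(key, "super-secret");
console.log(decrypt(key, iv, ciphertext, tag));
```

With a managed KMS, the difference is that the raw key never leaves the service; the client sends plaintext/ciphertext to encrypt and decrypt endpoints keyed by a key ID instead.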

View File

@@ -123,7 +123,7 @@ describe("Project Environment Router", async () => {
id: deletedProjectEnvironment.id,
name: mockProjectEnv.name,
slug: mockProjectEnv.slug,
position: 4,
position: 5,
createdAt: expect.any(String),
updatedAt: expect.any(String)
})

View File

@@ -61,12 +61,11 @@
"jwks-rsa": "^3.1.0",
"knex": "^3.0.1",
"ldapjs": "^3.0.7",
"ldif": "^0.5.1",
"ldif": "0.5.1",
"libsodium-wrappers": "^0.7.13",
"lodash.isequal": "^4.5.0",
"mongodb": "^6.8.1",
"ms": "^2.1.3",
"mustache": "^4.2.0",
"mysql2": "^3.9.8",
"nanoid": "^3.3.4",
"nodemailer": "^6.9.9",
@@ -111,7 +110,6 @@
"@types/jsrp": "^0.2.6",
"@types/libsodium-wrappers": "^0.7.13",
"@types/lodash.isequal": "^4.5.8",
"@types/mustache": "^4.2.5",
"@types/node": "^20.9.5",
"@types/nodemailer": "^6.4.14",
"@types/passport-github": "^1.1.12",
@@ -7079,13 +7077,6 @@
"resolved": "https://registry.npmjs.org/@types/ms/-/ms-0.7.34.tgz",
"integrity": "sha512-nG96G3Wp6acyAgJqGasjODb+acrI7KltPiRxzHPXnP3NgI28bpQDRv53olbqGXbfcgF5aiiHmO3xpwEpS5Ld9g=="
},
"node_modules/@types/mustache": {
"version": "4.2.5",
"resolved": "https://registry.npmjs.org/@types/mustache/-/mustache-4.2.5.tgz",
"integrity": "sha512-PLwiVvTBg59tGFL/8VpcGvqOu3L4OuveNvPi0EYbWchRdEVP++yRUXJPFl+CApKEq13017/4Nf7aQ5lTtHUNsA==",
"dev": true,
"license": "MIT"
},
"node_modules/@types/node": {
"version": "20.9.5",
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.9.5.tgz",
@@ -13729,15 +13720,6 @@
"node": ">= 6"
}
},
"node_modules/mustache": {
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/mustache/-/mustache-4.2.0.tgz",
"integrity": "sha512-71ippSywq5Yb7/tVYyGbkBggbU8H3u5Rz56fH60jGFgr8uHwxs+aSKeqmluIVzM0m0kB7xQjKS6qPfd0b2ZoqQ==",
"license": "MIT",
"bin": {
"mustache": "bin/mustache"
}
},
"node_modules/mylas": {
"version": "2.1.13",
"resolved": "https://registry.npmjs.org/mylas/-/mylas-2.1.13.tgz",

View File

@@ -71,7 +71,6 @@
"@types/jsrp": "^0.2.6",
"@types/libsodium-wrappers": "^0.7.13",
"@types/lodash.isequal": "^4.5.8",
"@types/mustache": "^4.2.5",
"@types/node": "^20.9.5",
"@types/nodemailer": "^6.4.14",
"@types/passport-github": "^1.1.12",
@@ -160,12 +159,11 @@
"jwks-rsa": "^3.1.0",
"knex": "^3.0.1",
"ldapjs": "^3.0.7",
"ldif": "^0.5.1",
"ldif": "0.5.1",
"libsodium-wrappers": "^0.7.13",
"lodash.isequal": "^4.5.0",
"mongodb": "^6.8.1",
"ms": "^2.1.3",
"mustache": "^4.2.0",
"mysql2": "^3.9.8",
"nanoid": "^3.3.4",
"nodemailer": "^6.9.9",

View File

@@ -38,6 +38,7 @@ import { TAuthTokenServiceFactory } from "@app/services/auth-token/auth-token-se
import { TCertificateServiceFactory } from "@app/services/certificate/certificate-service";
import { TCertificateAuthorityServiceFactory } from "@app/services/certificate-authority/certificate-authority-service";
import { TCertificateTemplateServiceFactory } from "@app/services/certificate-template/certificate-template-service";
import { TCmekServiceFactory } from "@app/services/cmek/cmek-service";
import { TExternalMigrationServiceFactory } from "@app/services/external-migration/external-migration-service";
import { TGroupProjectServiceFactory } from "@app/services/group-project/group-project-service";
import { TIdentityServiceFactory } from "@app/services/identity/identity-service";
@@ -182,6 +183,7 @@ declare module "fastify" {
orgAdmin: TOrgAdminServiceFactory;
slack: TSlackServiceFactory;
workflowIntegration: TWorkflowIntegrationServiceFactory;
cmek: TCmekServiceFactory;
migration: TExternalMigrationServiceFactory;
};
// this is exclusive use for middlewares in which we need to inject data

View File

@@ -0,0 +1,46 @@
import { Knex } from "knex";
import { dropConstraintIfExists } from "@app/db/migrations/utils/dropConstraintIfExists";
import { TableName } from "@app/db/schemas";
export async function up(knex: Knex): Promise<void> {
if (await knex.schema.hasTable(TableName.KmsKey)) {
const hasOrgId = await knex.schema.hasColumn(TableName.KmsKey, "orgId");
const hasSlug = await knex.schema.hasColumn(TableName.KmsKey, "slug");
// drop constraint if exists (won't exist if rolled back, see below)
await dropConstraintIfExists(TableName.KmsKey, "kms_keys_orgid_slug_unique", knex);
// projectId for CMEK functionality
await knex.schema.alterTable(TableName.KmsKey, (table) => {
table.string("projectId").nullable().references("id").inTable(TableName.Project).onDelete("CASCADE");
if (hasOrgId) {
table.unique(["orgId", "projectId", "slug"]);
}
if (hasSlug) {
table.renameColumn("slug", "name");
}
});
}
}
export async function down(knex: Knex): Promise<void> {
if (await knex.schema.hasTable(TableName.KmsKey)) {
const hasOrgId = await knex.schema.hasColumn(TableName.KmsKey, "orgId");
const hasName = await knex.schema.hasColumn(TableName.KmsKey, "name");
// remove projectId for CMEK functionality
await knex.schema.alterTable(TableName.KmsKey, (table) => {
if (hasName) {
table.renameColumn("name", "slug");
}
if (hasOrgId) {
table.dropUnique(["orgId", "projectId", "slug"]);
}
table.dropColumn("projectId");
});
}
}

View File

@@ -0,0 +1,30 @@
import { Knex } from "knex";
import { TableName } from "@app/db/schemas";
export async function up(knex: Knex): Promise<void> {
if (await knex.schema.hasTable(TableName.KmsKey)) {
const hasSlug = await knex.schema.hasColumn(TableName.KmsKey, "slug");
if (!hasSlug) {
// add slug back temporarily and set value equal to name
await knex.schema
.alterTable(TableName.KmsKey, (table) => {
table.string("slug", 32);
})
.then(() => knex(TableName.KmsKey).update("slug", knex.ref("name")));
}
}
}
export async function down(knex: Knex): Promise<void> {
if (await knex.schema.hasTable(TableName.KmsKey)) {
const hasSlug = await knex.schema.hasColumn(TableName.KmsKey, "slug");
if (hasSlug) {
await knex.schema.alterTable(TableName.KmsKey, (table) => {
table.dropColumn("slug");
});
}
}
}

View File

@@ -0,0 +1,6 @@
import { Knex } from "knex";
import { TableName } from "@app/db/schemas";
export const dropConstraintIfExists = (tableName: TableName, constraintName: string, knex: Knex) =>
knex.raw(`ALTER TABLE ${tableName} DROP CONSTRAINT IF EXISTS ${constraintName};`);

View File

@@ -54,7 +54,7 @@ export const getSecretManagerDataKey = async (knex: Knex, projectId: string) =>
} else {
const [kmsDoc] = await knex(TableName.KmsKey)
.insert({
slug: slugify(alphaNumericNanoId(8).toLowerCase()),
name: slugify(alphaNumericNanoId(8).toLowerCase()),
orgId: project.orgId,
isReserved: false
})

View File

@@ -13,9 +13,11 @@ export const KmsKeysSchema = z.object({
isDisabled: z.boolean().default(false).nullable().optional(),
isReserved: z.boolean().default(true).nullable().optional(),
orgId: z.string().uuid(),
slug: z.string(),
name: z.string(),
createdAt: z.date(),
updatedAt: z.date()
updatedAt: z.date(),
projectId: z.string().nullable().optional(),
slug: z.string().nullable().optional()
});
export type TKmsKeys = z.infer<typeof KmsKeysSchema>;

View File

@@ -26,7 +26,7 @@ const sanitizedExternalSchemaForGetAll = KmsKeysSchema.pick({
isDisabled: true,
createdAt: true,
updatedAt: true,
slug: true
name: true
})
.extend({
externalKms: ExternalKmsSchema.pick({
@@ -57,7 +57,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
},
schema: {
body: z.object({
slug: z.string().min(1).trim().toLowerCase(),
name: z.string().min(1).trim().toLowerCase(),
description: z.string().trim().optional(),
provider: ExternalKmsInputSchema
}),
@@ -74,7 +74,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
slug: req.body.slug,
name: req.body.name,
provider: req.body.provider,
description: req.body.description
});
@@ -87,7 +87,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
metadata: {
kmsId: externalKms.id,
provider: req.body.provider.type,
slug: req.body.slug,
name: req.body.name,
description: req.body.description
}
}
@@ -108,7 +108,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
id: z.string().trim().min(1)
}),
body: z.object({
slug: z.string().min(1).trim().toLowerCase().optional(),
name: z.string().min(1).trim().toLowerCase().optional(),
description: z.string().trim().optional(),
provider: ExternalKmsInputUpdateSchema
}),
@@ -125,7 +125,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
slug: req.body.slug,
name: req.body.name,
provider: req.body.provider,
description: req.body.description,
id: req.params.id
@@ -139,7 +139,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
metadata: {
kmsId: externalKms.id,
provider: req.body.provider.type,
slug: req.body.slug,
name: req.body.name,
description: req.body.description
}
}
@@ -182,7 +182,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
type: EventType.DELETE_KMS,
metadata: {
kmsId: externalKms.id,
slug: externalKms.slug
name: externalKms.name
}
}
});
@@ -224,7 +224,7 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
type: EventType.GET_KMS,
metadata: {
kmsId: externalKms.id,
slug: externalKms.slug
name: externalKms.name
}
}
});
@@ -260,13 +260,13 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
server.route({
method: "GET",
url: "/slug/:slug",
url: "/name/:name",
config: {
rateLimit: readLimit
},
schema: {
params: z.object({
slug: z.string().trim().min(1)
name: z.string().trim().min(1)
}),
response: {
200: z.object({
@@ -276,12 +276,12 @@ export const registerExternalKmsRouter = async (server: FastifyZodProvider) => {
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const externalKms = await server.services.externalKms.findBySlug({
const externalKms = await server.services.externalKms.findByName({
actor: req.permission.type,
actorId: req.permission.id,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
slug: req.params.slug
name: req.params.name
});
return { externalKms };
}

View File

@@ -203,7 +203,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
200: z.object({
secretManagerKmsKey: z.object({
id: z.string(),
slug: z.string(),
name: z.string(),
isExternal: z.boolean()
})
})
@@ -243,7 +243,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
200: z.object({
secretManagerKmsKey: z.object({
id: z.string(),
slug: z.string(),
name: z.string(),
isExternal: z.boolean()
})
})
@@ -268,7 +268,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
metadata: {
secretManagerKmsKey: {
id: secretManagerKmsKey.id,
slug: secretManagerKmsKey.slug
name: secretManagerKmsKey.name
}
}
}
@@ -336,7 +336,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
200: z.object({
secretManagerKmsKey: z.object({
id: z.string(),
slug: z.string(),
name: z.string(),
isExternal: z.boolean()
})
})

View File

@@ -1,3 +1,4 @@
import { SymmetricEncryption } from "@app/lib/crypto/cipher";
import { TProjectPermission } from "@app/lib/types";
import { ActorType } from "@app/services/auth/auth-type";
import { CaStatus } from "@app/services/certificate-authority/certificate-authority-types";
@@ -122,6 +123,7 @@ export enum EventType {
UPDATE_WEBHOOK_STATUS = "update-webhook-status",
DELETE_WEBHOOK = "delete-webhook",
GET_SECRET_IMPORTS = "get-secret-imports",
GET_SECRET_IMPORT = "get-secret-import",
CREATE_SECRET_IMPORT = "create-secret-import",
UPDATE_SECRET_IMPORT = "update-secret-import",
DELETE_SECRET_IMPORT = "delete-secret-import",
@@ -182,7 +184,13 @@ export enum EventType {
DELETE_SLACK_INTEGRATION = "delete-slack-integration",
GET_PROJECT_SLACK_CONFIG = "get-project-slack-config",
UPDATE_PROJECT_SLACK_CONFIG = "update-project-slack-config",
INTEGRATION_SYNCED = "integration-synced"
INTEGRATION_SYNCED = "integration-synced",
CREATE_CMEK = "create-cmek",
UPDATE_CMEK = "update-cmek",
DELETE_CMEK = "delete-cmek",
GET_CMEKS = "get-cmeks",
CMEK_ENCRYPT = "cmek-encrypt",
CMEK_DECRYPT = "cmek-decrypt"
}
interface UserActorMetadata {
@@ -1004,6 +1012,14 @@ interface GetSecretImportsEvent {
};
}
interface GetSecretImportEvent {
type: EventType.GET_SECRET_IMPORT;
metadata: {
secretImportId: string;
folderId: string;
};
}
interface CreateSecretImportEvent {
type: EventType.CREATE_SECRET_IMPORT;
metadata: {
@@ -1350,7 +1366,7 @@ interface CreateKmsEvent {
metadata: {
kmsId: string;
provider: string;
slug: string;
name: string;
description?: string;
};
}
@@ -1359,7 +1375,7 @@ interface DeleteKmsEvent {
type: EventType.DELETE_KMS;
metadata: {
kmsId: string;
slug: string;
name: string;
};
}
@@ -1368,7 +1384,7 @@ interface UpdateKmsEvent {
metadata: {
kmsId: string;
provider: string;
slug?: string;
name?: string;
description?: string;
};
}
@@ -1377,7 +1393,7 @@ interface GetKmsEvent {
type: EventType.GET_KMS;
metadata: {
kmsId: string;
slug: string;
name: string;
};
}
@@ -1386,7 +1402,7 @@ interface UpdateProjectKmsEvent {
metadata: {
secretManagerKmsKey: {
id: string;
slug: string;
name: string;
};
};
}
@@ -1541,6 +1557,53 @@ interface IntegrationSyncedEvent {
};
}
interface CreateCmekEvent {
type: EventType.CREATE_CMEK;
metadata: {
keyId: string;
name: string;
description?: string;
encryptionAlgorithm: SymmetricEncryption;
};
}
interface DeleteCmekEvent {
type: EventType.DELETE_CMEK;
metadata: {
keyId: string;
};
}
interface UpdateCmekEvent {
type: EventType.UPDATE_CMEK;
metadata: {
keyId: string;
name?: string;
description?: string;
};
}
interface GetCmeksEvent {
type: EventType.GET_CMEKS;
metadata: {
keyIds: string[];
};
}
interface CmekEncryptEvent {
type: EventType.CMEK_ENCRYPT;
metadata: {
keyId: string;
};
}
interface CmekDecryptEvent {
type: EventType.CMEK_DECRYPT;
metadata: {
keyId: string;
};
}
export type Event =
| GetSecretsEvent
| GetSecretEvent
@@ -1620,6 +1683,7 @@ export type Event =
| UpdateWebhookStatusEvent
| DeleteWebhookEvent
| GetSecretImportsEvent
| GetSecretImportEvent
| CreateSecretImportEvent
| UpdateSecretImportEvent
| DeleteSecretImportEvent
@@ -1680,4 +1744,10 @@ export type Event =
| GetSlackIntegration
| UpdateProjectSlackConfig
| GetProjectSlackConfig
| IntegrationSyncedEvent;
| IntegrationSyncedEvent
| CreateCmekEvent
| UpdateCmekEvent
| DeleteCmekEvent
| GetCmeksEvent
| CmekEncryptEvent
| CmekDecryptEvent;

View File

@@ -1,6 +1,6 @@
import handlebars from "handlebars";
import ldapjs from "ldapjs";
import ldif from "ldif";
import mustache from "mustache";
import { customAlphabet } from "nanoid";
import { z } from "zod";
@@ -40,7 +40,8 @@ const generateLDIF = ({
EncodedPassword: encodePassword(password)
};
const renderedLdif = mustache.render(ldifTemplate, data);
const renderTemplate = handlebars.compile(ldifTemplate);
const renderedLdif = renderTemplate(data);
return renderedLdif;
};

View File

@@ -30,7 +30,7 @@ export const externalKmsDALFactory = (db: TDbClient) => {
isDisabled: el.isDisabled,
isReserved: el.isReserved,
orgId: el.orgId,
slug: el.slug,
name: el.name,
createdAt: el.createdAt,
updatedAt: el.updatedAt,
externalKms: {

View File

@@ -43,7 +43,7 @@ export const externalKmsServiceFactory = ({
provider,
description,
actor,
slug,
name,
actorId,
actorOrgId,
actorAuthMethod
@@ -64,7 +64,7 @@ export const externalKmsServiceFactory = ({
});
}
const kmsSlug = slug ? slugify(slug) : slugify(alphaNumericNanoId(8).toLowerCase());
const kmsName = name ? slugify(name) : slugify(alphaNumericNanoId(8).toLowerCase());
let sanitizedProviderInput = "";
switch (provider.type) {
@@ -96,7 +96,7 @@ export const externalKmsServiceFactory = ({
{
isReserved: false,
description,
slug: kmsSlug,
name: kmsName,
orgId: actorOrgId
},
tx
@@ -120,7 +120,7 @@ export const externalKmsServiceFactory = ({
description,
actor,
id: kmsId,
slug,
name,
actorId,
actorOrgId,
actorAuthMethod
@@ -142,7 +142,7 @@ export const externalKmsServiceFactory = ({
});
}
const kmsSlug = slug ? slugify(slug) : undefined;
const kmsName = name ? slugify(name) : undefined;
const externalKmsDoc = await externalKmsDAL.findOne({ kmsKeyId: kmsDoc.id });
if (!externalKmsDoc) throw new NotFoundError({ message: "External kms not found" });
@@ -188,7 +188,7 @@ export const externalKmsServiceFactory = ({
kmsDoc.id,
{
description,
slug: kmsSlug
name: kmsName
},
tx
);
@@ -280,14 +280,14 @@ export const externalKmsServiceFactory = ({
}
};
const findBySlug = async ({
const findByName = async ({
actor,
actorId,
actorOrgId,
actorAuthMethod,
slug: kmsSlug
name: kmsName
}: TGetExternalKmsBySlugDTO) => {
const kmsDoc = await kmsDAL.findOne({ slug: kmsSlug, orgId: actorOrgId });
const kmsDoc = await kmsDAL.findOne({ name: kmsName, orgId: actorOrgId });
const { permission } = await permissionService.getOrgPermission(
actor,
actorId,
@@ -327,6 +327,6 @@ export const externalKmsServiceFactory = ({
deleteById,
list,
findById,
findBySlug
findByName
};
};

View File

@@ -3,14 +3,14 @@ import { TOrgPermission } from "@app/lib/types";
import { TExternalKmsInputSchema, TExternalKmsInputUpdateSchema } from "./providers/model";
export type TCreateExternalKmsDTO = {
slug?: string;
name?: string;
description?: string;
provider: TExternalKmsInputSchema;
} & Omit<TOrgPermission, "orgId">;
export type TUpdateExternalKmsDTO = {
id: string;
slug?: string;
name?: string;
description?: string;
provider?: TExternalKmsInputUpdateSchema;
} & Omit<TOrgPermission, "orgId">;
@@ -26,5 +26,5 @@ export type TGetExternalKmsByIdDTO = {
} & Omit<TOrgPermission, "orgId">;
export type TGetExternalKmsBySlugDTO = {
slug: string;
name: string;
} & Omit<TOrgPermission, "orgId">;

View File

@@ -14,6 +14,15 @@ export enum ProjectPermissionActions {
Delete = "delete"
}
export enum ProjectPermissionCmekActions {
Read = "read",
Create = "create",
Edit = "edit",
Delete = "delete",
Encrypt = "encrypt",
Decrypt = "decrypt"
}
export enum ProjectPermissionSub {
Role = "role",
Member = "member",
@@ -38,7 +47,8 @@ export enum ProjectPermissionSub {
CertificateTemplates = "certificate-templates",
PkiAlerts = "pki-alerts",
PkiCollections = "pki-collections",
Kms = "kms"
Kms = "kms",
Cmek = "cmek"
}
export type SecretSubjectFields = {
@@ -95,6 +105,7 @@ export type ProjectPermissionSet =
| [ProjectPermissionActions, ProjectPermissionSub.CertificateTemplates]
| [ProjectPermissionActions, ProjectPermissionSub.PkiAlerts]
| [ProjectPermissionActions, ProjectPermissionSub.PkiCollections]
| [ProjectPermissionCmekActions, ProjectPermissionSub.Cmek]
| [ProjectPermissionActions.Delete, ProjectPermissionSub.Project]
| [ProjectPermissionActions.Edit, ProjectPermissionSub.Project]
| [ProjectPermissionActions.Read, ProjectPermissionSub.SecretRollback]
@@ -282,6 +293,12 @@ export const ProjectPermissionSchema = z.discriminatedUnion("subject", [
action: CASL_ACTION_SCHEMA_ENUM([ProjectPermissionActions.Read]).describe(
"Describe what action an entity can take."
)
}),
z.object({
subject: z.literal(ProjectPermissionSub.Cmek).describe("The entity this permission pertains to."),
action: CASL_ACTION_SCHEMA_NATIVE_ENUM(ProjectPermissionCmekActions).describe(
"Describe what action an entity can take."
)
})
]);
@@ -325,6 +342,17 @@ const buildAdminPermissionRules = () => {
can([ProjectPermissionActions.Edit, ProjectPermissionActions.Delete], ProjectPermissionSub.Project);
can([ProjectPermissionActions.Read, ProjectPermissionActions.Create], ProjectPermissionSub.SecretRollback);
can([ProjectPermissionActions.Edit], ProjectPermissionSub.Kms);
can(
[
ProjectPermissionCmekActions.Create,
ProjectPermissionCmekActions.Edit,
ProjectPermissionCmekActions.Delete,
ProjectPermissionCmekActions.Read,
ProjectPermissionCmekActions.Encrypt,
ProjectPermissionCmekActions.Decrypt
],
ProjectPermissionSub.Cmek
);
return rules;
};
@@ -444,6 +472,18 @@ const buildMemberPermissionRules = () => {
can([ProjectPermissionActions.Read], ProjectPermissionSub.PkiAlerts);
can([ProjectPermissionActions.Read], ProjectPermissionSub.PkiCollections);
can(
[
ProjectPermissionCmekActions.Create,
ProjectPermissionCmekActions.Edit,
ProjectPermissionCmekActions.Delete,
ProjectPermissionCmekActions.Read,
ProjectPermissionCmekActions.Encrypt,
ProjectPermissionCmekActions.Decrypt
],
ProjectPermissionSub.Cmek
);
return rules;
};
@@ -470,6 +510,7 @@ const buildViewerPermissionRules = () => {
can(ProjectPermissionActions.Read, ProjectPermissionSub.IpAllowList);
can(ProjectPermissionActions.Read, ProjectPermissionSub.CertificateAuthorities);
can(ProjectPermissionActions.Read, ProjectPermissionSub.Certificates);
can(ProjectPermissionCmekActions.Read, ProjectPermissionSub.Cmek);
return rules;
};

View File

@@ -533,7 +533,8 @@ export const ENVIRONMENTS = {
CREATE: {
workspaceId: "The ID of the project to create the environment in.",
name: "The name of the environment to create.",
slug: "The slug of the environment to create."
slug: "The slug of the environment to create.",
position: "The position of the environment. The lowest number will be displayed as the first environment."
},
UPDATE: {
workspaceId: "The ID of the project to update the environment in.",
@@ -675,6 +676,9 @@ export const SECRET_IMPORTS = {
environment: "The slug of the environment to list secret imports from.",
path: "The path to list secret imports from."
},
GET: {
secretImportId: "The ID of the secret import to fetch."
},
CREATE: {
environment: "The slug of the environment to import into.",
path: "The path to import into.",
@@ -1347,3 +1351,37 @@ export const PROJECT_ROLE = {
projectSlug: "The slug of the project to list the roles of."
}
};
export const KMS = {
CREATE_KEY: {
projectId: "The ID of the project to create the key in.",
name: "The name of the key to be created. Must be slug-friendly.",
description: "An optional description of the key.",
encryptionAlgorithm: "The algorithm to use when performing cryptographic operations with the key."
},
UPDATE_KEY: {
keyId: "The ID of the key to be updated.",
name: "The updated name of this key. Must be slug-friendly.",
description: "The updated description of this key.",
isDisabled: "The flag to enable or disable this key."
},
DELETE_KEY: {
keyId: "The ID of the key to be deleted."
},
LIST_KEYS: {
projectId: "The ID of the project to list keys from.",
offset: "The offset to start from. If you enter 10, the first 10 keys will be skipped.",
limit: "The number of keys to return.",
orderBy: "The column to order keys by.",
orderDirection: "The direction to order keys in.",
search: "The text string to filter key names by."
},
ENCRYPT: {
keyId: "The ID of the key to encrypt the data with.",
plaintext: "The plaintext to be encrypted (base64 encoded)."
},
DECRYPT: {
keyId: "The ID of the key to decrypt the data with.",
ciphertext: "The ciphertext to be decrypted (base64 encoded)."
}
};

View File

@@ -0,0 +1,28 @@
// Credit: https://github.com/miguelmota/is-base64
export const isBase64 = (
v: string,
opts = { allowEmpty: false, mimeRequired: false, allowMime: true, paddingRequired: false }
) => {
if (opts.allowEmpty === false && v === "") {
return false;
}
let regex = "(?:[A-Za-z0-9+\\/]{4})*(?:[A-Za-z0-9+\\/]{2}==|[A-Za-z0-9+/]{3}=)?";
const mimeRegex = "(data:\\w+\\/[a-zA-Z\\+\\-\\.]+;base64,)";
if (opts.mimeRequired === true) {
regex = mimeRegex + regex;
} else if (opts.allowMime === true) {
regex = `${mimeRegex}?${regex}`;
}
if (opts.paddingRequired === false) {
regex = "(?:[A-Za-z0-9+\\/]{4})*(?:[A-Za-z0-9+\\/]{2}(==)?|[A-Za-z0-9+\\/]{3}=?)?";
}
return new RegExp(`^${regex}$`, "gi").test(v);
};
export const getBase64SizeInBytes = (base64String: string) => {
return Buffer.from(base64String, "base64").length;
};
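The size guard above measures decoded bytes, not the length of the encoded string. A minimal standalone sketch (the helper is re-declared locally so the snippet runs on its own):

```typescript
import { Buffer } from "node:buffer";

// Re-declared locally for a self-contained run; mirrors getBase64SizeInBytes above.
const getBase64SizeInBytes = (base64String: string) => Buffer.from(base64String, "base64").length;

const payload = Buffer.from("hello world").toString("base64"); // "aGVsbG8gd29ybGQ="
console.log(payload.length); // 16 — encoded string length
console.log(getBase64SizeInBytes(payload)); // 11 — decoded size, what the 4096-byte limit checks
```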

View File

@@ -96,6 +96,7 @@ import { certificateAuthorityServiceFactory } from "@app/services/certificate-au
import { certificateTemplateDALFactory } from "@app/services/certificate-template/certificate-template-dal";
import { certificateTemplateEstConfigDALFactory } from "@app/services/certificate-template/certificate-template-est-config-dal";
import { certificateTemplateServiceFactory } from "@app/services/certificate-template/certificate-template-service";
import { cmekServiceFactory } from "@app/services/cmek/cmek-service";
import { externalMigrationServiceFactory } from "@app/services/external-migration/external-migration-service";
import { groupProjectDALFactory } from "@app/services/group-project/group-project-dal";
import { groupProjectMembershipRoleDALFactory } from "@app/services/group-project/group-project-membership-role-dal";
@@ -1194,6 +1195,12 @@ export const registerRoutes = async (
workflowIntegrationDAL
});
const cmekService = cmekServiceFactory({
kmsDAL,
kmsService,
permissionService
});
const migrationService = externalMigrationServiceFactory({
projectService,
orgService,
@@ -1283,6 +1290,7 @@ export const registerRoutes = async (
secretSharing: secretSharingService,
userEngagement: userEngagementService,
externalKms: externalKmsService,
cmek: cmekService,
orgAdmin: orgAdminService,
slack: slackService,
workflowIntegration: workflowIntegrationService,

View File

@@ -0,0 +1,331 @@
import slugify from "@sindresorhus/slugify";
import { z } from "zod";
import { InternalKmsSchema, KmsKeysSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { KMS } from "@app/lib/api-docs";
import { getBase64SizeInBytes, isBase64 } from "@app/lib/base64";
import { SymmetricEncryption } from "@app/lib/crypto/cipher";
import { OrderByDirection } from "@app/lib/types";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
import { CmekOrderBy } from "@app/services/cmek/cmek-types";
const keyNameSchema = z
.string()
.trim()
.min(1)
.max(32)
.toLowerCase()
.refine((v) => slugify(v) === v, {
message: "Name must be slug friendly"
});
const keyDescriptionSchema = z.string().trim().max(500).optional();
const base64Schema = z.string().superRefine((val, ctx) => {
if (!isBase64(val)) {
ctx.addIssue({
code: z.ZodIssueCode.custom,
message: "value must be base64 encoded"
});
}
if (getBase64SizeInBytes(val) > 4096) {
ctx.addIssue({
code: z.ZodIssueCode.custom,
message: "data cannot exceed 4096 bytes"
});
}
});
export const registerCmekRouter = async (server: FastifyZodProvider) => {
// create encryption key
server.route({
method: "POST",
url: "/keys",
config: {
rateLimit: writeLimit
},
schema: {
description: "Create KMS key",
body: z.object({
projectId: z.string().describe(KMS.CREATE_KEY.projectId),
name: keyNameSchema.describe(KMS.CREATE_KEY.name),
description: keyDescriptionSchema.describe(KMS.CREATE_KEY.description),
encryptionAlgorithm: z
.nativeEnum(SymmetricEncryption)
.optional()
.default(SymmetricEncryption.AES_GCM_256)
.describe(KMS.CREATE_KEY.encryptionAlgorithm) // eventually will support others
}),
response: {
200: z.object({
key: KmsKeysSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const {
body: { projectId, name, description, encryptionAlgorithm },
permission
} = req;
const cmek = await server.services.cmek.createCmek(
{ orgId: permission.orgId, projectId, name, description, encryptionAlgorithm },
permission
);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId,
event: {
type: EventType.CREATE_CMEK,
metadata: {
keyId: cmek.id,
name,
description,
encryptionAlgorithm
}
}
});
return { key: cmek };
}
});
// update KMS key
server.route({
method: "PATCH",
url: "/keys/:keyId",
config: {
rateLimit: writeLimit
},
schema: {
description: "Update KMS key",
params: z.object({
keyId: z.string().uuid().describe(KMS.UPDATE_KEY.keyId)
}),
body: z.object({
name: keyNameSchema.optional().describe(KMS.UPDATE_KEY.name),
isDisabled: z.boolean().optional().describe(KMS.UPDATE_KEY.isDisabled),
description: keyDescriptionSchema.describe(KMS.UPDATE_KEY.description)
}),
response: {
200: z.object({
key: KmsKeysSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const {
params: { keyId },
body,
permission
} = req;
const cmek = await server.services.cmek.updateCmekById({ keyId, ...body }, permission);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: permission.orgId,
event: {
type: EventType.UPDATE_CMEK,
metadata: {
keyId,
...body
}
}
});
return { key: cmek };
}
});
// delete KMS key
server.route({
method: "DELETE",
url: "/keys/:keyId",
config: {
rateLimit: writeLimit
},
schema: {
description: "Delete KMS key",
params: z.object({
keyId: z.string().uuid().describe(KMS.DELETE_KEY.keyId)
}),
response: {
200: z.object({
key: KmsKeysSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const {
params: { keyId },
permission
} = req;
const cmek = await server.services.cmek.deleteCmekById(keyId, permission);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: permission.orgId,
event: {
type: EventType.DELETE_CMEK,
metadata: {
keyId
}
}
});
return { key: cmek };
}
});
// list KMS keys
server.route({
method: "GET",
url: "/keys",
config: {
rateLimit: readLimit
},
schema: {
description: "List KMS keys",
querystring: z.object({
projectId: z.string().describe(KMS.LIST_KEYS.projectId),
offset: z.coerce.number().min(0).optional().default(0).describe(KMS.LIST_KEYS.offset),
limit: z.coerce.number().min(1).max(100).optional().default(100).describe(KMS.LIST_KEYS.limit),
orderBy: z.nativeEnum(CmekOrderBy).optional().default(CmekOrderBy.Name).describe(KMS.LIST_KEYS.orderBy),
orderDirection: z
.nativeEnum(OrderByDirection)
.optional()
.default(OrderByDirection.ASC)
.describe(KMS.LIST_KEYS.orderDirection),
search: z.string().trim().optional().describe(KMS.LIST_KEYS.search)
}),
response: {
200: z.object({
keys: KmsKeysSchema.merge(InternalKmsSchema.pick({ version: true, encryptionAlgorithm: true })).array(),
totalCount: z.number()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const {
query: { projectId, ...dto },
permission
} = req;
const { cmeks, totalCount } = await server.services.cmek.listCmeksByProjectId({ projectId, ...dto }, permission);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId,
event: {
type: EventType.GET_CMEKS,
metadata: {
keyIds: cmeks.map((key) => key.id)
}
}
});
return { keys: cmeks, totalCount };
}
});
// encrypt data
server.route({
method: "POST",
url: "/keys/:keyId/encrypt",
config: {
rateLimit: writeLimit
},
schema: {
description: "Encrypt data with KMS key",
params: z.object({
keyId: z.string().uuid().describe(KMS.ENCRYPT.keyId)
}),
body: z.object({
plaintext: base64Schema.describe(KMS.ENCRYPT.plaintext)
}),
response: {
200: z.object({
ciphertext: z.string()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const {
params: { keyId },
body: { plaintext },
permission
} = req;
const ciphertext = await server.services.cmek.cmekEncrypt({ keyId, plaintext }, permission);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: permission.orgId,
event: {
type: EventType.CMEK_ENCRYPT,
metadata: {
keyId
}
}
});
return { ciphertext };
}
});
server.route({
method: "POST",
url: "/keys/:keyId/decrypt",
config: {
rateLimit: writeLimit
},
schema: {
description: "Decrypt data with KMS key",
params: z.object({
keyId: z.string().uuid().describe(KMS.DECRYPT.keyId)
}),
body: z.object({
ciphertext: base64Schema.describe(KMS.DECRYPT.ciphertext)
}),
response: {
200: z.object({
plaintext: z.string()
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const {
params: { keyId },
body: { ciphertext },
permission
} = req;
const plaintext = await server.services.cmek.cmekDecrypt({ keyId, ciphertext }, permission);
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
orgId: permission.orgId,
event: {
type: EventType.CMEK_DECRYPT,
metadata: {
keyId
}
}
});
return { plaintext };
}
});
};
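Because both routes accept and return base64, a caller has to encode before `/encrypt` and decode after `/decrypt`. A hypothetical client-side sketch (the `/api/v1/kms` path assumes the prefix under which the router is registered below; the key ID is a placeholder):

```typescript
import { Buffer } from "node:buffer";

// Hypothetical helpers a client would use around the encrypt/decrypt endpoints.
const toBase64 = (utf8: string) => Buffer.from(utf8, "utf8").toString("base64");
const fromBase64 = (b64: string) => Buffer.from(b64, "base64").toString("utf8");

// POST /api/v1/kms/keys/<keyId>/encrypt expects { plaintext: <base64> }
const encryptBody = JSON.stringify({ plaintext: toBase64("db-password") });
console.log(encryptBody);

// POST /api/v1/kms/keys/<keyId>/decrypt returns { plaintext: <base64> },
// which the caller decodes back to UTF-8:
console.log(fromBase64(toBase64("db-password"))); // "db-password"
```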

View File

@@ -22,7 +22,7 @@ export const registerIdentityAwsAuthRouter = async (server: FastifyZodProvider)
schema: {
description: "Login with AWS Auth",
body: z.object({
identityId: z.string().describe(AWS_AUTH.LOGIN.identityId),
identityId: z.string().trim().describe(AWS_AUTH.LOGIN.identityId),
iamHttpRequestMethod: z.string().default("POST").describe(AWS_AUTH.LOGIN.iamHttpRequestMethod),
iamRequestBody: z.string().describe(AWS_AUTH.LOGIN.iamRequestBody),
iamRequestHeaders: z.string().describe(AWS_AUTH.LOGIN.iamRequestHeaders)

View File

@@ -21,7 +21,7 @@ export const registerIdentityAzureAuthRouter = async (server: FastifyZodProvider
schema: {
description: "Login with Azure Auth",
body: z.object({
identityId: z.string().describe(AZURE_AUTH.LOGIN.identityId),
identityId: z.string().trim().describe(AZURE_AUTH.LOGIN.identityId),
jwt: z.string()
}),
response: {

View File

@@ -19,7 +19,7 @@ export const registerIdentityGcpAuthRouter = async (server: FastifyZodProvider)
schema: {
description: "Login with GCP Auth",
body: z.object({
identityId: z.string().describe(GCP_AUTH.LOGIN.identityId),
identityId: z.string().trim().describe(GCP_AUTH.LOGIN.identityId),
jwt: z.string()
}),
response: {

View File

@@ -1,3 +1,4 @@
import { registerCmekRouter } from "@app/server/routes/v1/cmek-router";
import { registerDashboardRouter } from "@app/server/routes/v1/dashboard-router";
import { registerAdminRouter } from "./admin-router";
@@ -103,6 +104,6 @@ export const registerV1Routes = async (server: FastifyZodProvider) => {
await server.register(registerIdentityRouter, { prefix: "/identities" });
await server.register(registerSecretSharingRouter, { prefix: "/secret-sharing" });
await server.register(registerUserEngagementRouter, { prefix: "/user-engagement" });
await server.register(registerDashboardRouter, { prefix: "/dashboard" });
await server.register(registerCmekRouter, { prefix: "/kms" });
};

View File

@@ -4,7 +4,7 @@ import { z } from "zod";
import { ProjectEnvironmentsSchema } from "@app/db/schemas";
import { EventType } from "@app/ee/services/audit-log/audit-log-types";
import { ENVIRONMENTS } from "@app/lib/api-docs";
import { writeLimit } from "@app/server/config/rateLimiter";
import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
import { AuthMode } from "@app/services/auth/auth-type";
@@ -23,6 +23,7 @@ export const registerProjectEnvRouter = async (server: FastifyZodProvider) => {
}
],
params: z.object({
// NOTE(daniel): workspaceId isn't used, but we need to keep it for backwards compatibility. The endpoint defined below uses no project ID and takes a pure environment ID.
workspaceId: z.string().trim().describe(ENVIRONMENTS.GET.workspaceId),
envId: z.string().trim().describe(ENVIRONMENTS.GET.id)
}),
@@ -39,7 +40,53 @@ export const registerProjectEnvRouter = async (server: FastifyZodProvider) => {
actor: req.permission.type,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
projectId: req.params.workspaceId,
id: req.params.envId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: environment.projectId,
event: {
type: EventType.GET_ENVIRONMENT,
metadata: {
id: environment.id
}
}
});
return { environment };
}
});
server.route({
method: "GET",
url: "/environments/:envId",
config: {
rateLimit: readLimit
},
schema: {
description: "Get Environment by ID",
security: [
{
bearerAuth: []
}
],
params: z.object({
envId: z.string().trim().describe(ENVIRONMENTS.GET.id)
}),
response: {
200: z.object({
environment: ProjectEnvironmentsSchema
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const environment = await server.services.projectEnv.getEnvironmentById({
actorId: req.permission.id,
actor: req.permission.type,
actorOrgId: req.permission.orgId,
actorAuthMethod: req.permission.authMethod,
id: req.params.envId
});
@@ -76,6 +123,7 @@ export const registerProjectEnvRouter = async (server: FastifyZodProvider) => {
}),
body: z.object({
name: z.string().trim().describe(ENVIRONMENTS.CREATE.name),
position: z.number().min(1).optional().describe(ENVIRONMENTS.CREATE.position),
slug: z
.string()
.trim()

View File

@@ -365,7 +365,15 @@ export const registerSecretFolderRouter = async (server: FastifyZodProvider) =>
}),
response: {
200: z.object({
folder: SecretFoldersSchema
folder: SecretFoldersSchema.extend({
environment: z.object({
envId: z.string(),
envName: z.string(),
envSlug: z.string()
}),
path: z.string(),
projectId: z.string()
})
})
}
},

View File

@@ -312,6 +312,64 @@ export const registerSecretImportRouter = async (server: FastifyZodProvider) =>
}
});
server.route({
url: "/:secretImportId",
method: "GET",
config: {
rateLimit: readLimit
},
schema: {
description: "Get single secret import",
security: [
{
bearerAuth: []
}
],
params: z.object({
secretImportId: z.string().trim().describe(SECRET_IMPORTS.GET.secretImportId)
}),
response: {
200: z.object({
secretImport: SecretImportsSchema.omit({ importEnv: true }).extend({
environment: z.object({
id: z.string(),
name: z.string(),
slug: z.string()
}),
projectId: z.string(),
importEnv: z.object({ name: z.string(), slug: z.string(), id: z.string() }),
secretPath: z.string()
})
})
}
},
onRequest: verifyAuth([AuthMode.JWT, AuthMode.API_KEY, AuthMode.SERVICE_TOKEN, AuthMode.IDENTITY_ACCESS_TOKEN]),
handler: async (req) => {
const secretImport = await server.services.secretImport.getImportById({
actorId: req.permission.id,
actor: req.permission.type,
actorAuthMethod: req.permission.authMethod,
actorOrgId: req.permission.orgId,
id: req.params.secretImportId
});
await server.services.auditLog.createAuditLog({
...req.auditLogInfo,
projectId: secretImport.projectId,
event: {
type: EventType.GET_SECRET_IMPORT,
metadata: {
secretImportId: secretImport.id,
folderId: secretImport.folderId
}
}
});
return { secretImport };
}
});
server.route({
url: "/secrets",
method: "GET",

View File

@@ -0,0 +1,169 @@
import { ForbiddenError } from "@casl/ability";
import { FastifyRequest } from "fastify";
import { TPermissionServiceFactory } from "@app/ee/services/permission/permission-service";
import { ProjectPermissionCmekActions, ProjectPermissionSub } from "@app/ee/services/permission/project-permission";
import { BadRequestError, NotFoundError } from "@app/lib/errors";
import {
TCmekDecryptDTO,
TCmekEncryptDTO,
TCreateCmekDTO,
TListCmeksByProjectIdDTO,
TUpdabteCmekByIdDTO
} from "@app/services/cmek/cmek-types";
import { TKmsKeyDALFactory } from "@app/services/kms/kms-key-dal";
import { TKmsServiceFactory } from "@app/services/kms/kms-service";
type TCmekServiceFactoryDep = {
kmsService: TKmsServiceFactory;
kmsDAL: TKmsKeyDALFactory;
permissionService: TPermissionServiceFactory;
};
export type TCmekServiceFactory = ReturnType<typeof cmekServiceFactory>;
export const cmekServiceFactory = ({ kmsService, kmsDAL, permissionService }: TCmekServiceFactoryDep) => {
const createCmek = async ({ projectId, ...dto }: TCreateCmekDTO, actor: FastifyRequest["permission"]) => {
const { permission } = await permissionService.getProjectPermission(
actor.type,
actor.id,
projectId,
actor.authMethod,
actor.orgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionCmekActions.Create, ProjectPermissionSub.Cmek);
const cmek = await kmsService.generateKmsKey({
...dto,
projectId,
isReserved: false
});
return cmek;
};
const updateCmekById = async ({ keyId, ...data }: TUpdabteCmekByIdDTO, actor: FastifyRequest["permission"]) => {
const key = await kmsDAL.findById(keyId);
if (!key) throw new NotFoundError({ message: "Key not found" });
if (!key.projectId || key.isReserved) throw new BadRequestError({ message: "Key is not customer managed" });
const { permission } = await permissionService.getProjectPermission(
actor.type,
actor.id,
key.projectId,
actor.authMethod,
actor.orgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionCmekActions.Edit, ProjectPermissionSub.Cmek);
const cmek = await kmsDAL.updateById(keyId, data);
return cmek;
};
const deleteCmekById = async (keyId: string, actor: FastifyRequest["permission"]) => {
const key = await kmsDAL.findById(keyId);
if (!key) throw new NotFoundError({ message: "Key not found" });
if (!key.projectId || key.isReserved) throw new BadRequestError({ message: "Key is not customer managed" });
const { permission } = await permissionService.getProjectPermission(
actor.type,
actor.id,
key.projectId,
actor.authMethod,
actor.orgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionCmekActions.Delete, ProjectPermissionSub.Cmek);
const cmek = await kmsDAL.deleteById(keyId);
return cmek;
};
const listCmeksByProjectId = async (
{ projectId, ...filters }: TListCmeksByProjectIdDTO,
actor: FastifyRequest["permission"]
) => {
const { permission } = await permissionService.getProjectPermission(
actor.type,
actor.id,
projectId,
actor.authMethod,
actor.orgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionCmekActions.Read, ProjectPermissionSub.Cmek);
const { keys: cmeks, totalCount } = await kmsDAL.findKmsKeysByProjectId({ projectId, ...filters });
return { cmeks, totalCount };
};
const cmekEncrypt = async ({ keyId, plaintext }: TCmekEncryptDTO, actor: FastifyRequest["permission"]) => {
const key = await kmsDAL.findById(keyId);
if (!key) throw new NotFoundError({ message: "Key not found" });
if (!key.projectId || key.isReserved) throw new BadRequestError({ message: "Key is not customer managed" });
if (key.isDisabled) throw new BadRequestError({ message: "Key is disabled" });
const { permission } = await permissionService.getProjectPermission(
actor.type,
actor.id,
key.projectId,
actor.authMethod,
actor.orgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionCmekActions.Encrypt, ProjectPermissionSub.Cmek);
const encrypt = await kmsService.encryptWithKmsKey({ kmsId: keyId });
const { cipherTextBlob } = await encrypt({ plainText: Buffer.from(plaintext, "base64") });
return cipherTextBlob.toString("base64");
};
const cmekDecrypt = async ({ keyId, ciphertext }: TCmekDecryptDTO, actor: FastifyRequest["permission"]) => {
const key = await kmsDAL.findById(keyId);
if (!key) throw new NotFoundError({ message: "Key not found" });
if (!key.projectId || key.isReserved) throw new BadRequestError({ message: "Key is not customer managed" });
if (key.isDisabled) throw new BadRequestError({ message: "Key is disabled" });
const { permission } = await permissionService.getProjectPermission(
actor.type,
actor.id,
key.projectId,
actor.authMethod,
actor.orgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionCmekActions.Decrypt, ProjectPermissionSub.Cmek);
const decrypt = await kmsService.decryptWithKmsKey({ kmsId: keyId });
const plaintextBlob = await decrypt({ cipherTextBlob: Buffer.from(ciphertext, "base64") });
return plaintextBlob.toString("base64");
};
return {
createCmek,
updateCmekById,
deleteCmekById,
listCmeksByProjectId,
cmekEncrypt,
cmekDecrypt
};
};
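`cmekEncrypt`/`cmekDecrypt` pass buffers through `kmsService` and convert to and from base64 at the boundary. A minimal sketch of that round trip, assuming AES-256-GCM with a random IV and auth tag prepended to the blob (the real `encryptWithKmsKey`/`decryptWithKmsKey` layout may differ):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Assumption: 32-byte key, 12-byte IV, 16-byte GCM auth tag, blob = iv || tag || data.
const key = randomBytes(32);

const encrypt = (plainText: Buffer) => {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plainText), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), data]); // cipherTextBlob
};

const decrypt = (blob: Buffer) => {
  const iv = blob.subarray(0, 12);
  const tag = blob.subarray(12, 28);
  const data = blob.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(data), decipher.final()]);
};

// Mirrors the service's contract: base64 in, base64 out.
const ciphertext = encrypt(Buffer.from("c2VjcmV0", "base64")).toString("base64");
const plaintext = decrypt(Buffer.from(ciphertext, "base64")).toString("base64"); // "c2VjcmV0"
console.log(plaintext);
```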

View File

@@ -0,0 +1,40 @@
import { SymmetricEncryption } from "@app/lib/crypto/cipher";
import { OrderByDirection } from "@app/lib/types";
export type TCreateCmekDTO = {
orgId: string;
projectId: string;
name: string;
description?: string;
encryptionAlgorithm: SymmetricEncryption;
};
export type TUpdabteCmekByIdDTO = {
keyId: string;
name?: string;
isDisabled?: boolean;
description?: string;
};
export type TListCmeksByProjectIdDTO = {
projectId: string;
offset?: number;
limit?: number;
orderBy?: CmekOrderBy;
orderDirection?: OrderByDirection;
search?: string;
};
export type TCmekEncryptDTO = {
keyId: string;
plaintext: string;
};
export type TCmekDecryptDTO = {
keyId: string;
ciphertext: string;
};
export enum CmekOrderBy {
Name = "name"
}

View File

@@ -1,11 +1,11 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { TableName, TIdentityOrgMemberships } from "@app/db/schemas";
import { TableName, TIdentityOrgMemberships, TOrgRoles } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols, sqlNestRelationships } from "@app/lib/knex";
import { OrderByDirection } from "@app/lib/types";
import { TListOrgIdentitiesByOrgIdDTO } from "@app/services/identity/identity-types";
import { OrgIdentityOrderBy, TListOrgIdentitiesByOrgIdDTO } from "@app/services/identity/identity-types";
export type TIdentityOrgDALFactory = ReturnType<typeof identityOrgDALFactory>;
@@ -33,7 +33,7 @@ export const identityOrgDALFactory = (db: TDbClient) => {
{
limit,
offset = 0,
orderBy,
orderBy = OrgIdentityOrderBy.Name,
orderDirection = OrderByDirection.ASC,
search,
...filter
@@ -42,26 +42,50 @@ export const identityOrgDALFactory = (db: TDbClient) => {
tx?: Knex
) => {
try {
const paginatedFetchIdentity = (tx || db.replicaNode())(TableName.Identity)
.where((queryBuilder) => {
if (limit) {
void queryBuilder.offset(offset).limit(limit);
}
})
.as(TableName.Identity);
const query = (tx || db.replicaNode())(TableName.IdentityOrgMembership)
const paginatedIdentity = (tx || db.replicaNode())(TableName.Identity)
.join(
TableName.IdentityOrgMembership,
`${TableName.IdentityOrgMembership}.identityId`,
`${TableName.Identity}.id`
)
.orderBy(`${TableName.Identity}.${orderBy}`, orderDirection)
.select(
selectAllTableCols(TableName.IdentityOrgMembership),
db.ref("name").withSchema(TableName.Identity).as("identityName"),
db.ref("authMethod").withSchema(TableName.Identity).as("identityAuthMethod")
)
.where(filter)
.join<Awaited<typeof paginatedFetchIdentity>>(paginatedFetchIdentity, (queryBuilder) => {
queryBuilder.on(`${TableName.IdentityOrgMembership}.identityId`, `${TableName.Identity}.id`);
})
.leftJoin(TableName.OrgRoles, `${TableName.IdentityOrgMembership}.roleId`, `${TableName.OrgRoles}.id`)
.as("paginatedIdentity");
if (search?.length) {
void paginatedIdentity.whereILike(`${TableName.Identity}.name`, `%${search}%`);
}
if (limit) {
void paginatedIdentity.offset(offset).limit(limit);
}
// akhilmhdh: refer to this for pagination with multiple left joins
type TSubquery = Awaited<typeof paginatedIdentity>;
const query = (tx || db.replicaNode())
.from<TSubquery[number], TSubquery>(paginatedIdentity)
.leftJoin<TOrgRoles>(TableName.OrgRoles, `paginatedIdentity.roleId`, `${TableName.OrgRoles}.id`)
.leftJoin(TableName.IdentityMetadata, (queryBuilder) => {
void queryBuilder
.on(`${TableName.IdentityOrgMembership}.identityId`, `${TableName.IdentityMetadata}.identityId`)
.andOn(`${TableName.IdentityOrgMembership}.orgId`, `${TableName.IdentityMetadata}.orgId`);
.on(`paginatedIdentity.identityId`, `${TableName.IdentityMetadata}.identityId`)
.andOn(`paginatedIdentity.orgId`, `${TableName.IdentityMetadata}.orgId`);
})
.select(selectAllTableCols(TableName.IdentityOrgMembership))
.select(
db.ref("id").withSchema("paginatedIdentity"),
db.ref("role").withSchema("paginatedIdentity"),
db.ref("roleId").withSchema("paginatedIdentity"),
db.ref("orgId").withSchema("paginatedIdentity"),
db.ref("createdAt").withSchema("paginatedIdentity"),
db.ref("updatedAt").withSchema("paginatedIdentity"),
db.ref("identityId").withSchema("paginatedIdentity"),
db.ref("identityName").withSchema("paginatedIdentity"),
db.ref("identityAuthMethod").withSchema("paginatedIdentity")
)
// cr stands for custom role
.select(db.ref("id").as("crId").withSchema(TableName.OrgRoles))
.select(db.ref("name").as("crName").withSchema(TableName.OrgRoles))
@@ -69,32 +93,13 @@ export const identityOrgDALFactory = (db: TDbClient) => {
.select(db.ref("description").as("crDescription").withSchema(TableName.OrgRoles))
.select(db.ref("permissions").as("crPermission").withSchema(TableName.OrgRoles))
.select(db.ref("id").as("identityId").withSchema(TableName.Identity))
.select(
db.ref("name").as("identityName").withSchema(TableName.Identity),
db.ref("authMethod").as("identityAuthMethod").withSchema(TableName.Identity)
)
.select(
db.ref("id").withSchema(TableName.IdentityMetadata).as("metadataId"),
db.ref("key").withSchema(TableName.IdentityMetadata).as("metadataKey"),
db.ref("value").withSchema(TableName.IdentityMetadata).as("metadataValue")
);
if (orderBy) {
switch (orderBy) {
case "name":
void query.orderBy(`${TableName.Identity}.${orderBy}`, orderDirection);
break;
case "role":
void query.orderBy(`${TableName.IdentityOrgMembership}.${orderBy}`, orderDirection);
break;
default:
// do nothing
}
}
if (search?.length) {
void query.whereILike(`${TableName.Identity}.name`, `%${search}%`);
if (orderBy === OrgIdentityOrderBy.Name) {
void query.orderBy("identityName", orderDirection);
}
const docs = await query;

View File

@@ -41,6 +41,6 @@ export type TListOrgIdentitiesByOrgIdDTO = {
} & TOrgPermission;
export enum OrgIdentityOrderBy {
Name = "name",
Role = "role"
Name = "name"
// Role = "role"
}

View File

@@ -455,6 +455,31 @@ const getAppsCircleCI = async ({ accessToken }: { accessToken: string }) => {
return apps;
};
/**
* Return list of secret scopes for the Databricks integration
*/
const getAppsDatabricks = async ({ url, accessToken }: { url?: string | null; accessToken: string }) => {
const databricksApiUrl = `${url}/api`;
const res = await request.get<{ scopes: { name: string; backend_type: string }[] }>(
`${databricksApiUrl}/2.0/secrets/scopes/list`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
);
const scopes =
res.data?.scopes?.map((a) => ({
name: a.name, // name maps to unique scope name in Databricks
backend_type: a.backend_type
})) ?? [];
return scopes;
};
const getAppsTravisCI = async ({ accessToken }: { accessToken: string }) => {
const res = (
await request.get<{ id: string; slug: string }[]>(`${IntegrationUrls.TRAVISCI_API_URL}/repos`, {
@@ -1104,6 +1129,12 @@ export const getApps = async ({
accessToken
});
case Integrations.DATABRICKS:
return getAppsDatabricks({
url,
accessToken
});
case Integrations.LARAVELFORGE:
return getAppsLaravelForge({
accessToken,

View File

@@ -15,6 +15,7 @@ export enum Integrations {
FLYIO = "flyio",
LARAVELFORGE = "laravel-forge",
CIRCLECI = "circleci",
DATABRICKS = "databricks",
TRAVISCI = "travisci",
TEAMCITY = "teamcity",
SUPABASE = "supabase",
@@ -73,6 +74,7 @@ export enum IntegrationUrls {
RAILWAY_API_URL = "https://backboard.railway.app/graphql/v2",
FLYIO_API_URL = "https://api.fly.io/graphql",
CIRCLECI_API_URL = "https://circleci.com/api",
DATABRICKS_API_URL = "https://xxxx.com/api",
TRAVISCI_API_URL = "https://api.travis-ci.com",
SUPABASE_API_URL = "https://api.supabase.com",
LARAVELFORGE_API_URL = "https://forge.laravel.com",
@@ -210,6 +212,15 @@ export const getIntegrationOptions = async () => {
clientId: "",
docsLink: ""
},
{
name: "Databricks",
slug: "databricks",
image: "Databricks.png",
isAvailable: true,
type: "pat",
clientId: "",
docsLink: ""
},
{
name: "GitLab",
slug: "gitlab",

View File

@@ -2085,6 +2085,80 @@ const syncSecretsCircleCI = async ({
);
};
/**
* Sync/push [secrets] to Databricks project
*/
const syncSecretsDatabricks = async ({
integration,
integrationAuth,
secrets,
accessToken
}: {
integration: TIntegrations;
integrationAuth: TIntegrationAuths;
secrets: Record<string, { value: string; comment?: string }>;
accessToken: string;
}) => {
const databricksApiUrl = `${integrationAuth.url}/api`;
// sync secrets to Databricks
await Promise.all(
Object.keys(secrets).map(async (key) =>
request.post(
`${databricksApiUrl}/2.0/secrets/put`,
{
scope: integration.app,
key,
string_value: secrets[key].value
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Accept-Encoding": "application/json"
}
}
)
)
);
// get secrets from Databricks
const getSecretsRes = (
await request.get<{ secrets: { key: string; last_updated_timestamp: number }[] }>(
`${databricksApiUrl}/2.0/secrets/list`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/json"
},
params: {
scope: integration.app
}
}
)
).data.secrets;
// delete secrets from Databricks
await Promise.all(
getSecretsRes.map(async (sec) => {
if (!(sec.key in secrets)) {
return request.post(
`${databricksApiUrl}/2.0/secrets/delete`,
{
scope: integration.app,
key: sec.key
},
{
headers: {
Authorization: `Bearer ${accessToken}`,
"Content-Type": "application/json"
}
}
);
}
})
);
};
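The delete pass above reconciles remote state against local state: any key present in the Databricks scope but absent from the pushed secrets gets removed. A minimal sketch of that set-difference step (the helper name is illustrative, not an Infisical export):

```typescript
// Shape returned by Databricks' secrets/list endpoint (subset)
type RemoteSecret = { key: string; last_updated_timestamp: number };

// Given the remote listing and the local secrets map, return the keys
// that should be deleted from Databricks to make remote match local.
const keysToDelete = (
  remote: RemoteSecret[],
  local: Record<string, { value: string; comment?: string }>
): string[] => remote.map((s) => s.key).filter((k) => !(k in local));
```

Each surviving key was already overwritten by the `secrets/put` calls, so put-then-delete leaves the scope exactly mirroring the local secrets.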
/**
* Sync/push [secrets] to TravisCI project
*/
@@ -4021,6 +4095,14 @@ export const syncIntegrationSecrets = async ({
accessToken
});
break;
case Integrations.DATABRICKS:
await syncSecretsDatabricks({
integration,
integrationAuth,
secrets,
accessToken
});
break;
case Integrations.LARAVELFORGE:
await syncSecretsLaravelForge({
integration,

View File

@@ -0,0 +1,11 @@
import { SymmetricEncryption } from "@app/lib/crypto/cipher";
export const getByteLengthForAlgorithm = (encryptionAlgorithm: SymmetricEncryption) => {
switch (encryptionAlgorithm) {
case SymmetricEncryption.AES_GCM_128:
return 16;
case SymmetricEncryption.AES_GCM_256:
default:
return 32;
}
};
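As a quick illustration of how this helper is consumed: the key material generated for a KMS key must match the byte length of the chosen algorithm (16 bytes for AES-128-GCM, 32 for AES-256-GCM). A hypothetical mirror of the mapping, with the enum replaced by string literals for self-containment:

```typescript
import { randomBytes } from "crypto";

// Illustrative stand-in for getByteLengthForAlgorithm: AES-GCM key sizes.
const keyBytes = (algorithm: "aes-128-gcm" | "aes-256-gcm"): number =>
  algorithm === "aes-128-gcm" ? 16 : 32;

// Key material sized to the algorithm, as generateKmsKey does with
// randomSecureBytes(getByteLengthForAlgorithm(encryptionAlgorithm)).
const material = randomBytes(keyBytes("aes-256-gcm"));
```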

View File

@@ -1,9 +1,11 @@
import { Knex } from "knex";
import { TDbClient } from "@app/db";
import { KmsKeysSchema, TableName } from "@app/db/schemas";
import { KmsKeysSchema, TableName, TInternalKms, TKmsKeys } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
import { ormify, selectAllTableCols } from "@app/lib/knex";
import { OrderByDirection } from "@app/lib/types";
import { CmekOrderBy, TListCmeksByProjectIdDTO } from "@app/services/cmek/cmek-types";
export type TKmsKeyDALFactory = ReturnType<typeof kmskeyDALFactory>;
@@ -71,5 +73,50 @@ export const kmskeyDALFactory = (db: TDbClient) => {
}
};
return { ...kmsOrm, findByIdWithAssociatedKms };
const findKmsKeysByProjectId = async (
{
projectId,
offset = 0,
limit,
orderBy = CmekOrderBy.Name,
orderDirection = OrderByDirection.ASC,
search
}: TListCmeksByProjectIdDTO,
tx?: Knex
) => {
try {
const query = (tx || db.replicaNode())(TableName.KmsKey)
.where("projectId", projectId)
.where((qb) => {
if (search) {
void qb.whereILike("name", `%${search}%`);
}
})
.join(TableName.InternalKms, `${TableName.InternalKms}.kmsKeyId`, `${TableName.KmsKey}.id`)
.select<
(TKmsKeys &
Pick<TInternalKms, "version" | "encryptionAlgorithm"> & {
total_count: number;
})[]
>(
selectAllTableCols(TableName.KmsKey),
db.raw(`count(*) OVER() as total_count`),
db.ref("encryptionAlgorithm").withSchema(TableName.InternalKms),
db.ref("version").withSchema(TableName.InternalKms)
)
.orderBy(orderBy, orderDirection);
if (limit) {
void query.limit(limit).offset(offset);
}
const data = await query;
return { keys: data, totalCount: Number(data?.[0]?.total_count ?? 0) };
} catch (error) {
throw new DatabaseError({ error, name: "Find kms keys by project id" });
}
};
return { ...kmsOrm, findByIdWithAssociatedKms, findKmsKeysByProjectId };
};

View File

@@ -17,6 +17,7 @@ import { generateHash } from "@app/lib/crypto/encryption";
import { BadRequestError, ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
import { logger } from "@app/lib/logger";
import { alphaNumericNanoId } from "@app/lib/nanoid";
import { getByteLengthForAlgorithm } from "@app/services/kms/kms-fns";
import { TOrgDALFactory } from "../org/org-dal";
import { TProjectDALFactory } from "../project/project-dal";
@@ -71,17 +72,29 @@ export const kmsServiceFactory = ({
* This function is responsible for generating the Infisical internal KMS for various entities
* Like for secret manager, cert manager or for organization
*/
const generateKmsKey = async ({ orgId, isReserved = true, tx, slug }: TGenerateKMSDTO) => {
const generateKmsKey = async ({
orgId,
isReserved = true,
tx,
name,
projectId,
encryptionAlgorithm = SymmetricEncryption.AES_GCM_256,
description
}: TGenerateKMSDTO) => {
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const kmsKeyMaterial = randomSecureBytes(32);
const kmsKeyMaterial = randomSecureBytes(getByteLengthForAlgorithm(encryptionAlgorithm));
const encryptedKeyMaterial = cipher.encrypt(kmsKeyMaterial, ROOT_ENCRYPTION_KEY);
const sanitizedSlug = slug ? slugify(slug) : slugify(alphaNumericNanoId(8).toLowerCase());
const sanitizedName = name ? slugify(name) : slugify(alphaNumericNanoId(8).toLowerCase());
const dbQuery = async (db: Knex) => {
const kmsDoc = await kmsDAL.create(
{
slug: sanitizedSlug,
name: sanitizedName,
orgId,
isReserved
isReserved,
projectId,
description
},
db
);
@@ -90,7 +103,7 @@ export const kmsServiceFactory = ({
{
version: 1,
encryptedKey: encryptedKeyMaterial,
encryptionAlgorithm: SymmetricEncryption.AES_GCM_256,
encryptionAlgorithm,
kmsKeyId: kmsDoc.id
},
db
@@ -286,12 +299,13 @@ export const kmsServiceFactory = ({
}
// internal KMS
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const kmsKey = cipher.decrypt(kmsDoc.internalKms?.encryptedKey as Buffer, ROOT_ENCRYPTION_KEY);
const keyCipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const dataCipher = symmetricCipherService(kmsDoc.internalKms?.encryptionAlgorithm as SymmetricEncryption);
const kmsKey = keyCipher.decrypt(kmsDoc.internalKms?.encryptedKey as Buffer, ROOT_ENCRYPTION_KEY);
return ({ cipherTextBlob: versionedCipherTextBlob }: Pick<TDecryptWithKmsDTO, "cipherTextBlob">) => {
const cipherTextBlob = versionedCipherTextBlob.subarray(0, -KMS_VERSION_BLOB_LENGTH);
const decryptedBlob = cipher.decrypt(cipherTextBlob, kmsKey);
const decryptedBlob = dataCipher.decrypt(cipherTextBlob, kmsKey);
return Promise.resolve(decryptedBlob);
};
};
@@ -347,11 +361,11 @@ export const kmsServiceFactory = ({
}
// internal KMS
// akhilmhdh: as more encryption algorithms are added, do a check here on kmsDoc.encryptionAlgorithm
const cipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const keyCipher = symmetricCipherService(SymmetricEncryption.AES_GCM_256);
const dataCipher = symmetricCipherService(kmsDoc.internalKms?.encryptionAlgorithm as SymmetricEncryption);
return ({ plainText }: Pick<TEncryptWithKmsDTO, "plainText">) => {
const kmsKey = cipher.decrypt(kmsDoc.internalKms?.encryptedKey as Buffer, ROOT_ENCRYPTION_KEY);
const encryptedPlainTextBlob = cipher.encrypt(plainText, kmsKey);
const kmsKey = keyCipher.decrypt(kmsDoc.internalKms?.encryptedKey as Buffer, ROOT_ENCRYPTION_KEY);
const encryptedPlainTextBlob = dataCipher.encrypt(plainText, kmsKey);
// Buffer#1 encrypted text + Buffer#2 version number
const versionBlob = Buffer.from(KMS_VERSION, "utf8"); // length is 3
@@ -767,8 +781,8 @@ export const kmsServiceFactory = ({
message: "KMS not found"
});
}
const { id, slug, orgId, isExternal } = kms;
return { id, slug, orgId, isExternal };
const { id, name, orgId, isExternal } = kms;
return { id, name, orgId, isExternal };
};
// akhilmhdh: a copy of this is made in migrations/utils/kms
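The version-blob convention handled above (the encrypt path appends a fixed 3-byte version marker; the decrypt path strips it with `subarray(0, -KMS_VERSION_BLOB_LENGTH)` before decrypting) can be sketched in isolation. The constant values here are illustrative:

```typescript
// Illustrative constants: a 3-byte version marker trailing the ciphertext.
const KMS_VERSION = "v01"; // length is 3
const KMS_VERSION_BLOB_LENGTH = 3;

// Encrypt path: ciphertext blob = Buffer#1 encrypted text + Buffer#2 version.
const appendVersion = (encrypted: Buffer): Buffer =>
  Buffer.concat([encrypted, Buffer.from(KMS_VERSION, "utf8")]);

// Decrypt path: drop the trailing version bytes to recover the raw ciphertext.
const stripVersion = (versioned: Buffer): Buffer =>
  versioned.subarray(0, -KMS_VERSION_BLOB_LENGTH);
```

Carrying the version alongside the ciphertext is what lets the decrypt path pick the right cipher per blob as new algorithms are introduced.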

View File

@@ -1,5 +1,7 @@
import { Knex } from "knex";
import { SymmetricEncryption } from "@app/lib/crypto/cipher";
export enum KmsDataKey {
Organization,
SecretManager
@@ -22,8 +24,11 @@ export type TEncryptWithKmsDataKeyDTO =
export type TGenerateKMSDTO = {
orgId: string;
projectId?: string;
encryptionAlgorithm?: SymmetricEncryption;
isReserved?: boolean;
slug?: string;
name?: string;
description?: string;
tx?: Knex;
};

View File

@@ -65,10 +65,16 @@ export const projectEnvDALFactory = (db: TDbClient) => {
}
};
const shiftPositions = async (projectId: string, pos: number, tx?: Knex) => {
// Shift all positions >= the new position up by 1
await (tx || db)(TableName.Environment).where({ projectId }).where("position", ">=", pos).increment("position", 1);
};
return {
...projectEnvOrm,
findBySlugs,
findLastEnvPosition,
updateAllPosition
updateAllPosition,
shiftPositions
};
};

View File

@@ -37,6 +37,7 @@ export const projectEnvServiceFactory = ({
actor,
actorOrgId,
actorAuthMethod,
position,
name,
slug
}: TCreateEnvDTO) => {
@@ -83,9 +84,25 @@ export const projectEnvServiceFactory = ({
}
const env = await projectEnvDAL.transaction(async (tx) => {
if (position !== undefined) {
// Check if there's an environment at the specified position
const existingEnvWithPosition = await projectEnvDAL.findOne({ projectId, position }, tx);
// If there is, then shift positions
if (existingEnvWithPosition) {
await projectEnvDAL.shiftPositions(projectId, position, tx);
}
const doc = await projectEnvDAL.create({ slug, name, projectId, position }, tx);
await folderDAL.create({ name: "root", parentId: null, envId: doc.id, version: 1 }, tx);
return doc;
}
// If no position is specified, add to the end
const lastPos = await projectEnvDAL.findLastEnvPosition(projectId, tx);
const doc = await projectEnvDAL.create({ slug, name, projectId, position: lastPos + 1 }, tx);
await folderDAL.create({ name: "root", parentId: null, envId: doc.id, version: 1 }, tx);
return doc;
});
@@ -150,7 +167,11 @@ export const projectEnvServiceFactory = ({
const env = await projectEnvDAL.transaction(async (tx) => {
if (position) {
await projectEnvDAL.updateAllPosition(projectId, oldEnv.position, position, tx);
const existingEnvWithPosition = await projectEnvDAL.findOne({ projectId, position }, tx);
if (existingEnvWithPosition && existingEnvWithPosition.id !== oldEnv.id) {
await projectEnvDAL.updateAllPosition(projectId, oldEnv.position, position, tx);
}
}
return projectEnvDAL.updateById(oldEnv.id, { name, slug, position }, tx);
});
@@ -199,7 +220,6 @@ export const projectEnvServiceFactory = ({
name: "DeleteEnvironment"
});
await projectEnvDAL.updateAllPosition(projectId, doc.position, -1, tx);
return doc;
});
@@ -215,29 +235,26 @@ export const projectEnvServiceFactory = ({
}
};
const getEnvironmentById = async ({ projectId, actor, actorId, actorOrgId, actorAuthMethod, id }: TGetEnvDTO) => {
const getEnvironmentById = async ({ actor, actorId, actorOrgId, actorAuthMethod, id }: TGetEnvDTO) => {
const environment = await projectEnvDAL.findById(id);
if (!environment) {
throw new NotFoundError({
message: "Environment does not exist"
});
}
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
projectId,
environment.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Environments);
const [env] = await projectEnvDAL.find({
id,
projectId
});
if (!env) {
throw new NotFoundError({
message: "Environment does not exist"
});
}
return env;
return environment;
};
return {

View File

@@ -3,6 +3,7 @@ import { TProjectPermission } from "@app/lib/types";
export type TCreateEnvDTO = {
name: string;
slug: string;
position?: number;
} & TProjectPermission;
export type TUpdateEnvDTO = {
@@ -23,4 +24,4 @@ export type TReorderEnvDTO = {
export type TGetEnvDTO = {
id: string;
} & TProjectPermission;
} & Omit<TProjectPermission, "projectId">;

View File

@@ -502,12 +502,21 @@ export const secretFolderServiceFactory = ({
const getFolderById = async ({ actor, actorId, actorOrgId, actorAuthMethod, id }: TGetFolderByIdDTO) => {
const folder = await folderDAL.findById(id);
if (!folder) throw new NotFoundError({ message: "folder not found" });
if (!folder) throw new NotFoundError({ message: "Folder not found" });
// folder list is allowed to be read by anyone
// permission check to determine whether the user has access
await permissionService.getProjectPermission(actor, actorId, folder.projectId, actorAuthMethod, actorOrgId);
return folder;
const [folderWithPath] = await folderDAL.findSecretPathByFolderIds(folder.projectId, [folder.id]);
if (!folderWithPath) {
throw new NotFoundError({ message: "Folder path not found" });
}
return {
...folder,
path: folderWithPath.path
};
};
return {

View File

@@ -97,6 +97,34 @@ export const secretImportDALFactory = (db: TDbClient) => {
}
};
const findById = async (id: string, tx?: Knex) => {
try {
const doc = await (tx || db.replicaNode())(TableName.SecretImport)
.where({ [`${TableName.SecretImport}.id` as "id"]: id })
.join(TableName.Environment, `${TableName.SecretImport}.importEnv`, `${TableName.Environment}.id`)
.select(
db.ref("*").withSchema(TableName.SecretImport) as unknown as keyof TSecretImports,
db.ref("slug").withSchema(TableName.Environment),
db.ref("name").withSchema(TableName.Environment),
db.ref("id").withSchema(TableName.Environment).as("envId")
)
.first();
if (!doc) {
return null;
}
const { envId, slug, name, ...el } = doc;
return {
...el,
importEnv: { id: envId, slug, name }
};
} catch (error) {
throw new DatabaseError({ error, name: "Find secret imports" });
}
};
const getProjectImportCount = async (
{ search, ...filter }: Partial<TSecretImports & { projectId: string; search?: string }>,
tx?: Knex
@@ -144,6 +172,7 @@ export const secretImportDALFactory = (db: TDbClient) => {
return {
...secretImportOrm,
find,
findById,
findByFolderIds,
findLastImportPosition,
updateAllPosition,

View File

@@ -24,6 +24,7 @@ import { fnSecretsFromImports, fnSecretsV2FromImports } from "./secret-import-fn
import {
TCreateSecretImportDTO,
TDeleteSecretImportDTO,
TGetSecretImportByIdDTO,
TGetSecretImportsDTO,
TGetSecretsFromImportDTO,
TResyncSecretImportReplicationDTO,
@@ -455,6 +456,64 @@ export const secretImportServiceFactory = ({
return secImports;
};
const getImportById = async ({
actor,
actorId,
actorAuthMethod,
actorOrgId,
id: importId
}: TGetSecretImportByIdDTO) => {
const importDoc = await secretImportDAL.findById(importId);
if (!importDoc) {
throw new NotFoundError({ message: "Secret import not found" });
}
// the folder to import into
const folder = await folderDAL.findById(importDoc.folderId);
if (!folder) throw new NotFoundError({ message: "Secret import folder not found" });
// the folder to import into, with path
const [folderWithPath] = await folderDAL.findSecretPathByFolderIds(folder.projectId, [folder.id]);
if (!folderWithPath) throw new NotFoundError({ message: "Folder path not found" });
const { permission } = await permissionService.getProjectPermission(
actor,
actorId,
folder.projectId,
actorAuthMethod,
actorOrgId
);
ForbiddenError.from(permission).throwUnlessCan(
ProjectPermissionActions.Read,
subject(ProjectPermissionSub.Secrets, {
environment: folder.environment.envSlug,
secretPath: folderWithPath.path
})
);
const importIntoEnv = await projectEnvDAL.findOne({
projectId: folder.projectId,
slug: folder.environment.envSlug
});
if (!importIntoEnv) throw new NotFoundError({ message: "Environment to import into not found" });
return {
...importDoc,
projectId: folder.projectId,
secretPath: folderWithPath.path,
environment: {
id: importIntoEnv.id,
slug: importIntoEnv.slug,
name: importIntoEnv.name
}
};
};
const getSecretsFromImports = async ({
path: secretPath,
environment,
@@ -565,6 +624,7 @@ export const secretImportServiceFactory = ({
updateImport,
deleteImport,
getImports,
getImportById,
getSecretsFromImports,
getRawSecretsFromImports,
resyncSecretImportReplication,

View File

@@ -37,6 +37,10 @@ export type TGetSecretImportsDTO = {
offset?: number;
} & TProjectPermission;
export type TGetSecretImportByIdDTO = {
id: string;
} & Omit<TProjectPermission, "projectId">;
export type TGetSecretsFromImportDTO = {
environment: string;
path: string;

View File

@@ -0,0 +1,4 @@
---
title: "Create Key"
openapi: "POST /api/v1/kms/keys"
---

View File

@@ -0,0 +1,4 @@
---
title: "Decrypt Data"
openapi: "POST /api/v1/kms/keys/{keyId}/decrypt"
---

View File

@@ -0,0 +1,4 @@
---
title: "Delete Key"
openapi: "DELETE /api/v1/kms/keys/{keyId}"
---

View File

@@ -0,0 +1,4 @@
---
title: "Encrypt Data"
openapi: "POST /api/v1/kms/keys/{keyId}/encrypt"
---

View File

@@ -0,0 +1,4 @@
---
title: "List Keys"
openapi: "GET /api/v1/kms/keys"
---

View File

@@ -0,0 +1,4 @@
---
title: "Update Key"
openapi: "PATCH /api/v1/kms/keys/{keyId}"
---

View File

@@ -7,7 +7,7 @@ description: "Learn how to authenticate with Infisical for EC2 instances, Lambda
## Diagram
The following sequence digram illustrates the AWS Auth workflow for authenticating AWS IAM principals with Infisical.
The following sequence diagram illustrates the AWS Auth workflow for authenticating AWS IAM principals with Infisical.
```mermaid
sequenceDiagram

View File

@@ -7,7 +7,7 @@ description: "Learn how to authenticate with Infisical for services on Azure"
## Diagram
The following sequence digram illustrates the Azure Auth workflow for authenticating Azure [service principals](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser) with Infisical.
The following sequence diagram illustrates the Azure Auth workflow for authenticating Azure [service principals](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser) with Infisical.
```mermaid
sequenceDiagram

View File

@@ -13,7 +13,7 @@ description: "Learn how to authenticate with Infisical for services on Google Cl
## Diagram
The following sequence digram illustrates the GCP ID Token Auth workflow for authenticating GCP resources with Infisical.
The following sequence diagram illustrates the GCP ID Token Auth workflow for authenticating GCP resources with Infisical.
```mermaid
sequenceDiagram
@@ -182,7 +182,7 @@ access the Infisical API using the GCP ID Token authentication method.
## Diagram
The following sequence digram illustrates the GCP IAM Auth workflow for authenticating GCP IAM service accounts with Infisical.
The following sequence diagram illustrates the GCP IAM Auth workflow for authenticating GCP IAM service accounts with Infisical.
```mermaid
sequenceDiagram

View File

@@ -7,7 +7,7 @@ description: "Learn how to authenticate with Infisical in Kubernetes"
## Diagram
The following sequence digram illustrates the Kubernetes Auth workflow for authenticating applications running in pods with Infisical.
The following sequence diagram illustrates the Kubernetes Auth workflow for authenticating applications running in pods with Infisical.
```mermaid
sequenceDiagram

View File

@@ -0,0 +1,174 @@
---
title: CircleCI
description: "Learn how to authenticate CircleCI jobs with Infisical using OpenID Connect (OIDC)."
---
**OIDC Auth** is a platform-agnostic JWT-based authentication method that can be used to authenticate from any platform or environment using an identity provider with OpenID Connect.
## Diagram
The following sequence diagram illustrates the OIDC Auth workflow for authenticating CircleCI jobs with Infisical.
```mermaid
sequenceDiagram
participant Client as CircleCI Job
participant Idp as CircleCI Identity Provider
participant Infis as Infisical
Idp->>Client: Step 1: Inject JWT with verifiable claims
Note over Client,Infis: Step 2: Login Operation
Client->>Infis: Send signed JWT to /api/v1/auth/oidc-auth/login
Note over Infis,Idp: Step 3: Query verification
Infis->>Idp: Request JWT public key using OIDC Discovery
Idp-->>Infis: Return public key
Note over Infis: Step 4: JWT validation
Infis->>Client: Return short-lived access token
Note over Client,Infis: Step 5: Access Infisical API with Token
Client->>Infis: Make authenticated requests using the short-lived access token
```
## Concept
At a high-level, Infisical authenticates a client by verifying the JWT and checking that it meets specific requirements (e.g. it is issued by a trusted identity provider) at the `/api/v1/auth/oidc-auth/login` endpoint. If successful,
then Infisical returns a short-lived access token that can be used to make authenticated requests to the Infisical API.
To be more specific:
1. CircleCI provides the running job with a valid OIDC token specific to the execution.
2. The CircleCI OIDC token is sent to Infisical at the `/api/v1/auth/oidc-auth/login` endpoint.
3. Infisical fetches the public key that was used to sign the identity token provided by CircleCI.
4. Infisical validates the JWT using the public key provided by the identity provider and checks that the subject, audience, and claims of the token match the set criteria.
5. If all is well, Infisical returns a short-lived access token that CircleCI jobs can use to make authenticated requests to the Infisical API.
<Note>Infisical needs network-level access to the CircleCI servers.</Note>
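The token exchange in steps 2–5 can be sketched as a small helper that builds the login request. The request body shape `{ identityId, jwt }` is an assumption here; verify it against the Infisical API reference before relying on it:

```typescript
// Build the login exchange request: trade the CircleCI-injected OIDC token
// for a short-lived Infisical access token. Field names are assumed, not
// taken from an official SDK.
const loginRequest = (identityId: string, jwt: string) => ({
  url: "https://app.infisical.com/api/v1/auth/oidc-auth/login",
  body: JSON.stringify({ identityId, jwt }),
});

// In a CircleCI job (sketch):
// const { url, body } = loginRequest(identityId, process.env.CIRCLE_OIDC_TOKEN!);
// const res = await fetch(url, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body,
// });
// const { accessToken } = await res.json(); // use for subsequent API calls
```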
## Guide
In the following steps, we explore how to create and use identities to access the Infisical API using the OIDC Auth authentication method.
<Steps>
<Step title="Creating an identity">
To create an identity, head to your Organization Settings > Access Control > Machine Identities and press **Create identity**.
![identities organization](/images/platform/identities/identities-org.png)
When creating an identity, you specify an organization level [role](/documentation/platform/role-based-access-controls) for it to assume; you can configure roles in Organization Settings > Access Control > Organization Roles.
![identities organization create](/images/platform/identities/identities-org-create.png)
Now input a few details for your new identity. Here's some guidance for each field:
- Name (required): A friendly name for the identity.
- Role (required): A role from the **Organization Roles** tab for the identity to assume. The organization role assigned will determine what organization level resources this identity can have access to.
Once you've created an identity, you'll be redirected to a page where you can manage the identity.
![identities page](/images/platform/identities/identities-page.png)
Since the identity has been configured with Universal Auth by default, you should re-configure it to use OIDC Auth instead. To do this, press to edit the **Authentication** section,
remove the existing Universal Auth configuration, and add a new OIDC Auth configuration onto the identity.
![identities page remove default auth](/images/platform/identities/identities-page-remove-default-auth.png)
![identities create oidc auth method](/images/platform/identities/identities-org-create-oidc-auth-method.png)
<Warning>Restrict access by configuring the Subject, Audiences, and Claims fields</Warning>
Here's some more guidance on each field:
- OIDC Discovery URL: The URL used to retrieve the OpenID Connect configuration from the identity provider. This will be used to fetch the public key needed for verifying the provided JWT. This should be set to `https://oidc.circleci.com/org/<organization_id>` where `organization_id` refers to the CircleCI organization where the job is being run.
- Issuer: The unique identifier of the identity provider issuing the JWT. This value is used to verify the iss (issuer) claim in the JWT to ensure the token is issued by a trusted provider. This should be set to `https://oidc.circleci.com/org/<organization_id>` as well.
- CA Certificate: The PEM-encoded CA cert for establishing secure communication with the Identity Provider endpoints. This can be left blank.
- Subject: The expected principal that is the subject of the JWT. The format of the sub field for CircleCI OIDC tokens is `org/<organization_id>/project/<project_id>/user/<user_id>` where organization_id, project_id, and user_id are UUIDs that identify the CircleCI organization, project, and user, respectively. The user is the CircleCI user that caused this job to run.
- Audiences: A list of intended recipients. This value is checked against the aud (audience) claim in the token. Set this to the CircleCI `organization_id` corresponding to where the job is running.
- Claims: Additional information or attributes that should be present in the JWT for it to be valid. Refer to CircleCI's [documentation](https://circleci.com/docs/openid-connect-tokens) for the complete list of supported claims.
- Access Token TTL (default is `2592000` equivalent to 30 days): The lifetime for an access token in seconds. This value will be referenced at renewal time.
- Access Token Max TTL (default is `2592000` equivalent to 30 days): The maximum lifetime for an access token in seconds. This value will be referenced at renewal time.
- Access Token Max Number of Uses (default is `0`): The maximum number of times that an access token can be used; a value of `0` implies infinite number of uses.
- Access Token Trusted IPs: The IPs or CIDR ranges that access tokens can be used from. By default, each token is given the `0.0.0.0/0`, allowing usage from any network address.
<Tip>For more details on the appropriate values for the OIDC fields, refer to CircleCI's [documentation](https://circleci.com/docs/openid-connect-tokens). </Tip>
<Info>The `subject`, `audiences`, and `claims` fields support glob pattern matching; however, we highly recommend using hardcoded values whenever possible.</Info>
</Step>
<Step title="Adding an identity to a project">
To enable the identity to access project-level resources such as secrets within a specific project, you should add it to that project.
To do this, head over to the project you want to add the identity to and go to Project Settings > Access Control > Machine Identities and press **Add identity**.
Next, select the identity you want to add to the project and the project level role you want to allow it to assume. The project role assigned will determine what project level resources this identity can have access to.
![identities project](/images/platform/identities/identities-project.png)
![identities project create](/images/platform/identities/identities-project-create.png)
</Step>
<Step title="Using CircleCI OIDC token to authenticate with Infisical">
The following is an example of how to use the `$CIRCLE_OIDC_TOKEN` with the Infisical [terraform provider](https://registry.terraform.io/providers/Infisical/infisical/latest/docs) to manage resources in a CircleCI pipeline.
```yml config.yml
version: 2.1
jobs:
terraform-apply:
docker:
- image: hashicorp/terraform:latest
steps:
- checkout
- run:
command: |
export INFISICAL_AUTH_JWT="$CIRCLE_OIDC_TOKEN"
terraform init
terraform apply -auto-approve
workflows:
version: 2
build-and-test:
jobs:
- terraform-apply
```
The Infisical terraform provider expects the `INFISICAL_AUTH_JWT` environment variable to be set to the CircleCI OIDC token.
```hcl main.tf
terraform {
required_providers {
infisical = {
source = "infisical/infisical"
}
}
}
provider "infisical" {
host = "https://app.infisical.com"
auth = {
oidc = {
identity_id = "f2f5ee4c-6223-461a-87c3-406a6b481462"
}
}
}
resource "infisical_access_approval_policy" "prod-access-approval" {
project_id = "09eda1f8-85a3-47a9-8a6f-e27f133b2a36"
name = "my-approval-policy"
environment_slug = "prod"
secret_path = "/"
approvers = [
{
type = "user"
username = "sheen+200@infisical.com"
},
]
required_approvals = 1
enforcement_level = "soft"
}
```
<Note>
Each identity access token has a time-to-live (TTL) which you can infer from the response of the login operation;
the default TTL is `7200` seconds, which can be adjusted.
If an identity access token expires, it can no longer authenticate with the Infisical API. In this case,
a new access token should be obtained by performing another login operation.
</Note>
</Step>
</Steps>

View File

@@ -7,7 +7,7 @@ description: "Learn how to authenticate to Infisical from any platform or enviro
## Diagram
The following sequence digram illustrates the Token Auth workflow for authenticating clients with Infisical.
The following sequence diagram illustrates the Token Auth workflow for authenticating clients with Infisical.
```mermaid
sequenceDiagram

View File

@@ -7,7 +7,7 @@ description: "Learn how to authenticate to Infisical from any platform or enviro
## Diagram
The following sequence digram illustrates the Universal Auth workflow for authenticating clients with Infisical.
The following sequence diagram illustrates the Universal Auth workflow for authenticating clients with Infisical.
```mermaid
sequenceDiagram

View File

@@ -1,5 +1,5 @@
---
title: "Key Management Service (KMS)"
title: "Key Management Service (KMS) Configuration"
sidebarTitle: "Overview"
description: "Learn how to configure your project's encryption"
---
@@ -25,4 +25,4 @@ For existing projects, you can configure the KMS from the Project Settings page.
## External KMS
Infisical supports the use of external KMS solutions to enhance security and compliance. You can configure your project to use services like [AWS Key Management Service](./aws-kms) for managing encryption.
Infisical supports the use of external KMS solutions to enhance security and compliance. You can configure your project to use services like [AWS Key Management Service](./aws-kms) for managing encryption.

View File

@@ -0,0 +1,208 @@
---
title: "Key Management Service (KMS)"
sidebarTitle: "Key Management (KMS)"
description: "Learn how to manage and use cryptographic keys with Infisical."
---
## Concept
Infisical can be used as a Key Management System (KMS), referred to as Infisical KMS, to centralize management of keys to be used for cryptographic operations like encryption/decryption.
<Note>
Keys managed in KMS are not extractable from the platform. Additionally, data
is never stored when performing cryptographic operations.
</Note>
## Workflow
The typical workflow for using Infisical KMS consists of the following steps:
1. Creating a KMS key. As part of this step, you specify a name for the key and the encryption algorithm meant to be used for it (e.g. `AES-GCM-128`, `AES-GCM-256`).
2. Encryption: To encrypt data, you would make a request to the Infisical KMS API endpoint, specifying the base64-encoded plaintext and the intended key to use for encryption; the API would return the base64-encoded ciphertext.
3. Decryption: To decrypt data, you would make a request to the Infisical KMS API endpoint, specifying the base64-encoded ciphertext and the intended key to use for decryption; the API would return the base64-encoded plaintext.
<Note>
Note that this workflow can be executed via the Infisical UI or manually, such as via the API.
</Note>
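As a local, self-contained illustration of the base64-in/base64-out contract described above, the round trip below mimics it with Node's AES-256-GCM. This is not the service implementation; with Infisical KMS the key never leaves the platform, and the ciphertext framing here (IV + ciphertext + auth tag) is an assumption for the sketch only:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Stand-in for the key held inside the KMS.
const key = randomBytes(32);

// Step 2: base64 plaintext in, base64 ciphertext out.
const encrypt = (plaintextB64: string): string => {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([
    cipher.update(Buffer.from(plaintextB64, "base64")),
    cipher.final(),
  ]);
  // Frame IV and auth tag alongside the ciphertext for the sketch.
  return Buffer.concat([iv, ct, cipher.getAuthTag()]).toString("base64");
};

// Step 3: base64 ciphertext in, base64 plaintext out.
const decrypt = (ciphertextB64: string): string => {
  const blob = Buffer.from(ciphertextB64, "base64");
  const iv = blob.subarray(0, 12);
  const tag = blob.subarray(-16);
  const ct = blob.subarray(12, -16);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("base64");
};
```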
## Guide to Encrypting Data
In the following steps, we explore how to generate a key and use it to encrypt data.
<Tabs>
<Tab title="Infisical UI">
<Steps>
<Step title="Creating a KMS key">
Navigate to Project > Key Management and tap on the **Add Key** button.
![kms add key button](/images/platform/kms/infisical-kms/kms-add-key.png)
Specify your key details. Here's some guidance on each field:
- Name: A slug-friendly name for the key.
- Type: The encryption algorithm associated with the key (e.g. `AES-GCM-256`).
- Description: An optional description of what the intended usage is for the key.
![kms add key modal](/images/platform/kms/infisical-kms/kms-add-key-modal.png)
</Step>
<Step title="Encrypting data with the KMS key">
Once your key is generated, open the options menu for the newly created key and select encrypt data.
![kms key options](/images/platform/kms/infisical-kms/kms-key-options.png)
Populate the text area with your data and tap on the Encrypt button.
![kms encrypt data](/images/platform/kms/infisical-kms/kms-encrypt-data.png)
<Note>
If your data is already Base64 encoded, make sure to toggle the respective switch on to avoid
redundant encoding.
</Note>
Copy and store the encrypted data.
![kms encrypted data](/images/platform/kms/infisical-kms/kms-encrypted-data.png)
</Step>
</Steps>
</Tab>
<Tab title="API">
<Steps>
<Step title="Creating a KMS key">
To create a cryptographic key, make an API request to the [Create KMS
Key](/api-reference/endpoints/kms/keys/create) API endpoint.
### Sample request
```bash Request
curl --request POST \
--url https://app.infisical.com/api/v1/kms/keys \
--header 'Content-Type: application/json' \
--data '{
"projectId": "<project-id>",
"name": "my-secret-key",
"description": "...",
"encryptionAlgorithm": "aes-256-gcm"
}'
```
### Sample response
```bash Response
{
"key": {
"id": "<key-id>",
"description": "...",
"isDisabled": false,
"isReserved": false,
"orgId": "<org-id>",
"name": "my-secret-key",
"createdAt": "2023-11-07T05:31:56Z",
"updatedAt": "2023-11-07T05:31:56Z",
"projectId": "<project-id>"
}
}
```
</Step>
<Step title="Encrypting data with the KMS key">
To encrypt data, make an API request to the [Encrypt
Data](/api-reference/endpoints/kms/keys/encrypt) API endpoint,
specifying the key to use.
<Note>
Make sure your data is Base64 encoded before sending it.
</Note>
### Sample request
```bash Request
curl --request POST \
--url https://app.infisical.com/api/v1/kms/keys/<key-id>/encrypt \
--header 'Content-Type: application/json' \
--data '{
"plaintext": "lUFHM5Ggwo6TOfpuN1S==" // base64 encoded plaintext
}'
```
### Sample response
```bash Response
{
"ciphertext": "HwFHwSFHwlMF6TOfp==" // base64 encoded ciphertext
}
```
</Step>
</Steps>
</Tab>
</Tabs>
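Putting the API steps together, here is a sketch of an encrypt call from the shell. The `<key-id>` placeholder and the `INFISICAL_TOKEN` variable are assumptions for illustration; authenticate however your deployment requires.

```bash
# Base64-encode the plaintext and build the request body
PLAINTEXT_B64=$(printf 'hello world' | base64)
PAYLOAD=$(printf '{"plaintext": "%s"}' "$PLAINTEXT_B64")
echo "$PAYLOAD"
# → {"plaintext": "aGVsbG8gd29ybGQ="}

# POST it to the encrypt endpoint (assumes a valid token in INFISICAL_TOKEN):
# curl --request POST \
#   --url https://app.infisical.com/api/v1/kms/keys/<key-id>/encrypt \
#   --header "Authorization: Bearer $INFISICAL_TOKEN" \
#   --header 'Content-Type: application/json' \
#   --data "$PAYLOAD"
```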
## Guide to Decrypting Data
In the following steps, we explore how to decrypt data using an existing key in Infisical KMS.
<Tabs>
<Tab title="Infisical UI">
<Steps>
<Step title="Accessing your key">
Navigate to Project > Key Management and open the options menu for the key used to encrypt the data
you want to decrypt.
![kms key options](/images/platform/kms/infisical-kms/kms-decrypt-options.png)
</Step>
<Step title="Decrypting your data">
Paste your encrypted data into the text area and tap on the Decrypt button. Optionally, if your original data was
plain text, enable the Base64 decode switch to view the output in its original form.
![kms decrypt data](/images/platform/kms/infisical-kms/kms-decrypt-data.png)
Your decrypted data will be displayed and can be copied for use.
![kms decrypted data](/images/platform/kms/infisical-kms/kms-decrypted-data.png)
</Step>
</Steps>
</Tab>
<Tab title="API">
<Steps>
<Step title="Decrypting data">
To decrypt data, make an API request to the [Decrypt
Data](/api-reference/endpoints/kms/keys/decrypt) API endpoint,
specifying the key to use.
### Sample request
```bash Request
curl --request POST \
--url https://app.infisical.com/api/v1/kms/keys/<key-id>/decrypt \
--header 'Content-Type: application/json' \
--data '{
"ciphertext": "HwFHwSFHwlMF6TOfp==" // base64 encoded ciphertext
}'
```
### Sample response
```bash Response
{
"plaintext": "lUFHM5Ggwo6TOfpuN1S==" // base64 encoded plaintext
}
```
</Step>
</Steps>
</Tab>
</Tabs>
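To recover the original plaintext from a decrypt response, extract the `plaintext` field and Base64-decode it. The response string below is illustrative (a real one comes from the decrypt endpoint), and the `sed` extraction is just one way to pull out the field; `jq` works equally well.

```bash
# Illustrative decrypt response
RESPONSE='{"plaintext": "aGVsbG8gd29ybGQ="}'

# Extract the plaintext field and Base64-decode it
printf '%s' "$RESPONSE" \
  | sed 's/.*"plaintext": *"\([^"]*\)".*/\1/' \
  | base64 -d
# → hello world
```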
## FAQ
<AccordionGroup>
<Accordion title="Is my data stored in Infisical KMS?">
No. Infisical's KMS only provides cryptographic services and does not store
any encrypted or decrypted data.
</Accordion>
<Accordion title="Can key material be accessed outside of Infisical KMS?">
No. Infisical's KMS will never expose your keys, encrypted or decrypted, to
external sources.
</Accordion>
<Accordion title="What algorithms does Infisical KMS support?">
Currently, Infisical only supports `AES-128-GCM` and `AES-256-GCM` for
encryption operations. We anticipate supporting more algorithms and
cryptographic operations in the coming months.
</Accordion>
</AccordionGroup>

View File

@@ -21,12 +21,6 @@ Prerequisites:
![integrations circleci authorization](../../images/integrations/circleci/integrations-circleci-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which CircleCI project and press create integration to start syncing secrets to CircleCI.

View File

@@ -22,12 +22,6 @@ Prerequisites:
![integrations codefresh authorization](../../images/integrations/codefresh/integrations-codefresh-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Codefresh service and press create integration to start syncing secrets to Codefresh.

View File

@@ -27,10 +27,6 @@ Prerequisites:
![integrations github authorization](../../images/integrations/github/integrations-github-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant Infisical access to your project's environment variables.
Although this step breaks E2EE, it's necessary for Infisical to sync the environment variables to the cloud platform.
</Info>
</Step>
<Step title="Configure Infisical GitHub integration">
Select which Infisical environment secrets you want to sync to which GitHub organization, repository, or repository environment.

View File

@@ -20,12 +20,6 @@ description: "How to sync secrets from Infisical to GitLab"
![integrations gitlab authorization](../../images/integrations/gitlab/integrations-gitlab-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which GitLab repository and press create integration to start syncing secrets to GitLab.

View File

@@ -21,13 +21,6 @@ Prerequisites:
![integrations rundeck authorization](../../images/integrations/rundeck/integrations-rundeck-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to a Rundeck Key Storage Path and press create integration to start syncing secrets to Rundeck.

View File

@@ -21,12 +21,6 @@ Prerequisites:
![integrations travis ci authorization](../../images/integrations/travis-ci/integrations-travisci-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Travis CI repository and press create integration to start syncing secrets to Travis CI.

View File

@@ -60,13 +60,6 @@ Prerequisites:
![integration auth](../../images/integrations/aws/integrations-aws-parameter-store-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which AWS Parameter Store region and indicate the path for your secrets. Then, press create integration to start syncing secrets to AWS Parameter Store.

View File

@@ -22,12 +22,6 @@ Prerequisites:
![integrations checkly authorization](../../images/integrations/checkly/integrations-checkly-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to Checkly and press create integration to start syncing secrets.

View File

@@ -31,13 +31,6 @@ Copy and save your token.
Click on the Cloud 66 tile and enter your API token to grant Infisical access to your Cloud 66 account.
![integrations cloud 66 tile in infisical dashboard](../../images/integrations/cloud-66/integrations-cloud-66-infisical-dashboard.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
Enter your Cloud 66 Personal Access Token here. Then click "Connect to Cloud 66".
![integrations cloud 66 tile in infisical dashboard](../../images/integrations/cloud-66/integrations-cloud-66-paste-pat.png)

View File

@@ -29,12 +29,6 @@ Prerequisites:
![integrations cloudflare authorization](../../images/integrations/cloudflare/integrations-cloudflare-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to Cloudflare and press create integration to start syncing secrets.

View File

@@ -29,13 +29,6 @@ Prerequisites:
![integrations cloudflare authorization](../../images/integrations/cloudflare/integration-cloudflare-workers-connect.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to Cloudflare Workers and press create integration to start syncing secrets.

View File

@@ -0,0 +1,31 @@
---
title: "Databricks"
description: "Learn how to sync secrets from Infisical to Databricks."
---
Prerequisites:
- Set up and add secrets to [Infisical Cloud](https://app.infisical.com)
<Steps>
<Step title="Authorize Infisical for Databricks">
Obtain a Personal Access Token in **User Settings** > **Developer** > **Access Tokens**.
![integrations databricks token](../../images/integrations/databricks/pat-token.png)
Navigate to your project's integrations tab in Infisical.
![integrations](../../images/integrations.png)
Press on the Databricks tile and enter your Databricks instance URL in the following format: `https://xxx.cloud.databricks.com`. Then, input your Databricks Access Token to grant Infisical the necessary permissions in your Databricks account.
![integrations databricks authorization](../../images/integrations/databricks/integrations-databricks-auth.png)
</Step>
<Step title="Start integration">
Select which Infisical environment and secret path you want to sync to which Databricks scope. Then, press create integration to start syncing secrets to Databricks.
![create integration Databricks](../../images/integrations/databricks/integrations-databricks-create.png)
![integrations Databricks](../../images/integrations/databricks/integrations-databricks.png)
</Step>
</Steps>

View File

@@ -20,13 +20,6 @@ Name it **infisical**, choose **No expiry**, and make sure to check **Write (opt
Click on the **Digital Ocean App Platform** tile and enter your API token to grant Infisical access to your Digital Ocean account.
![integrations](../../images/integrations.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
Then enter your Digital Ocean Personal Access Token here. Then click "Connect to Digital Ocean App Platform".
![integrations infisical dashboard digital ocean integration](../../images/integrations/digital-ocean/integrations-do-enter-token.png)

View File

@@ -22,12 +22,6 @@ Prerequisites:
![integrations fly authorization](../../images/integrations/flyio/integrations-flyio-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Fly.io app and press create integration to start syncing secrets to Fly.io.

View File

@@ -24,12 +24,6 @@ description: "How to sync secrets from Infisical to GCP Secret Manager"
![integrations GCP authorization](../../images/integrations/gcp-secret-manager/integrations-gcp-secret-manager-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
In the **Connection** tab, select which Infisical environment secrets you want to sync to which GCP secret manager project. Lastly, press create integration to start syncing secrets to GCP secret manager.
@@ -85,12 +79,6 @@ description: "How to sync secrets from Infisical to GCP Secret Manager"
![integrations GCP authorization options](../../images/integrations/gcp-secret-manager/integrations-gcp-secret-manager-auth-options.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
In the **Connection** tab, select which Infisical environment secrets you want to sync to the GCP secret manager project. Lastly, press create integration to start syncing secrets to GCP secret manager.

View File

@@ -21,12 +21,6 @@ Prerequisites:
![integrations hasura cloud authorization](../../images/integrations/hasura-cloud/integrations-hasura-cloud-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Hasura Cloud project and press create integration to start syncing secrets to Hasura Cloud.

View File

@@ -19,12 +19,6 @@ description: "How to sync secrets from Infisical to Heroku"
![integrations heroku authorization](../../images/integrations/heroku/integrations-heroku-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Heroku app and press create integration to start syncing secrets to Heroku.

View File

@@ -27,12 +27,6 @@ Prerequisites:
![integrations laravel forge authorization](../../images/integrations/laravel-forge/integrations-laravelforge-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Laravel Forge site and press create integration to start syncing secrets to Laravel Forge.

View File

@@ -25,12 +25,6 @@ description: "How to sync secrets from Infisical to Netlify"
![integrations netlify authorization](../../images/integrations/netlify/integrations-netlify-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Netlify app and context. Lastly, press create integration to start syncing secrets to Netlify.

View File

@@ -23,12 +23,6 @@ Prerequisites:
![integrations northflank authorization](../../images/integrations/northflank/integrations-northflank-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Northflank project and secret group. Finally, press create integration to start syncing secrets to Northflank.

View File

@@ -21,12 +21,6 @@ Prerequisites:
![integrations qovery authorization](../../images/integrations/qovery/integrations-qovery-auth.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it is necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to Qovery and press create integration to start syncing secrets.

View File

@@ -30,12 +30,6 @@ Prerequisites:
![integrations railway authorization](../../images/integrations/railway/integrations-railway-authorization.png)
<Info>
If this is your project's first cloud integration, then you'll have to grant
Infisical access to your project's environment variables. Although this step
breaks E2EE, it's necessary for Infisical to sync the environment variables to
the cloud platform.
</Info>
</Step>
<Step title="Start integration">
Select which Infisical environment secrets you want to sync to which Railway project and environment (and optionally service). Lastly, press create integration to start syncing secrets to Railway.

Some files were not shown because too many files have changed in this diff.