Compare commits

...

73 Commits

Author SHA1 Message Date
a0ea2627ed change hash to etag 2024-03-01 02:11:50 -05:00
5c40b538af remove ExecuteCommandWithTimeout 2024-03-01 02:11:27 -05:00
8dd94a4e10 move ExecuteCommandWithTimeout to agent file 2024-03-01 02:11:03 -05:00
041c4a20a0 example config 2024-03-01 02:10:26 -05:00
4a2a5f42a8 Renamed exec to execute, and cleanup 🧼 2024-03-01 07:26:31 +01:00
9fcdf17a04 Update agent.go 2024-03-01 07:17:27 +01:00
97ac8cb45a Update agent.go 2024-03-01 07:02:26 +01:00
e952659415 Update agent.go 2024-03-01 07:02:04 +01:00
1f3f061a06 Fix: Agent output 2024-03-01 06:46:09 +01:00
5096ce3bdc Feat: Agent improvements 2024-03-01 06:41:17 +01:00
fb8c4bd415 Feat: Agent improvements 2024-02-29 07:12:30 +01:00
48bf41ac8c Update cli.go 2024-02-29 07:12:18 +01:00
1ad916a784 Feat: Agent improvements, Secrets state manager 2024-02-29 07:12:10 +01:00
c91456838e Update model.go 2024-02-29 07:12:01 +01:00
79efe64504 Feat: Agent improvements, get ETag from secrets request 2024-02-29 07:11:56 +01:00
40cb5c4394 Merge pull request #1326 from quinton11/feat/cli-export-with-tag-slugs
feat: cli export allow filtering with tags
2024-02-28 18:32:22 -05:00
60b73879df Update postgres.mdx 2024-02-28 14:40:04 -08:00
4339ef4737 Merge pull request #1485 from Infisical/snyk-upgrade-14579311c8ea1dfb5d579851318fccc5
[Snyk] Upgrade posthog-js from 1.104.4 to 1.105.4
2024-02-28 16:59:28 -05:00
d98669700d Merge pull request #1487 from nhedger/docs/docker
docs: improve docker page
2024-02-28 16:59:18 -05:00
162f339149 Merge pull request #1489 from Infisical/snyk-upgrade-978758f53696a9f0d6b71883b9614b0a
[Snyk] Upgrade aws-sdk from 2.1549.0 to 2.1553.0
2024-02-28 16:48:35 -05:00
d3eb0c4cc9 Merge pull request #1488 from Kiskadee-dev/patch-1
fix postgresql volume path on docker-compose.prod.yml
2024-02-28 16:35:15 -05:00
4b4295f53d fix: upgrade aws-sdk from 2.1549.0 to 2.1553.0
Snyk has created this PR to upgrade aws-sdk from 2.1549.0 to 2.1553.0.

See this package in npm:
https://www.npmjs.com/package/aws-sdk

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-28 21:11:42 +00:00
6c4d193b12 Update docker-compose.prod.yml
/data/db doesn't seem to exist, so data would never persist otherwise
2024-02-28 16:27:16 -03:00
d08d412f54 improve more 2024-02-28 20:21:43 +01:00
bb4810470f docs: rewording 2024-02-28 20:20:39 +01:00
24e9c0a39f Merge pull request #1486 from nhedger/docs/secrets
docs: fix typo
2024-02-28 14:07:00 -05:00
3161d0ee67 docs: fix typo 2024-02-28 20:01:57 +01:00
8a7e18dc7c fix: upgrade posthog-js from 1.104.4 to 1.105.4
Snyk has created this PR to upgrade posthog-js from 1.104.4 to 1.105.4.

See this package in npm:
https://www.npmjs.com/package/posthog-js

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/53d4ecb6-6cc1-4918-aa73-bf9cae4ffd13?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-28 18:49:06 +00:00
0497c3b49e Merge pull request #1484 from akhilmhdh/feat/i18n-removal
feat(ui): secret blind index banner in secret main page and removed i18n to keep only english for now
2024-02-28 11:09:35 -05:00
e6a89fb9d0 feat(ui): secret blind index banner in secret main page and removed i18n translations for now, keeping en as the only option 2024-02-28 14:47:50 +05:30
d9828db2ec update gamma helm values 2024-02-27 18:51:36 -05:00
f11efc9e3f Merge pull request #1461 from Infisical/snyk-upgrade-430437d73f24d5cfbeaa8f5f8f1fa7dc
[Snyk] Upgrade posthog-js from 1.103.0 to 1.104.4
2024-02-27 18:43:09 -05:00
32bad10c0e Merge branch 'main' into snyk-upgrade-430437d73f24d5cfbeaa8f5f8f1fa7dc 2024-02-27 18:43:03 -05:00
41064920f7 Merge pull request #1465 from Infisical/snyk-upgrade-29c4ba3e253755510159e658916d2c3f
[Snyk] Upgrade @fastify/cookie from 9.2.0 to 9.3.1
2024-02-27 18:41:33 -05:00
8d8e23add2 Merge pull request #1471 from akhilmhdh/feat/telemetry-aggregation
Telemetry stats event for self hosted instance on midnight
2024-02-27 18:36:29 -05:00
a2a959cc32 disable telemetry for local dev by default 2024-02-27 18:26:15 -05:00
d6cde48181 set posthog flush to zero and fix typos 2024-02-27 18:23:24 -05:00
23966c12e2 Merge pull request #1482 from Infisical/daniel/fix-invite-all-members
Fix: Invite all members to project when there are no members to invite
2024-02-27 17:38:52 -05:00
2a233ea43c Fix: Inviting all members when there's only 1 user in the organization 2024-02-27 23:15:40 +01:00
fe497d87c0 add INFISICAL_CLOUD env back from old backend 2024-02-27 16:38:18 -05:00
0c3060e1c6 Merge pull request #1477 from Infisical/daniel/upgrade-transparency
Chore: Project upgrade notice
2024-02-27 13:47:36 -05:00
5d64398e58 add more clarity to e2ee notice 2024-02-27 13:45:31 -05:00
2f6f713c98 Better phrasing 2024-02-27 19:19:17 +01:00
4f47d43801 Merge pull request #1479 from Infisical/snyk-fix-46ae40ea09c96f3a158e662824e76ed8
[Snyk] Security upgrade bullmq from 5.1.6 to 5.3.3
2024-02-27 12:17:35 -05:00
6cf9a83c16 fix: backend/package.json & backend/package-lock.json to reduce vulnerabilities
The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-JS-INFLIGHT-6095116
2024-02-27 08:51:09 +00:00
c3adc8b188 Update overview.mdx 2024-02-26 22:31:07 -08:00
a723c456aa Update Chart.yaml 2024-02-27 01:26:47 -05:00
c455ef7ced Update values.yaml 2024-02-27 01:26:28 -05:00
f9d0680dc3 Update Chart.yaml 2024-02-27 01:24:06 -05:00
7a4e8b8c32 Update values.yaml 2024-02-27 01:23:35 -05:00
8e83b0f2dd npm install backend 2024-02-27 01:13:00 -05:00
59c6837071 Update faq.mdx 2024-02-27 00:48:32 -05:00
d4d23e06a8 Merge pull request #1478 from Infisical/mongo-to-postgres-guide
Mongo to postgres guide
2024-02-27 00:43:58 -05:00
5d71b02f8d Fix: Add learn more to both alerts 2024-02-27 06:01:22 +01:00
9d2a0f1d54 Chore: Add notice link 2024-02-27 05:55:11 +01:00
0f4da61aaa Docs: Upgrade notice 2024-02-27 05:54:56 +01:00
26abb7d89f Merge pull request #1476 from Infisical/ldap-docs
Update docs for LDAP
2024-02-26 20:48:21 -08:00
892a25edfe Update docs for LDAP 2024-02-26 20:47:20 -08:00
082a533cfa Update Chart.yaml 2024-02-26 17:19:48 -05:00
d71a8a35e5 increase resource limits more 2024-02-26 17:19:38 -05:00
59585dfea9 Merge pull request #1474 from Infisical/daniel/failed-decryption-log
Fix: Add detailed decryption error logging
2024-02-26 16:49:52 -05:00
514304eed0 Fix: Add detailed decryption error logging 2024-02-26 22:19:54 +01:00
a0fc9e534c Update Chart.yaml 2024-02-26 16:10:02 -05:00
73323c0343 update resource limits 2024-02-26 16:09:21 -05:00
98cd71d421 Merge pull request #1473 from Infisical/ldap-docs
Add docs for LDAP
2024-02-26 10:51:48 -08:00
ae6157dd78 Add docs for LDAP 2024-02-26 10:49:30 -08:00
3b9ceff21c refactor(server): updated all telemetry send events to await as changed to async 2024-02-26 19:52:38 +05:30
d64d935d7d feat(server): added telemetry queue for self hosted to upload instance stats to posthog on midnight 2024-02-26 19:52:38 +05:30
8aaed739d5 feat(server): resolved a possible race condition on replication during a fresh first boot, and made values with defaults optional on create rows in the generated schema 2024-02-26 19:52:38 +05:30
7d8b399102 feat(server): added keystore and made server cfg fetch from keystore to avoid db calls 2024-02-26 19:52:38 +05:30
1594165768 fix: upgrade @fastify/cookie from 9.2.0 to 9.3.1
Snyk has created this PR to upgrade @fastify/cookie from 9.2.0 to 9.3.1.

See this package in npm:
https://www.npmjs.com/package/@fastify/cookie

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-25 03:14:18 +00:00
29d91d83ab fix: upgrade posthog-js from 1.103.0 to 1.104.4
Snyk has created this PR to upgrade posthog-js from 1.103.0 to 1.104.4.

See this package in npm:
https://www.npmjs.com/package/posthog-js

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/53d4ecb6-6cc1-4918-aa73-bf9cae4ffd13?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-24 04:51:43 +00:00
4057e2c6ab feat: cli export allow filtering with tags 2024-01-24 19:05:16 +00:00
121 changed files with 1066 additions and 365 deletions

View File

@@ -19,10 +19,6 @@ POSTGRES_DB=infisical
 # Redis
 REDIS_URL=redis://redis:6379
 
-# Optional credentials for MongoDB container instance and Mongo-Express
-MONGO_USERNAME=root
-MONGO_PASSWORD=example
-
 # Website URL
 # Required
 SITE_URL=http://localhost:8080

.github/values.yaml vendored (13 changed lines)
View File

@@ -13,11 +13,10 @@ fullnameOverride: ""
 ##
 infisical:
-  ## @param backend.enabled Enable backend
-  ##
+  autoDatabaseSchemaMigration: false
   enabled: false
-  ## @param backend.name Backend name
-  ##
   name: infisical
   replicaCount: 3
   image:
@@ -50,3 +49,9 @@ ingress:
     - secretName: letsencrypt-prod
       hosts:
         - gamma.infisical.com
+
+postgresql:
+  enabled: false
+
+redis:
+  enabled: false

View File

@@ -0,0 +1,30 @@
+import { TKeyStoreFactory } from "@app/keystore/keystore";
+
+export const mockKeyStore = (): TKeyStoreFactory => {
+  const store: Record<string, string | number | Buffer> = {};
+
+  return {
+    setItem: async (key, value) => {
+      store[key] = value;
+      return "OK";
+    },
+    setItemWithExpiry: async (key, value) => {
+      store[key] = value;
+      return "OK";
+    },
+    deleteItem: async (key) => {
+      delete store[key];
+      return 1;
+    },
+    getItem: async (key) => {
+      const value = store[key];
+      if (typeof value === "string") {
+        return value;
+      }
+      return null;
+    },
+    incrementBy: async () => {
+      return 1;
+    }
+  };
+};
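
Commit 7d8b399102 notes that server config is now read through this keystore layer instead of hitting the database on every request. A hedged cache-aside sketch against the TKeyStoreFactory shape mocked above; the key name and the loadFromDb callback are illustrative placeholders, not code from this PR:

```ts
import { TKeyStoreFactory } from "@app/keystore/keystore";

// Placeholder cache key; the real key used by the server is not shown in this diff.
const SERVER_CFG_KEY = "server-cfg";

// Read-through helper: try the keystore first, fall back to the DB loader, then cache.
export const getServerCfgCached = async (
  keyStore: TKeyStoreFactory,
  loadFromDb: () => Promise<Record<string, unknown>>
) => {
  const cached = await keyStore.getItem(SERVER_CFG_KEY);
  if (cached) return JSON.parse(cached) as Record<string, unknown>;

  const cfg = await loadFromDb();
  await keyStore.setItem(SERVER_CFG_KEY, JSON.stringify(cfg));
  return cfg;
};
```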

View File

@@ -14,6 +14,7 @@ import { AuthTokenType } from "@app/services/auth/auth-type";
 import { mockQueue } from "./mocks/queue";
 import { mockSmtpServer } from "./mocks/smtp";
+import { mockKeyStore } from "./mocks/keystore";
 
 dotenv.config({ path: path.join(__dirname, "../../.env.test"), debug: true });
 
 export default {
@@ -41,7 +42,8 @@
       await db.seed.run();
       const smtp = mockSmtpServer();
       const queue = mockQueue();
-      const server = await main({ db, smtp, logger, queue });
+      const keyStore = mockKeyStore();
+      const server = await main({ db, smtp, logger, queue, keyStore });
       // @ts-expect-error type
       globalThis.testServer = server;
       // @ts-expect-error type

View File

@@ -11,7 +11,7 @@
     "dependencies": {
      "@aws-sdk/client-secrets-manager": "^3.504.0",
      "@casl/ability": "^6.5.0",
-     "@fastify/cookie": "^9.2.0",
+     "@fastify/cookie": "^9.3.1",
      "@fastify/cors": "^8.5.0",
      "@fastify/etag": "^5.1.0",
      "@fastify/formbody": "^7.4.0",
@@ -29,11 +29,11 @@
      "@ucast/mongo2js": "^1.3.4",
      "ajv": "^8.12.0",
      "argon2": "^0.31.2",
-     "aws-sdk": "^2.1549.0",
+     "aws-sdk": "^2.1553.0",
      "axios": "^1.6.7",
      "axios-retry": "^4.0.0",
      "bcrypt": "^5.1.1",
-     "bullmq": "^5.1.6",
+     "bullmq": "^5.3.3",
      "dotenv": "^16.4.1",
      "fastify": "^4.26.0",
      "fastify-plugin": "^4.5.1",
@@ -1687,9 +1687,9 @@
      }
    },
    "node_modules/@fastify/cookie": {
-     "version": "9.2.0",
-     "resolved": "https://registry.npmjs.org/@fastify/cookie/-/cookie-9.2.0.tgz",
-     "integrity": "sha512-fkg1yjjQRHPFAxSHeLC8CqYuNzvR6Lwlj/KjrzQcGjNBK+K82nW+UfCjfN71g1GkoVoc1GTOgIWkFJpcMfMkHQ==",
+     "version": "9.3.1",
+     "resolved": "https://registry.npmjs.org/@fastify/cookie/-/cookie-9.3.1.tgz",
+     "integrity": "sha512-h1NAEhB266+ZbZ0e9qUE6NnNR07i7DnNXWG9VbbZ8uC6O/hxHpl+Zoe5sw1yfdZ2U6XhToUGDnzQtWJdCaPwfg==",
      "dependencies": {
        "cookie-signature": "^1.1.0",
        "fastify-plugin": "^4.0.0"
@@ -2193,7 +2193,6 @@
      "version": "2.1.5",
      "resolved": "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz",
      "integrity": "sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==",
-     "dev": true,
      "dependencies": {
        "@nodelib/fs.stat": "2.0.5",
        "run-parallel": "^1.1.9"
@@ -2206,7 +2205,6 @@
      "version": "2.0.5",
      "resolved": "https://registry.npmjs.org/@nodelib/fs.stat/-/fs.stat-2.0.5.tgz",
      "integrity": "sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==",
-     "dev": true,
      "engines": {
        "node": ">= 8"
      }
@@ -2215,7 +2213,6 @@
      "version": "1.2.8",
      "resolved": "https://registry.npmjs.org/@nodelib/fs.walk/-/fs.walk-1.2.8.tgz",
      "integrity": "sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==",
-     "dev": true,
      "dependencies": {
        "@nodelib/fs.scandir": "2.1.5",
        "fastq": "^1.6.0"
@@ -5189,9 +5186,9 @@
      "integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
    },
    "node_modules/aws-sdk": {
-     "version": "2.1549.0",
-     "resolved": "https://registry.npmjs.org/aws-sdk/-/aws-sdk-2.1549.0.tgz",
-     "integrity": "sha512-SoVfrrV3A2mxH+NV2tA0eMtG301glhewvhL3Ob4107qLWjvwjy/CoWLclMLmfXniTGxbI8tsgN0r5mLZUKey3Q==",
+     "version": "2.1553.0",
+     "resolved": "https://registry.npmjs.org/aws-sdk/-/aws-sdk-2.1553.0.tgz",
+     "integrity": "sha512-CfZaw8dR9e642aBOeFhkFL7KoQApeLR15uH2IQqfL/12snWYayAAesYh0tEaU+XbhrH0CUsf2Zro5IraEXEZMg==",
      "dependencies": {
        "buffer": "4.9.2",
        "events": "1.1.1",
@@ -5442,7 +5439,6 @@
      "version": "3.0.2",
      "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
      "integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
-     "dev": true,
      "dependencies": {
        "fill-range": "^7.0.1"
      },
@@ -5492,14 +5488,15 @@
      }
    },
    "node_modules/bullmq": {
-     "version": "5.1.6",
-     "resolved": "https://registry.npmjs.org/bullmq/-/bullmq-5.1.6.tgz",
-     "integrity": "sha512-VkLfig+xm4U3hc4QChzuuAy0NGQ9dfPB8o54hmcZHCX9ofp0Zn6bEY+W3Ytkk76eYwPAgXfywDBlAb2Unjl1Rg==",
+     "version": "5.3.3",
+     "resolved": "https://registry.npmjs.org/bullmq/-/bullmq-5.3.3.tgz",
+     "integrity": "sha512-Gc/68HxiCHLMPBiGIqtINxcf8HER/5wvBYMY/6x3tFejlvldUBFaAErMTLDv4TnPsTyzNPrfBKmFCEM58uVnJg==",
      "dependencies": {
        "cron-parser": "^4.6.0",
-       "glob": "^8.0.3",
+       "fast-glob": "^3.3.2",
        "ioredis": "^5.3.2",
        "lodash": "^4.17.21",
+       "minimatch": "^9.0.3",
        "msgpackr": "^1.10.1",
        "node-abort-controller": "^3.1.1",
        "semver": "^7.5.4",
@@ -5507,6 +5504,28 @@
        "uuid": "^9.0.0"
      }
    },
+   "node_modules/bullmq/node_modules/brace-expansion": {
+     "version": "2.0.1",
+     "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.1.tgz",
+     "integrity": "sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==",
+     "dependencies": {
+       "balanced-match": "^1.0.0"
+     }
+   },
+   "node_modules/bullmq/node_modules/minimatch": {
+     "version": "9.0.3",
+     "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.3.tgz",
+     "integrity": "sha512-RHiac9mvaRw0x3AYRgDC1CxAP7HTcNrrECeA8YYJeWnpo+2Q5CegtZjaotWTWxDG3UeGA1coE05iH1mPjT/2mg==",
+     "dependencies": {
+       "brace-expansion": "^2.0.1"
+     },
+     "engines": {
+       "node": ">=16 || 14 >=14.17"
+     },
+     "funding": {
+       "url": "https://github.com/sponsors/isaacs"
+     }
+   },
    "node_modules/bundle-require": {
      "version": "4.0.2",
      "resolved": "https://registry.npmjs.org/bundle-require/-/bundle-require-4.0.2.tgz",
@@ -6906,7 +6925,6 @@
      "version": "3.3.2",
      "resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.3.2.tgz",
      "integrity": "sha512-oX2ruAFQwf/Orj8m737Y5adxDQO0LAB7/S5MnxCdTNDd4p6BsyIVsv9JQsATbTSq8KHRpLwIHbVlUNatxd+1Ow==",
-     "dev": true,
      "dependencies": {
        "@nodelib/fs.stat": "^2.0.2",
        "@nodelib/fs.walk": "^1.2.3",
@@ -7058,7 +7076,6 @@
      "version": "7.0.1",
      "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
      "integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
-     "dev": true,
      "dependencies": {
        "to-regex-range": "^5.0.1"
      },
@@ -7510,7 +7527,6 @@
      "version": "5.1.2",
      "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz",
      "integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==",
-     "dev": true,
      "dependencies": {
        "is-glob": "^4.0.1"
      },
@@ -8111,7 +8127,6 @@
      "version": "2.1.1",
      "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
      "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==",
-     "dev": true,
      "engines": {
        "node": ">=0.10.0"
      }
@@ -8142,7 +8157,6 @@
      "version": "4.0.3",
      "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz",
      "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==",
-     "dev": true,
      "dependencies": {
        "is-extglob": "^2.1.1"
      },
@@ -8177,7 +8191,6 @@
      "version": "7.0.0",
      "resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz",
      "integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==",
-     "dev": true,
      "engines": {
        "node": ">=0.12.0"
      }
@@ -8934,7 +8947,6 @@
      "version": "1.4.1",
      "resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz",
      "integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==",
-     "dev": true,
      "engines": {
        "node": ">= 8"
      }
@@ -8951,7 +8963,6 @@
      "version": "4.0.5",
      "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.5.tgz",
      "integrity": "sha512-DMy+ERcEW2q8Z2Po+WNXuw3c5YaUSFjAO5GsJqfEl7UjvtIuFKO6ZrKvcItdy98dwFI2N1tg3zNIdKaQT+aNdA==",
-     "dev": true,
      "dependencies": {
        "braces": "^3.0.2",
        "picomatch": "^2.3.1"
@@ -8964,7 +8975,6 @@
      "version": "2.3.1",
      "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
      "integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
-     "dev": true,
      "engines": {
        "node": ">=8.6"
      },
@@ -10557,7 +10567,6 @@
      "version": "1.2.3",
      "resolved": "https://registry.npmjs.org/queue-microtask/-/queue-microtask-1.2.3.tgz",
      "integrity": "sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A==",
-     "dev": true,
      "funding": [
        {
          "type": "github",
@@ -10904,7 +10913,6 @@
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/run-parallel/-/run-parallel-1.2.0.tgz",
      "integrity": "sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA==",
-     "dev": true,
      "funding": [
        {
          "type": "github",
@@ -11705,7 +11713,6 @@
      "version": "5.0.1",
      "resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
      "integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==",
-     "dev": true,
      "dependencies": {
        "is-number": "^7.0.0"
      },

View File

@@ -72,7 +72,7 @@
   "dependencies": {
     "@aws-sdk/client-secrets-manager": "^3.504.0",
     "@casl/ability": "^6.5.0",
-    "@fastify/cookie": "^9.2.0",
+    "@fastify/cookie": "^9.3.1",
     "@fastify/cors": "^8.5.0",
     "@fastify/etag": "^5.1.0",
     "@fastify/formbody": "^7.4.0",
@@ -90,11 +90,11 @@
     "@ucast/mongo2js": "^1.3.4",
     "ajv": "^8.12.0",
     "argon2": "^0.31.2",
-    "aws-sdk": "^2.1549.0",
+    "aws-sdk": "^2.1553.0",
    "axios": "^1.6.7",
     "axios-retry": "^4.0.0",
     "bcrypt": "^5.1.1",
-    "bullmq": "^5.1.6",
+    "bullmq": "^5.3.3",
     "dotenv": "^16.4.1",
     "fastify": "^4.26.0",
     "fastify-plugin": "^4.5.1",

View File

@@ -44,7 +44,7 @@ const getZodDefaultValue = (type: unknown, value: string | number | boolean | Ob
   if (!value || value === "null") return;
   switch (type) {
     case "uuid":
-      return;
+      return `.default("00000000-0000-0000-0000-000000000000")`;
     case "character varying": {
       if (value === "gen_random_uuid()") return;
       if (typeof value === "string" && value.includes("::")) {
@@ -100,7 +100,8 @@ const main = async () => {
      const columnName = columnNames[colNum];
      const colInfo = columns[columnName];
      let ztype = getZodPrimitiveType(colInfo.type);
-     if (colInfo.defaultValue) {
+     // don't put optional on id
+     if (colInfo.defaultValue && columnName !== "id") {
        const { defaultValue } = colInfo;
        const zSchema = getZodDefaultValue(colInfo.type, defaultValue);
        if (zSchema) {
@@ -120,6 +121,7 @@ const main = async () => {
        .split("_")
        .reduce((prev, curr) => prev + `${curr.at(0)?.toUpperCase()}${curr.slice(1).toLowerCase()}`, "");
 
+     // the insert and update are changed to zod input type to use default cases
      writeFileSync(
        path.join(__dirname, "../src/db/schemas", `${dashcase}.ts`),
        `// Code generated by automation script, DO NOT EDIT.
@@ -134,8 +136,8 @@ import { TImmutableDBKeys } from "./models";
 export const ${pascalCase}Schema = z.object({${schema}});
 
 export type T${pascalCase} = z.infer<typeof ${pascalCase}Schema>;
-export type T${pascalCase}Insert = Omit<T${pascalCase}, TImmutableDBKeys>;
-export type T${pascalCase}Update = Partial<Omit<T${pascalCase}, TImmutableDBKeys>>;
+export type T${pascalCase}Insert = Omit<z.input<typeof ${pascalCase}Schema>, TImmutableDBKeys>;
+export type T${pascalCase}Update = Partial<Omit<z.input<typeof ${pascalCase}Schema>, TImmutableDBKeys>>;
 `
 );
 }
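
The switch from z.infer to z.input in the generated Insert and Update types is what lets callers omit columns that carry a default. A minimal sketch of the distinction, using a hypothetical DemoSchema rather than one of the generated schemas:

```ts
import { z } from "zod";

// Hypothetical schema standing in for a generated one; "version" carries a default.
export const DemoSchema = z.object({
  name: z.string(),
  version: z.number().default(1)
});

// Output side (z.infer): defaulted fields are always present after parsing.
export type TDemo = z.infer<typeof DemoSchema>; // { name: string; version: number }

// Input side (z.input): defaulted fields may be omitted by the caller.
export type TDemoInsert = z.input<typeof DemoSchema>; // { name: string; version?: number }

// An insert payload can therefore leave "version" out and let the default apply.
export const payload: TDemoInsert = { name: "example" };
```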

View File

@@ -1,6 +0,0 @@
-import Redis from "ioredis";
-
-export const initRedisConnection = (redisUrl: string) => {
-  const redis = new Redis(redisUrl);
-  return redis;
-};

View File

@@ -0,0 +1,21 @@
+import { Knex } from "knex";
+
+import { TableName } from "../schemas";
+
+const ADMIN_CONFIG_UUID = "00000000-0000-0000-0000-000000000000";
+
+export async function up(knex: Knex): Promise<void> {
+  await knex.schema.alterTable(TableName.SuperAdmin, (t) => {
+    t.uuid("instanceId").notNullable().defaultTo(knex.fn.uuid());
+  });
+  // this is updated to avoid race condition on replication
+  // eslint-disable-next-line
+  // @ts-ignore
+  await knex(TableName.SuperAdmin).update({ id: ADMIN_CONFIG_UUID }).whereNotNull("id").limit(1);
+}
+
+export async function down(knex: Knex): Promise<void> {
+  await knex.schema.alterTable(TableName.SuperAdmin, (t) => {
+    t.dropColumn("instanceId");
+  });
+}
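
Pinning the admin config row to a fixed UUID lets every replica address the same singleton row without first querying for a generated id, which is how the replication race mentioned in the commit is avoided. A sketch of the resulting access pattern; the helper and the literal table name "super_admin" are illustrative assumptions (the migration above refers to it as TableName.SuperAdmin):

```ts
import { Knex } from "knex";

const ADMIN_CONFIG_UUID = "00000000-0000-0000-0000-000000000000";

// Idempotent upsert plus read of the singleton config row; safe to run from any replica.
export const ensureAdminConfigRow = async (db: Knex) => {
  await db("super_admin").insert({ id: ADMIN_CONFIG_UUID }).onConflict("id").ignore();
  return db("super_admin").where({ id: ADMIN_CONFIG_UUID }).first();
};
```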

View File

@@ -19,5 +19,5 @@ export const ApiKeysSchema = z.object({
 });
 
 export type TApiKeys = z.infer<typeof ApiKeysSchema>;
-export type TApiKeysInsert = Omit<TApiKeys, TImmutableDBKeys>;
-export type TApiKeysUpdate = Partial<Omit<TApiKeys, TImmutableDBKeys>>;
+export type TApiKeysInsert = Omit<z.input<typeof ApiKeysSchema>, TImmutableDBKeys>;
+export type TApiKeysUpdate = Partial<Omit<z.input<typeof ApiKeysSchema>, TImmutableDBKeys>>;

View File

@@ -24,5 +24,5 @@ export const AuditLogsSchema = z.object({
 });
 
 export type TAuditLogs = z.infer<typeof AuditLogsSchema>;
-export type TAuditLogsInsert = Omit<TAuditLogs, TImmutableDBKeys>;
-export type TAuditLogsUpdate = Partial<Omit<TAuditLogs, TImmutableDBKeys>>;
+export type TAuditLogsInsert = Omit<z.input<typeof AuditLogsSchema>, TImmutableDBKeys>;
+export type TAuditLogsUpdate = Partial<Omit<z.input<typeof AuditLogsSchema>, TImmutableDBKeys>>;

View File

@@ -20,5 +20,5 @@ export const AuthTokenSessionsSchema = z.object({
 });
 
 export type TAuthTokenSessions = z.infer<typeof AuthTokenSessionsSchema>;
-export type TAuthTokenSessionsInsert = Omit<TAuthTokenSessions, TImmutableDBKeys>;
-export type TAuthTokenSessionsUpdate = Partial<Omit<TAuthTokenSessions, TImmutableDBKeys>>;
+export type TAuthTokenSessionsInsert = Omit<z.input<typeof AuthTokenSessionsSchema>, TImmutableDBKeys>;
+export type TAuthTokenSessionsUpdate = Partial<Omit<z.input<typeof AuthTokenSessionsSchema>, TImmutableDBKeys>>;

View File

@@ -21,5 +21,5 @@ export const AuthTokensSchema = z.object({
 });
 
 export type TAuthTokens = z.infer<typeof AuthTokensSchema>;
-export type TAuthTokensInsert = Omit<TAuthTokens, TImmutableDBKeys>;
-export type TAuthTokensUpdate = Partial<Omit<TAuthTokens, TImmutableDBKeys>>;
+export type TAuthTokensInsert = Omit<z.input<typeof AuthTokensSchema>, TImmutableDBKeys>;
+export type TAuthTokensUpdate = Partial<Omit<z.input<typeof AuthTokensSchema>, TImmutableDBKeys>>;

View File

@@ -22,5 +22,5 @@ export const BackupPrivateKeySchema = z.object({
 });
 
 export type TBackupPrivateKey = z.infer<typeof BackupPrivateKeySchema>;
-export type TBackupPrivateKeyInsert = Omit<TBackupPrivateKey, TImmutableDBKeys>;
-export type TBackupPrivateKeyUpdate = Partial<Omit<TBackupPrivateKey, TImmutableDBKeys>>;
+export type TBackupPrivateKeyInsert = Omit<z.input<typeof BackupPrivateKeySchema>, TImmutableDBKeys>;
+export type TBackupPrivateKeyUpdate = Partial<Omit<z.input<typeof BackupPrivateKeySchema>, TImmutableDBKeys>>;

View File

@@ -17,5 +17,5 @@ export const GitAppInstallSessionsSchema = z.object({
 });
 
 export type TGitAppInstallSessions = z.infer<typeof GitAppInstallSessionsSchema>;
-export type TGitAppInstallSessionsInsert = Omit<TGitAppInstallSessions, TImmutableDBKeys>;
-export type TGitAppInstallSessionsUpdate = Partial<Omit<TGitAppInstallSessions, TImmutableDBKeys>>;
+export type TGitAppInstallSessionsInsert = Omit<z.input<typeof GitAppInstallSessionsSchema>, TImmutableDBKeys>;
+export type TGitAppInstallSessionsUpdate = Partial<Omit<z.input<typeof GitAppInstallSessionsSchema>, TImmutableDBKeys>>;

View File

@@ -17,5 +17,5 @@ export const GitAppOrgSchema = z.object({
 });
 
 export type TGitAppOrg = z.infer<typeof GitAppOrgSchema>;
-export type TGitAppOrgInsert = Omit<TGitAppOrg, TImmutableDBKeys>;
-export type TGitAppOrgUpdate = Partial<Omit<TGitAppOrg, TImmutableDBKeys>>;
+export type TGitAppOrgInsert = Omit<z.input<typeof GitAppOrgSchema>, TImmutableDBKeys>;
+export type TGitAppOrgUpdate = Partial<Omit<z.input<typeof GitAppOrgSchema>, TImmutableDBKeys>>;

View File

@@ -16,5 +16,5 @@ export const IdentitiesSchema = z.object({
 });
 
 export type TIdentities = z.infer<typeof IdentitiesSchema>;
-export type TIdentitiesInsert = Omit<TIdentities, TImmutableDBKeys>;
-export type TIdentitiesUpdate = Partial<Omit<TIdentities, TImmutableDBKeys>>;
+export type TIdentitiesInsert = Omit<z.input<typeof IdentitiesSchema>, TImmutableDBKeys>;
+export type TIdentitiesUpdate = Partial<Omit<z.input<typeof IdentitiesSchema>, TImmutableDBKeys>>;

View File

@@ -23,5 +23,5 @@ export const IdentityAccessTokensSchema = z.object({
 });
 
 export type TIdentityAccessTokens = z.infer<typeof IdentityAccessTokensSchema>;
-export type TIdentityAccessTokensInsert = Omit<TIdentityAccessTokens, TImmutableDBKeys>;
-export type TIdentityAccessTokensUpdate = Partial<Omit<TIdentityAccessTokens, TImmutableDBKeys>>;
+export type TIdentityAccessTokensInsert = Omit<z.input<typeof IdentityAccessTokensSchema>, TImmutableDBKeys>;
+export type TIdentityAccessTokensUpdate = Partial<Omit<z.input<typeof IdentityAccessTokensSchema>, TImmutableDBKeys>>;

View File

@@ -18,5 +18,7 @@ export const IdentityOrgMembershipsSchema = z.object({
 });
 
 export type TIdentityOrgMemberships = z.infer<typeof IdentityOrgMembershipsSchema>;
-export type TIdentityOrgMembershipsInsert = Omit<TIdentityOrgMemberships, TImmutableDBKeys>;
-export type TIdentityOrgMembershipsUpdate = Partial<Omit<TIdentityOrgMemberships, TImmutableDBKeys>>;
+export type TIdentityOrgMembershipsInsert = Omit<z.input<typeof IdentityOrgMembershipsSchema>, TImmutableDBKeys>;
+export type TIdentityOrgMembershipsUpdate = Partial<
+  Omit<z.input<typeof IdentityOrgMembershipsSchema>, TImmutableDBKeys>
+>;

View File

@@ -18,5 +18,10 @@ export const IdentityProjectMembershipsSchema = z.object({
 });
 
 export type TIdentityProjectMemberships = z.infer<typeof IdentityProjectMembershipsSchema>;
-export type TIdentityProjectMembershipsInsert = Omit<TIdentityProjectMemberships, TImmutableDBKeys>;
-export type TIdentityProjectMembershipsUpdate = Partial<Omit<TIdentityProjectMemberships, TImmutableDBKeys>>;
+export type TIdentityProjectMembershipsInsert = Omit<
+  z.input<typeof IdentityProjectMembershipsSchema>,
+  TImmutableDBKeys
+>;
+export type TIdentityProjectMembershipsUpdate = Partial<
+  Omit<z.input<typeof IdentityProjectMembershipsSchema>, TImmutableDBKeys>
+>;

View File

@@ -23,5 +23,7 @@ export const IdentityUaClientSecretsSchema = z.object({
 });
 
 export type TIdentityUaClientSecrets = z.infer<typeof IdentityUaClientSecretsSchema>;
-export type TIdentityUaClientSecretsInsert = Omit<TIdentityUaClientSecrets, TImmutableDBKeys>;
-export type TIdentityUaClientSecretsUpdate = Partial<Omit<TIdentityUaClientSecrets, TImmutableDBKeys>>;
+export type TIdentityUaClientSecretsInsert = Omit<z.input<typeof IdentityUaClientSecretsSchema>, TImmutableDBKeys>;
+export type TIdentityUaClientSecretsUpdate = Partial<
+  Omit<z.input<typeof IdentityUaClientSecretsSchema>, TImmutableDBKeys>
+>;

View File

@@ -21,5 +21,7 @@ export const IdentityUniversalAuthsSchema = z.object({
 });
 
 export type TIdentityUniversalAuths = z.infer<typeof IdentityUniversalAuthsSchema>;
-export type TIdentityUniversalAuthsInsert = Omit<TIdentityUniversalAuths, TImmutableDBKeys>;
-export type TIdentityUniversalAuthsUpdate = Partial<Omit<TIdentityUniversalAuths, TImmutableDBKeys>>;
+export type TIdentityUniversalAuthsInsert = Omit<z.input<typeof IdentityUniversalAuthsSchema>, TImmutableDBKeys>;
+export type TIdentityUniversalAuthsUpdate = Partial<
+  Omit<z.input<typeof IdentityUniversalAuthsSchema>, TImmutableDBKeys>
+>;

View File

@@ -16,5 +16,5 @@ export const IncidentContactsSchema = z.object({
 });
 
 export type TIncidentContacts = z.infer<typeof IncidentContactsSchema>;
-export type TIncidentContactsInsert = Omit<TIncidentContacts, TImmutableDBKeys>;
-export type TIncidentContactsUpdate = Partial<Omit<TIncidentContacts, TImmutableDBKeys>>;
+export type TIncidentContactsInsert = Omit<z.input<typeof IncidentContactsSchema>, TImmutableDBKeys>;
+export type TIncidentContactsUpdate = Partial<Omit<z.input<typeof IncidentContactsSchema>, TImmutableDBKeys>>;

View File

@@ -33,5 +33,5 @@ export const IntegrationAuthsSchema = z.object({
 });
 
 export type TIntegrationAuths = z.infer<typeof IntegrationAuthsSchema>;
-export type TIntegrationAuthsInsert = Omit<TIntegrationAuths, TImmutableDBKeys>;
-export type TIntegrationAuthsUpdate = Partial<Omit<TIntegrationAuths, TImmutableDBKeys>>;
+export type TIntegrationAuthsInsert = Omit<z.input<typeof IntegrationAuthsSchema>, TImmutableDBKeys>;
+export type TIntegrationAuthsUpdate = Partial<Omit<z.input<typeof IntegrationAuthsSchema>, TImmutableDBKeys>>;

View File

@@ -31,5 +31,5 @@ export const IntegrationsSchema = z.object({
 });
 
 export type TIntegrations = z.infer<typeof IntegrationsSchema>;
-export type TIntegrationsInsert = Omit<TIntegrations, TImmutableDBKeys>;
-export type TIntegrationsUpdate = Partial<Omit<TIntegrations, TImmutableDBKeys>>;
+export type TIntegrationsInsert = Omit<z.input<typeof IntegrationsSchema>, TImmutableDBKeys>;
+export type TIntegrationsUpdate = Partial<Omit<z.input<typeof IntegrationsSchema>, TImmutableDBKeys>>;

View File

@@ -27,5 +27,5 @@ export const OrgBotsSchema = z.object({
 });
 
 export type TOrgBots = z.infer<typeof OrgBotsSchema>;
-export type TOrgBotsInsert = Omit<TOrgBots, TImmutableDBKeys>;
-export type TOrgBotsUpdate = Partial<Omit<TOrgBots, TImmutableDBKeys>>;
+export type TOrgBotsInsert = Omit<z.input<typeof OrgBotsSchema>, TImmutableDBKeys>;
+export type TOrgBotsUpdate = Partial<Omit<z.input<typeof OrgBotsSchema>, TImmutableDBKeys>>;

View File

@@ -20,5 +20,5 @@ export const OrgMembershipsSchema = z.object({
 });
 
 export type TOrgMemberships = z.infer<typeof OrgMembershipsSchema>;
-export type TOrgMembershipsInsert = Omit<TOrgMemberships, TImmutableDBKeys>;
-export type TOrgMembershipsUpdate = Partial<Omit<TOrgMemberships, TImmutableDBKeys>>;
+export type TOrgMembershipsInsert = Omit<z.input<typeof OrgMembershipsSchema>, TImmutableDBKeys>;
+export type TOrgMembershipsUpdate = Partial<Omit<z.input<typeof OrgMembershipsSchema>, TImmutableDBKeys>>;

View File

@@ -19,5 +19,5 @@ export const OrgRolesSchema = z.object({
 });
 
 export type TOrgRoles = z.infer<typeof OrgRolesSchema>;
-export type TOrgRolesInsert = Omit<TOrgRoles, TImmutableDBKeys>;
-export type TOrgRolesUpdate = Partial<Omit<TOrgRoles, TImmutableDBKeys>>;
+export type TOrgRolesInsert = Omit<z.input<typeof OrgRolesSchema>, TImmutableDBKeys>;
+export type TOrgRolesUpdate = Partial<Omit<z.input<typeof OrgRolesSchema>, TImmutableDBKeys>>;

View File

@@ -19,5 +19,5 @@ export const OrganizationsSchema = z.object({
 });
 
 export type TOrganizations = z.infer<typeof OrganizationsSchema>;
-export type TOrganizationsInsert = Omit<TOrganizations, TImmutableDBKeys>;
-export type TOrganizationsUpdate = Partial<Omit<TOrganizations, TImmutableDBKeys>>;
+export type TOrganizationsInsert = Omit<z.input<typeof OrganizationsSchema>, TImmutableDBKeys>;
+export type TOrganizationsUpdate = Partial<Omit<z.input<typeof OrganizationsSchema>, TImmutableDBKeys>>;

View File

@@ -26,5 +26,5 @@ export const ProjectBotsSchema = z.object({
 });
 
 export type TProjectBots = z.infer<typeof ProjectBotsSchema>;
-export type TProjectBotsInsert = Omit<TProjectBots, TImmutableDBKeys>;
-export type TProjectBotsUpdate = Partial<Omit<TProjectBots, TImmutableDBKeys>>;
+export type TProjectBotsInsert = Omit<z.input<typeof ProjectBotsSchema>, TImmutableDBKeys>;
+export type TProjectBotsUpdate = Partial<Omit<z.input<typeof ProjectBotsSchema>, TImmutableDBKeys>>;

View File

@@ -18,5 +18,5 @@ export const ProjectEnvironmentsSchema = z.object({
 });
 
 export type TProjectEnvironments = z.infer<typeof ProjectEnvironmentsSchema>;
-export type TProjectEnvironmentsInsert = Omit<TProjectEnvironments, TImmutableDBKeys>;
-export type TProjectEnvironmentsUpdate = Partial<Omit<TProjectEnvironments, TImmutableDBKeys>>;
+export type TProjectEnvironmentsInsert = Omit<z.input<typeof ProjectEnvironmentsSchema>, TImmutableDBKeys>;
+export type TProjectEnvironmentsUpdate = Partial<Omit<z.input<typeof ProjectEnvironmentsSchema>, TImmutableDBKeys>>;

View File

@@ -19,5 +19,5 @@ export const ProjectKeysSchema = z.object({
 });
 
 export type TProjectKeys = z.infer<typeof ProjectKeysSchema>;
-export type TProjectKeysInsert = Omit<TProjectKeys, TImmutableDBKeys>;
-export type TProjectKeysUpdate = Partial<Omit<TProjectKeys, TImmutableDBKeys>>;
+export type TProjectKeysInsert = Omit<z.input<typeof ProjectKeysSchema>, TImmutableDBKeys>;
+export type TProjectKeysUpdate = Partial<Omit<z.input<typeof ProjectKeysSchema>, TImmutableDBKeys>>;

View File

@@ -18,5 +18,5 @@ export const ProjectMembershipsSchema = z.object({
 });
 
 export type TProjectMemberships = z.infer<typeof ProjectMembershipsSchema>;
-export type TProjectMembershipsInsert = Omit<TProjectMemberships, TImmutableDBKeys>;
-export type TProjectMembershipsUpdate = Partial<Omit<TProjectMemberships, TImmutableDBKeys>>;
+export type TProjectMembershipsInsert = Omit<z.input<typeof ProjectMembershipsSchema>, TImmutableDBKeys>;
+export type TProjectMembershipsUpdate = Partial<Omit<z.input<typeof ProjectMembershipsSchema>, TImmutableDBKeys>>;

View File

@@ -19,5 +19,5 @@ export const ProjectRolesSchema = z.object({
 });
 
 export type TProjectRoles = z.infer<typeof ProjectRolesSchema>;
-export type TProjectRolesInsert = Omit<TProjectRoles, TImmutableDBKeys>;
-export type TProjectRolesUpdate = Partial<Omit<TProjectRoles, TImmutableDBKeys>>;
+export type TProjectRolesInsert = Omit<z.input<typeof ProjectRolesSchema>, TImmutableDBKeys>;
+export type TProjectRolesUpdate = Partial<Omit<z.input<typeof ProjectRolesSchema>, TImmutableDBKeys>>;

View File

@@ -20,5 +20,5 @@ export const ProjectsSchema = z.object({
 });
 
 export type TProjects = z.infer<typeof ProjectsSchema>;
-export type TProjectsInsert = Omit<TProjects, TImmutableDBKeys>;
-export type TProjectsUpdate = Partial<Omit<TProjects, TImmutableDBKeys>>;
+export type TProjectsInsert = Omit<z.input<typeof ProjectsSchema>, TImmutableDBKeys>;
+export type TProjectsUpdate = Partial<Omit<z.input<typeof ProjectsSchema>, TImmutableDBKeys>>;

View File

@@ -27,5 +27,5 @@ export const SamlConfigsSchema = z.object({
 });
 
 export type TSamlConfigs = z.infer<typeof SamlConfigsSchema>;
-export type TSamlConfigsInsert = Omit<TSamlConfigs, TImmutableDBKeys>;
-export type TSamlConfigsUpdate = Partial<Omit<TSamlConfigs, TImmutableDBKeys>>;
+export type TSamlConfigsInsert = Omit<z.input<typeof SamlConfigsSchema>, TImmutableDBKeys>;
+export type TSamlConfigsUpdate = Partial<Omit<z.input<typeof SamlConfigsSchema>, TImmutableDBKeys>>;

View File

@@ -17,5 +17,5 @@ export const ScimTokensSchema = z.object({
 });
 
 export type TScimTokens = z.infer<typeof ScimTokensSchema>;
-export type TScimTokensInsert = Omit<TScimTokens, TImmutableDBKeys>;
-export type TScimTokensUpdate = Partial<Omit<TScimTokens, TImmutableDBKeys>>;
+export type TScimTokensInsert = Omit<z.input<typeof ScimTokensSchema>, TImmutableDBKeys>;
+export type TScimTokensUpdate = Partial<Omit<z.input<typeof ScimTokensSchema>, TImmutableDBKeys>>;

View File

@@ -16,5 +16,10 @@ export const SecretApprovalPoliciesApproversSchema = z.object({
 });
 
 export type TSecretApprovalPoliciesApprovers = z.infer<typeof SecretApprovalPoliciesApproversSchema>;
-export type TSecretApprovalPoliciesApproversInsert = Omit<TSecretApprovalPoliciesApprovers, TImmutableDBKeys>;
-export type TSecretApprovalPoliciesApproversUpdate = Partial<Omit<TSecretApprovalPoliciesApprovers, TImmutableDBKeys>>;
+export type TSecretApprovalPoliciesApproversInsert = Omit<
+  z.input<typeof SecretApprovalPoliciesApproversSchema>,
+  TImmutableDBKeys
+>;
+export type TSecretApprovalPoliciesApproversUpdate = Partial<
+  Omit<z.input<typeof SecretApprovalPoliciesApproversSchema>, TImmutableDBKeys>
+>;

View File

@@ -18,5 +18,7 @@ export const SecretApprovalPoliciesSchema = z.object({
 });
 
 export type TSecretApprovalPolicies = z.infer<typeof SecretApprovalPoliciesSchema>;
-export type TSecretApprovalPoliciesInsert = Omit<TSecretApprovalPolicies, TImmutableDBKeys>;
-export type TSecretApprovalPoliciesUpdate = Partial<Omit<TSecretApprovalPolicies, TImmutableDBKeys>>;
+export type TSecretApprovalPoliciesInsert = Omit<z.input<typeof SecretApprovalPoliciesSchema>, TImmutableDBKeys>;
+export type TSecretApprovalPoliciesUpdate = Partial<
+  Omit<z.input<typeof SecretApprovalPoliciesSchema>, TImmutableDBKeys>
+>;

View File

@@ -16,5 +16,10 @@ export const SecretApprovalRequestSecretTagsSchema = z.object({
 });
 
 export type TSecretApprovalRequestSecretTags = z.infer<typeof SecretApprovalRequestSecretTagsSchema>;
-export type TSecretApprovalRequestSecretTagsInsert = Omit<TSecretApprovalRequestSecretTags, TImmutableDBKeys>;
-export type TSecretApprovalRequestSecretTagsUpdate = Partial<Omit<TSecretApprovalRequestSecretTags, TImmutableDBKeys>>;
+export type TSecretApprovalRequestSecretTagsInsert = Omit<
+  z.input<typeof SecretApprovalRequestSecretTagsSchema>,
+  TImmutableDBKeys
+>;
+export type TSecretApprovalRequestSecretTagsUpdate = Partial<
+  Omit<z.input<typeof SecretApprovalRequestSecretTagsSchema>, TImmutableDBKeys>
+>;

View File

@@ -17,5 +17,10 @@ export const SecretApprovalRequestsReviewersSchema = z.object({
 });
 
 export type TSecretApprovalRequestsReviewers = z.infer<typeof SecretApprovalRequestsReviewersSchema>;
-export type TSecretApprovalRequestsReviewersInsert = Omit<TSecretApprovalRequestsReviewers, TImmutableDBKeys>;
-export type TSecretApprovalRequestsReviewersUpdate = Partial<Omit<TSecretApprovalRequestsReviewers, TImmutableDBKeys>>;
+export type TSecretApprovalRequestsReviewersInsert = Omit<
+  z.input<typeof SecretApprovalRequestsReviewersSchema>,
+  TImmutableDBKeys
+>;
+export type TSecretApprovalRequestsReviewersUpdate = Partial<
+  Omit<z.input<typeof SecretApprovalRequestsReviewersSchema>, TImmutableDBKeys>
+>;

View File

@@ -35,5 +35,10 @@ export const SecretApprovalRequestsSecretsSchema = z.object({
 });
 
 export type TSecretApprovalRequestsSecrets = z.infer<typeof SecretApprovalRequestsSecretsSchema>;
-export type TSecretApprovalRequestsSecretsInsert = Omit<TSecretApprovalRequestsSecrets, TImmutableDBKeys>;
-export type TSecretApprovalRequestsSecretsUpdate = Partial<Omit<TSecretApprovalRequestsSecrets, TImmutableDBKeys>>;
+export type TSecretApprovalRequestsSecretsInsert = Omit<
+  z.input<typeof SecretApprovalRequestsSecretsSchema>,
+  TImmutableDBKeys
+>;
+export type TSecretApprovalRequestsSecretsUpdate = Partial<
+  Omit<z.input<typeof SecretApprovalRequestsSecretsSchema>, TImmutableDBKeys>
+>;

View File

@@ -22,5 +22,7 @@ export const SecretApprovalRequestsSchema = z.object({
 });
 
 export type TSecretApprovalRequests = z.infer<typeof SecretApprovalRequestsSchema>;
-export type TSecretApprovalRequestsInsert = Omit<TSecretApprovalRequests, TImmutableDBKeys>;
-export type TSecretApprovalRequestsUpdate = Partial<Omit<TSecretApprovalRequests, TImmutableDBKeys>>;
+export type TSecretApprovalRequestsInsert = Omit<z.input<typeof SecretApprovalRequestsSchema>, TImmutableDBKeys>;
+export type TSecretApprovalRequestsUpdate = Partial<
+  Omit<z.input<typeof SecretApprovalRequestsSchema>, TImmutableDBKeys>
+>;

View File

@@ -20,5 +20,5 @@ export const SecretBlindIndexesSchema = z.object({
 });
 
 export type TSecretBlindIndexes = z.infer<typeof SecretBlindIndexesSchema>;
-export type TSecretBlindIndexesInsert = Omit<TSecretBlindIndexes, TImmutableDBKeys>;
-export type TSecretBlindIndexesUpdate = Partial<Omit<TSecretBlindIndexes, TImmutableDBKeys>>;
+export type TSecretBlindIndexesInsert = Omit<z.input<typeof SecretBlindIndexesSchema>, TImmutableDBKeys>;
+export type TSecretBlindIndexesUpdate = Partial<Omit<z.input<typeof SecretBlindIndexesSchema>, TImmutableDBKeys>>;

View File

@@ -18,5 +18,5 @@ export const SecretFolderVersionsSchema = z.object({
 });
 
 export type TSecretFolderVersions = z.infer<typeof SecretFolderVersionsSchema>;
-export type TSecretFolderVersionsInsert = Omit<TSecretFolderVersions, TImmutableDBKeys>;
-export type TSecretFolderVersionsUpdate = Partial<Omit<TSecretFolderVersions, TImmutableDBKeys>>;
+export type TSecretFolderVersionsInsert = Omit<z.input<typeof SecretFolderVersionsSchema>, TImmutableDBKeys>;
+export type TSecretFolderVersionsUpdate = Partial<Omit<z.input<typeof SecretFolderVersionsSchema>, TImmutableDBKeys>>;

View File

@@ -18,5 +18,5 @@ export const SecretFoldersSchema = z.object({
 });
 
 export type TSecretFolders = z.infer<typeof SecretFoldersSchema>;
-export type TSecretFoldersInsert = Omit<TSecretFolders, TImmutableDBKeys>;
-export type TSecretFoldersUpdate = Partial<Omit<TSecretFolders, TImmutableDBKeys>>;
+export type TSecretFoldersInsert = Omit<z.input<typeof SecretFoldersSchema>, TImmutableDBKeys>;
+export type TSecretFoldersUpdate = Partial<Omit<z.input<typeof SecretFoldersSchema>, TImmutableDBKeys>>;

View File

@@ -19,5 +19,5 @@ export const SecretImportsSchema = z.object({
 });
 
 export type TSecretImports = z.infer<typeof SecretImportsSchema>;
-export type TSecretImportsInsert = Omit<TSecretImports, TImmutableDBKeys>;
-export type TSecretImportsUpdate = Partial<Omit<TSecretImports, TImmutableDBKeys>>;
+export type TSecretImportsInsert = Omit<z.input<typeof SecretImportsSchema>, TImmutableDBKeys>;
+export type TSecretImportsUpdate = Partial<Omit<z.input<typeof SecretImportsSchema>, TImmutableDBKeys>>;

View File

@@ -15,5 +15,5 @@ export const SecretRotationOutputsSchema = z.object({
 });
 
 export type TSecretRotationOutputs = z.infer<typeof SecretRotationOutputsSchema>;
-export type TSecretRotationOutputsInsert = Omit<TSecretRotationOutputs, TImmutableDBKeys>;
-export type TSecretRotationOutputsUpdate = Partial<Omit<TSecretRotationOutputs, TImmutableDBKeys>>;
+export type TSecretRotationOutputsInsert = Omit<z.input<typeof SecretRotationOutputsSchema>, TImmutableDBKeys>;
+export type TSecretRotationOutputsUpdate = Partial<Omit<z.input<typeof SecretRotationOutputsSchema>, TImmutableDBKeys>>;

View File

@@ -26,5 +26,5 @@ export const SecretRotationsSchema = z.object({
 });
 
 export type TSecretRotations = z.infer<typeof SecretRotationsSchema>;
-export type TSecretRotationsInsert = Omit<TSecretRotations, TImmutableDBKeys>;
-export type TSecretRotationsUpdate = Partial<Omit<TSecretRotations, TImmutableDBKeys>>;
+export type TSecretRotationsInsert = Omit<z.input<typeof SecretRotationsSchema>, TImmutableDBKeys>;
+export type TSecretRotationsUpdate = Partial<Omit<z.input<typeof SecretRotationsSchema>, TImmutableDBKeys>>;

View File

@@ -42,5 +42,7 @@ export const SecretScanningGitRisksSchema = z.object({
 });
 
 export type TSecretScanningGitRisks = z.infer<typeof SecretScanningGitRisksSchema>;
-export type TSecretScanningGitRisksInsert = Omit<TSecretScanningGitRisks, TImmutableDBKeys>;
-export type TSecretScanningGitRisksUpdate = Partial<Omit<TSecretScanningGitRisks, TImmutableDBKeys>>;
+export type TSecretScanningGitRisksInsert = Omit<z.input<typeof SecretScanningGitRisksSchema>, TImmutableDBKeys>;
+export type TSecretScanningGitRisksUpdate = Partial<
+  Omit<z.input<typeof SecretScanningGitRisksSchema>, TImmutableDBKeys>
+>;

View File

@@ -17,5 +17,5 @@ export const SecretSnapshotFoldersSchema = z.object({
 });
 
 export type TSecretSnapshotFolders = z.infer<typeof SecretSnapshotFoldersSchema>;
-export type TSecretSnapshotFoldersInsert = Omit<TSecretSnapshotFolders, TImmutableDBKeys>;
-export type TSecretSnapshotFoldersUpdate = Partial<Omit<TSecretSnapshotFolders, TImmutableDBKeys>>;
+export type TSecretSnapshotFoldersInsert = Omit<z.input<typeof SecretSnapshotFoldersSchema>, TImmutableDBKeys>;
+export type TSecretSnapshotFoldersUpdate = Partial<Omit<z.input<typeof SecretSnapshotFoldersSchema>, TImmutableDBKeys>>;

View File

@@ -17,5 +17,5 @@ export const SecretSnapshotSecretsSchema = z.object({
 });
 
 export type TSecretSnapshotSecrets = z.infer<typeof SecretSnapshotSecretsSchema>;
-export type TSecretSnapshotSecretsInsert = Omit<TSecretSnapshotSecrets, TImmutableDBKeys>;
-export type TSecretSnapshotSecretsUpdate = Partial<Omit<TSecretSnapshotSecrets, TImmutableDBKeys>>;
+export type TSecretSnapshotSecretsInsert = Omit<z.input<typeof SecretSnapshotSecretsSchema>, TImmutableDBKeys>;
+export type TSecretSnapshotSecretsUpdate = Partial<Omit<z.input<typeof SecretSnapshotSecretsSchema>, TImmutableDBKeys>>;

View File

@@ -17,5 +17,5 @@ export const SecretSnapshotsSchema = z.object({
 });
 
 export type TSecretSnapshots = z.infer<typeof SecretSnapshotsSchema>;
-export type TSecretSnapshotsInsert = Omit<TSecretSnapshots, TImmutableDBKeys>;
-export type TSecretSnapshotsUpdate = Partial<Omit<TSecretSnapshots, TImmutableDBKeys>>;
+export type TSecretSnapshotsInsert = Omit<z.input<typeof SecretSnapshotsSchema>, TImmutableDBKeys>;
+export type TSecretSnapshotsUpdate = Partial<Omit<z.input<typeof SecretSnapshotsSchema>, TImmutableDBKeys>>;

View File

@@ -14,5 +14,5 @@ export const SecretTagJunctionSchema = z.object({
 });
 
 export type TSecretTagJunction = z.infer<typeof SecretTagJunctionSchema>;
-export type TSecretTagJunctionInsert = Omit<TSecretTagJunction, TImmutableDBKeys>;
-export type TSecretTagJunctionUpdate = Partial<Omit<TSecretTagJunction, TImmutableDBKeys>>;
+export type TSecretTagJunctionInsert = Omit<z.input<typeof SecretTagJunctionSchema>, TImmutableDBKeys>;
+export type TSecretTagJunctionUpdate = Partial<Omit<z.input<typeof SecretTagJunctionSchema>, TImmutableDBKeys>>;

View File

@@ -19,5 +19,5 @@ export const SecretTagsSchema = z.object({
 });
 
 export type TSecretTags = z.infer<typeof SecretTagsSchema>;
-export type TSecretTagsInsert = Omit<TSecretTags, TImmutableDBKeys>;
-export type TSecretTagsUpdate = Partial<Omit<TSecretTags, TImmutableDBKeys>>;
+export type TSecretTagsInsert = Omit<z.input<typeof SecretTagsSchema>, TImmutableDBKeys>;
+export type TSecretTagsUpdate = Partial<Omit<z.input<typeof SecretTagsSchema>, TImmutableDBKeys>>;

View File

@@ -14,5 +14,7 @@ export const SecretVersionTagJunctionSchema = z.object({
 });
 
 export type TSecretVersionTagJunction = z.infer<typeof SecretVersionTagJunctionSchema>;
-export type TSecretVersionTagJunctionInsert = Omit<TSecretVersionTagJunction, TImmutableDBKeys>;
-export type TSecretVersionTagJunctionUpdate = Partial<Omit<TSecretVersionTagJunction, TImmutableDBKeys>>;
+export type TSecretVersionTagJunctionInsert = Omit<z.input<typeof SecretVersionTagJunctionSchema>, TImmutableDBKeys>;
+export type TSecretVersionTagJunctionUpdate = Partial<
+  Omit<z.input<typeof SecretVersionTagJunctionSchema>, TImmutableDBKeys>
+>;

View File

@ -36,5 +36,5 @@ export const SecretVersionsSchema = z.object({
}); });
export type TSecretVersions = z.infer<typeof SecretVersionsSchema>; export type TSecretVersions = z.infer<typeof SecretVersionsSchema>;
export type TSecretVersionsInsert = Omit<TSecretVersions, TImmutableDBKeys>; export type TSecretVersionsInsert = Omit<z.input<typeof SecretVersionsSchema>, TImmutableDBKeys>;
export type TSecretVersionsUpdate = Partial<Omit<TSecretVersions, TImmutableDBKeys>>; export type TSecretVersionsUpdate = Partial<Omit<z.input<typeof SecretVersionsSchema>, TImmutableDBKeys>>;

View File

@ -34,5 +34,5 @@ export const SecretsSchema = z.object({
}); });
export type TSecrets = z.infer<typeof SecretsSchema>; export type TSecrets = z.infer<typeof SecretsSchema>;
export type TSecretsInsert = Omit<TSecrets, TImmutableDBKeys>; export type TSecretsInsert = Omit<z.input<typeof SecretsSchema>, TImmutableDBKeys>;
export type TSecretsUpdate = Partial<Omit<TSecrets, TImmutableDBKeys>>; export type TSecretsUpdate = Partial<Omit<z.input<typeof SecretsSchema>, TImmutableDBKeys>>;

View File

@ -25,5 +25,5 @@ export const ServiceTokensSchema = z.object({
}); });
export type TServiceTokens = z.infer<typeof ServiceTokensSchema>; export type TServiceTokens = z.infer<typeof ServiceTokensSchema>;
export type TServiceTokensInsert = Omit<TServiceTokens, TImmutableDBKeys>; export type TServiceTokensInsert = Omit<z.input<typeof ServiceTokensSchema>, TImmutableDBKeys>;
export type TServiceTokensUpdate = Partial<Omit<TServiceTokens, TImmutableDBKeys>>; export type TServiceTokensUpdate = Partial<Omit<z.input<typeof ServiceTokensSchema>, TImmutableDBKeys>>;

View File

@ -13,9 +13,10 @@ export const SuperAdminSchema = z.object({
allowSignUp: z.boolean().default(true).nullable().optional(), allowSignUp: z.boolean().default(true).nullable().optional(),
createdAt: z.date(), createdAt: z.date(),
updatedAt: z.date(), updatedAt: z.date(),
allowedSignUpDomain: z.string().nullable().optional() allowedSignUpDomain: z.string().nullable().optional(),
instanceId: z.string().uuid().default("00000000-0000-0000-0000-000000000000")
}); });
export type TSuperAdmin = z.infer<typeof SuperAdminSchema>; export type TSuperAdmin = z.infer<typeof SuperAdminSchema>;
export type TSuperAdminInsert = Omit<TSuperAdmin, TImmutableDBKeys>; export type TSuperAdminInsert = Omit<z.input<typeof SuperAdminSchema>, TImmutableDBKeys>;
export type TSuperAdminUpdate = Partial<Omit<TSuperAdmin, TImmutableDBKeys>>; export type TSuperAdminUpdate = Partial<Omit<z.input<typeof SuperAdminSchema>, TImmutableDBKeys>>;

View File

@ -20,5 +20,5 @@ export const TrustedIpsSchema = z.object({
}); });
export type TTrustedIps = z.infer<typeof TrustedIpsSchema>; export type TTrustedIps = z.infer<typeof TrustedIpsSchema>;
export type TTrustedIpsInsert = Omit<TTrustedIps, TImmutableDBKeys>; export type TTrustedIpsInsert = Omit<z.input<typeof TrustedIpsSchema>, TImmutableDBKeys>;
export type TTrustedIpsUpdate = Partial<Omit<TTrustedIps, TImmutableDBKeys>>; export type TTrustedIpsUpdate = Partial<Omit<z.input<typeof TrustedIpsSchema>, TImmutableDBKeys>>;

View File

@ -16,5 +16,5 @@ export const UserActionsSchema = z.object({
}); });
export type TUserActions = z.infer<typeof UserActionsSchema>; export type TUserActions = z.infer<typeof UserActionsSchema>;
export type TUserActionsInsert = Omit<TUserActions, TImmutableDBKeys>; export type TUserActionsInsert = Omit<z.input<typeof UserActionsSchema>, TImmutableDBKeys>;
export type TUserActionsUpdate = Partial<Omit<TUserActions, TImmutableDBKeys>>; export type TUserActionsUpdate = Partial<Omit<z.input<typeof UserActionsSchema>, TImmutableDBKeys>>;

View File

@ -25,5 +25,5 @@ export const UserEncryptionKeysSchema = z.object({
}); });
export type TUserEncryptionKeys = z.infer<typeof UserEncryptionKeysSchema>; export type TUserEncryptionKeys = z.infer<typeof UserEncryptionKeysSchema>;
export type TUserEncryptionKeysInsert = Omit<TUserEncryptionKeys, TImmutableDBKeys>; export type TUserEncryptionKeysInsert = Omit<z.input<typeof UserEncryptionKeysSchema>, TImmutableDBKeys>;
export type TUserEncryptionKeysUpdate = Partial<Omit<TUserEncryptionKeys, TImmutableDBKeys>>; export type TUserEncryptionKeysUpdate = Partial<Omit<z.input<typeof UserEncryptionKeysSchema>, TImmutableDBKeys>>;

View File

@ -24,5 +24,5 @@ export const UsersSchema = z.object({
}); });
export type TUsers = z.infer<typeof UsersSchema>; export type TUsers = z.infer<typeof UsersSchema>;
export type TUsersInsert = Omit<TUsers, TImmutableDBKeys>; export type TUsersInsert = Omit<z.input<typeof UsersSchema>, TImmutableDBKeys>;
export type TUsersUpdate = Partial<Omit<TUsers, TImmutableDBKeys>>; export type TUsersUpdate = Partial<Omit<z.input<typeof UsersSchema>, TImmutableDBKeys>>;

View File

@ -25,5 +25,5 @@ export const WebhooksSchema = z.object({
}); });
export type TWebhooks = z.infer<typeof WebhooksSchema>; export type TWebhooks = z.infer<typeof WebhooksSchema>;
export type TWebhooksInsert = Omit<TWebhooks, TImmutableDBKeys>; export type TWebhooksInsert = Omit<z.input<typeof WebhooksSchema>, TImmutableDBKeys>;
export type TWebhooksUpdate = Partial<Omit<TWebhooks, TImmutableDBKeys>>; export type TWebhooksUpdate = Partial<Omit<z.input<typeof WebhooksSchema>, TImmutableDBKeys>>;

View File

@ -505,6 +505,9 @@ export const licenseServiceFactory = ({ orgDAL, permissionService, licenseDAL }:
get isValidLicense() { get isValidLicense() {
return isValidLicense; return isValidLicense;
}, },
getInstanceType() {
return instanceType;
},
getPlan, getPlan,
updateSubscriptionOrgMemberCount, updateSubscriptionOrgMemberCount,
refreshPlan, refreshPlan,

View File

@ -240,7 +240,7 @@ export const secretRotationQueueFactory = ({
); );
}); });
telemetryService.sendPostHogEvents({ await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretRotated, event: PostHogEventTypes.SecretRotated,
distinctId: "", distinctId: "",
properties: { properties: {

View File

@ -158,7 +158,7 @@ export const secretScanningQueueFactory = ({
}); });
} }
telemetryService.sendPostHogEvents({ await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretScannerPush, event: PostHogEventTypes.SecretScannerPush,
distinctId: repository.fullName, distinctId: repository.fullName,
properties: { properties: {
@ -228,7 +228,7 @@ export const secretScanningQueueFactory = ({
}); });
} }
telemetryService.sendPostHogEvents({ await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretScannerFull, event: PostHogEventTypes.SecretScannerFull,
distinctId: repository.fullName, distinctId: repository.fullName,
properties: { properties: {

View File

@ -0,0 +1,20 @@
import { Redis } from "ioredis";
export type TKeyStoreFactory = ReturnType<typeof keyStoreFactory>;
export const keyStoreFactory = (redisUrl: string) => {
const redis = new Redis(redisUrl);
const setItem = async (key: string, value: string | number | Buffer) => redis.set(key, value);
const getItem = async (key: string) => redis.get(key);
const setItemWithExpiry = async (key: string, exp: number | string, value: string | number | Buffer) =>
redis.setex(key, exp, value);
const deleteItem = async (key: string) => redis.del(key);
const incrementBy = async (key: string, value: number) => redis.incrby(key, value);
return { setItem, getItem, setItemWithExpiry, deleteItem, incrementBy };
};
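The new keystore is a thin wrapper around a single ioredis client. A minimal usage sketch, assuming Redis is reachable at `REDIS_URL`; the `demo` wrapper and key names are illustrative, not part of the codebase:

```ts
// Minimal usage sketch of the keystore above (assumes ioredis is installed and Redis is reachable).
import { keyStoreFactory } from "./keystore/keystore";

const demo = async () => {
  const keyStore = keyStoreFactory(process.env.REDIS_URL as string);

  // cache a value for 60 seconds, then read it back (getItem resolves to string | null)
  await keyStore.setItemWithExpiry("demo-key", 60, JSON.stringify({ hello: "world" }));
  const cached = await keyStore.getItem("demo-key");

  // maintain a counter, e.g. for aggregated telemetry
  await keyStore.incrementBy("demo-counter", 1);

  console.log(cached);
};

void demo();
```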

View File

@ -94,14 +94,17 @@ const envSchema = z
SECRET_SCANNING_WEBHOOK_SECRET: zpStr(z.string().optional()), SECRET_SCANNING_WEBHOOK_SECRET: zpStr(z.string().optional()),
SECRET_SCANNING_GIT_APP_ID: zpStr(z.string().optional()), SECRET_SCANNING_GIT_APP_ID: zpStr(z.string().optional()),
SECRET_SCANNING_PRIVATE_KEY: zpStr(z.string().optional()), SECRET_SCANNING_PRIVATE_KEY: zpStr(z.string().optional()),
// LICENCE // LICENSE
LICENSE_SERVER_URL: zpStr(z.string().optional().default("https://portal.infisical.com")), LICENSE_SERVER_URL: zpStr(z.string().optional().default("https://portal.infisical.com")),
LICENSE_SERVER_KEY: zpStr(z.string().optional()), LICENSE_SERVER_KEY: zpStr(z.string().optional()),
LICENSE_KEY: zpStr(z.string().optional()), LICENSE_KEY: zpStr(z.string().optional()),
// GENERIC
STANDALONE_MODE: z STANDALONE_MODE: z
.enum(["true", "false"]) .enum(["true", "false"])
.transform((val) => val === "true") .transform((val) => val === "true")
.optional() .optional(),
INFISICAL_CLOUD: zodStrBool.default("false")
}) })
.transform((data) => ({ .transform((data) => ({
...data, ...data,
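This hunk turns boolean-ish environment strings ("true"/"false") into real booleans at parse time; `INFISICAL_CLOUD` presumably does the same through the `zodStrBool` helper. A self-contained sketch of that coercion pattern with simplified names (`boolFromString` is a stand-in, not the repo's helper):

```ts
// Sketch of coercing "true"/"false" env strings into booleans with zod.
import { z } from "zod";

const boolFromString = z
  .enum(["true", "false"])
  .default("false")
  .transform((val) => val === "true");

const envSchema = z.object({
  INFISICAL_CLOUD: boolFromString,
  STANDALONE_MODE: boolFromString.optional()
});

// process.env values arrive as strings; after parsing they are real booleans
const cfg = envSchema.parse({ INFISICAL_CLOUD: "true" });
console.log(cfg.INFISICAL_CLOUD); // true
console.log(cfg.STANDALONE_MODE); // undefined unless explicitly set
```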

View File

@ -1,6 +1,7 @@
import dotenv from "dotenv"; import dotenv from "dotenv";
import { initDbConnection } from "./db"; import { initDbConnection } from "./db";
import { keyStoreFactory } from "./keystore/keystore";
import { formatSmtpConfig, initEnvConfig } from "./lib/config/env"; import { formatSmtpConfig, initEnvConfig } from "./lib/config/env";
import { initLogger } from "./lib/logger"; import { initLogger } from "./lib/logger";
import { queueServiceFactory } from "./queue"; import { queueServiceFactory } from "./queue";
@ -19,8 +20,9 @@ const run = async () => {
const smtp = smtpServiceFactory(formatSmtpConfig()); const smtp = smtpServiceFactory(formatSmtpConfig());
const queue = queueServiceFactory(appCfg.REDIS_URL); const queue = queueServiceFactory(appCfg.REDIS_URL);
const keyStore = keyStoreFactory(appCfg.REDIS_URL);
const server = await main({ db, smtp, logger, queue }); const server = await main({ db, smtp, logger, queue, keyStore });
const bootstrap = await bootstrapCheck({ db }); const bootstrap = await bootstrapCheck({ db });
// eslint-disable-next-line // eslint-disable-next-line
process.on("SIGINT", async () => { process.on("SIGINT", async () => {

View File

@ -13,6 +13,7 @@ export enum QueueName {
SecretReminder = "secret-reminder", SecretReminder = "secret-reminder",
AuditLog = "audit-log", AuditLog = "audit-log",
AuditLogPrune = "audit-log-prune", AuditLogPrune = "audit-log-prune",
TelemetryInstanceStats = "telemtry-self-hosted-stats",
IntegrationSync = "sync-integrations", IntegrationSync = "sync-integrations",
SecretWebhook = "secret-webhook", SecretWebhook = "secret-webhook",
SecretFullRepoScan = "secret-full-repo-scan", SecretFullRepoScan = "secret-full-repo-scan",
@ -26,6 +27,7 @@ export enum QueueJobs {
AuditLog = "audit-log-job", AuditLog = "audit-log-job",
AuditLogPrune = "audit-log-prune-job", AuditLogPrune = "audit-log-prune-job",
SecWebhook = "secret-webhook-trigger", SecWebhook = "secret-webhook-trigger",
TelemetryInstanceStats = "telemetry-self-hosted-stats",
IntegrationSync = "secret-integration-pull", IntegrationSync = "secret-integration-pull",
SecretScan = "secret-scan", SecretScan = "secret-scan",
UpgradeProjectToGhost = "upgrade-project-to-ghost-job" UpgradeProjectToGhost = "upgrade-project-to-ghost-job"
@ -67,7 +69,6 @@ export type TQueueJobTypes = {
payload: TScanFullRepoEventPayload; payload: TScanFullRepoEventPayload;
}; };
[QueueName.SecretPushEventScan]: { name: QueueJobs.SecretScan; payload: TScanPushEventPayload }; [QueueName.SecretPushEventScan]: { name: QueueJobs.SecretScan; payload: TScanPushEventPayload };
[QueueName.UpgradeProjectToGhost]: { [QueueName.UpgradeProjectToGhost]: {
name: QueueJobs.UpgradeProjectToGhost; name: QueueJobs.UpgradeProjectToGhost;
payload: { payload: {
@ -81,6 +82,10 @@ export type TQueueJobTypes = {
}; };
}; };
}; };
[QueueName.TelemetryInstanceStats]: {
name: QueueJobs.TelemetryInstanceStats;
payload: undefined;
};
}; };
export type TQueueServiceFactory = ReturnType<typeof queueServiceFactory>; export type TQueueServiceFactory = ReturnType<typeof queueServiceFactory>;
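`TQueueJobTypes` ties each queue name to the job name and payload it accepts, so enqueueing the new telemetry job with the wrong payload fails at compile time. A stripped-down illustration of that constraint; the `queue` helper and the enum string values are simplified stand-ins for the real wrapper:

```ts
// Simplified illustration of a typed job map constraining queue calls.
enum QueueName {
  TelemetryInstanceStats = "telemetry-instance-stats",
  SecretWebhook = "secret-webhook"
}
enum QueueJobs {
  TelemetryInstanceStats = "telemetry-instance-stats-job",
  SecWebhook = "secret-webhook-trigger"
}

type TJobTypes = {
  [QueueName.TelemetryInstanceStats]: { name: QueueJobs.TelemetryInstanceStats; payload: undefined };
  [QueueName.SecretWebhook]: { name: QueueJobs.SecWebhook; payload: { secretPath: string } };
};

const queue = <T extends keyof TJobTypes>(
  name: T,
  job: TJobTypes[T]["name"],
  payload: TJobTypes[T]["payload"]
) => {
  console.log(name, job, payload);
};

queue(QueueName.TelemetryInstanceStats, QueueJobs.TelemetryInstanceStats, undefined); // ok
queue(QueueName.SecretWebhook, QueueJobs.SecWebhook, { secretPath: "/" }); // ok
// queue(QueueName.SecretWebhook, QueueJobs.TelemetryInstanceStats, { secretPath: "/" }); // type error
```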

View File

@ -14,6 +14,7 @@ import fasitfy from "fastify";
import { Knex } from "knex"; import { Knex } from "knex";
import { Logger } from "pino"; import { Logger } from "pino";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env"; import { getConfig } from "@app/lib/config/env";
import { TQueueServiceFactory } from "@app/queue"; import { TQueueServiceFactory } from "@app/queue";
import { TSmtpService } from "@app/services/smtp/smtp-service"; import { TSmtpService } from "@app/services/smtp/smtp-service";
@ -31,10 +32,11 @@ type TMain = {
smtp: TSmtpService; smtp: TSmtpService;
logger?: Logger; logger?: Logger;
queue: TQueueServiceFactory; queue: TQueueServiceFactory;
keyStore: TKeyStoreFactory;
}; };
// Run the server! // Run the server!
export const main = async ({ db, smtp, logger, queue }: TMain) => { export const main = async ({ db, smtp, logger, queue, keyStore }: TMain) => {
const appCfg = getConfig(); const appCfg = getConfig();
const server = fasitfy({ const server = fasitfy({
logger: appCfg.NODE_ENV === "test" ? false : logger, logger: appCfg.NODE_ENV === "test" ? false : logger,
@ -70,7 +72,7 @@ export const main = async ({ db, smtp, logger, queue }: TMain) => {
} }
await server.register(helmet, { contentSecurityPolicy: false }); await server.register(helmet, { contentSecurityPolicy: false });
await server.register(registerRoutes, { smtp, queue, db }); await server.register(registerRoutes, { smtp, queue, db, keyStore });
if (appCfg.isProductionMode) { if (appCfg.isProductionMode) {
await server.register(registerExternalNextjs, { await server.register(registerExternalNextjs, {

View File

@ -34,6 +34,7 @@ import { snapshotFolderDALFactory } from "@app/ee/services/secret-snapshot/snaps
import { snapshotSecretDALFactory } from "@app/ee/services/secret-snapshot/snapshot-secret-dal"; import { snapshotSecretDALFactory } from "@app/ee/services/secret-snapshot/snapshot-secret-dal";
import { trustedIpDALFactory } from "@app/ee/services/trusted-ip/trusted-ip-dal"; import { trustedIpDALFactory } from "@app/ee/services/trusted-ip/trusted-ip-dal";
import { trustedIpServiceFactory } from "@app/ee/services/trusted-ip/trusted-ip-service"; import { trustedIpServiceFactory } from "@app/ee/services/trusted-ip/trusted-ip-service";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env"; import { getConfig } from "@app/lib/config/env";
import { TQueueServiceFactory } from "@app/queue"; import { TQueueServiceFactory } from "@app/queue";
import { apiKeyDALFactory } from "@app/services/api-key/api-key-dal"; import { apiKeyDALFactory } from "@app/services/api-key/api-key-dal";
@ -96,6 +97,8 @@ import { serviceTokenServiceFactory } from "@app/services/service-token/service-
import { TSmtpService } from "@app/services/smtp/smtp-service"; import { TSmtpService } from "@app/services/smtp/smtp-service";
import { superAdminDALFactory } from "@app/services/super-admin/super-admin-dal"; import { superAdminDALFactory } from "@app/services/super-admin/super-admin-dal";
import { getServerCfg, superAdminServiceFactory } from "@app/services/super-admin/super-admin-service"; import { getServerCfg, superAdminServiceFactory } from "@app/services/super-admin/super-admin-service";
import { telemetryDALFactory } from "@app/services/telemetry/telemetry-dal";
import { telemetryQueueServiceFactory } from "@app/services/telemetry/telemetry-queue";
import { telemetryServiceFactory } from "@app/services/telemetry/telemetry-service"; import { telemetryServiceFactory } from "@app/services/telemetry/telemetry-service";
import { userDALFactory } from "@app/services/user/user-dal"; import { userDALFactory } from "@app/services/user/user-dal";
import { userServiceFactory } from "@app/services/user/user-service"; import { userServiceFactory } from "@app/services/user/user-service";
@ -112,7 +115,12 @@ import { registerV3Routes } from "./v3";
export const registerRoutes = async ( export const registerRoutes = async (
server: FastifyZodProvider, server: FastifyZodProvider,
{ db, smtp: smtpService, queue: queueService }: { db: Knex; smtp: TSmtpService; queue: TQueueServiceFactory } {
db,
smtp: smtpService,
queue: queueService,
keyStore
}: { db: Knex; smtp: TSmtpService; queue: TQueueServiceFactory; keyStore: TKeyStoreFactory }
) => { ) => {
await server.register(registerSecretScannerGhApp, { prefix: "/ss-webhook" }); await server.register(registerSecretScannerGhApp, { prefix: "/ss-webhook" });
@ -159,6 +167,7 @@ export const registerRoutes = async (
const auditLogDAL = auditLogDALFactory(db); const auditLogDAL = auditLogDALFactory(db);
const trustedIpDAL = trustedIpDALFactory(db); const trustedIpDAL = trustedIpDALFactory(db);
const scimDAL = scimDALFactory(db); const scimDAL = scimDALFactory(db);
const telemetryDAL = telemetryDALFactory(db);
// ee db layer ops // ee db layer ops
const permissionDAL = permissionDALFactory(db); const permissionDAL = permissionDALFactory(db);
@ -226,7 +235,16 @@ export const registerRoutes = async (
smtpService smtpService
}); });
const telemetryService = telemetryServiceFactory(); const telemetryService = telemetryServiceFactory({
keyStore,
licenseService
});
const telemetryQueue = telemetryQueueServiceFactory({
keyStore,
telemetryDAL,
queueService
});
const tokenService = tokenServiceFactory({ tokenDAL: authTokenDAL, userDAL }); const tokenService = tokenServiceFactory({ tokenDAL: authTokenDAL, userDAL });
const userService = userServiceFactory({ userDAL }); const userService = userServiceFactory({ userDAL });
const loginService = authLoginServiceFactory({ userDAL, smtpService, tokenService }); const loginService = authLoginServiceFactory({ userDAL, smtpService, tokenService });
@ -263,7 +281,8 @@ export const registerRoutes = async (
userDAL, userDAL,
authService: loginService, authService: loginService,
serverCfgDAL: superAdminDAL, serverCfgDAL: superAdminDAL,
orgService orgService,
keyStore
}); });
const apiKeyService = apiKeyServiceFactory({ apiKeyDAL, userDAL }); const apiKeyService = apiKeyServiceFactory({ apiKeyDAL, userDAL });
@ -491,9 +510,13 @@ export const registerRoutes = async (
}); });
await superAdminService.initServerCfg(); await superAdminService.initServerCfg();
await auditLogQueue.startAuditLogPruneJob(); //
// setup the communication with license key server // setup the communication with license key server
await licenseService.init(); await licenseService.init();
await auditLogQueue.startAuditLogPruneJob();
await telemetryQueue.startTelemetryCheck();
// inject all services // inject all services
server.decorate<FastifyZodProvider["services"]>("services", { server.decorate<FastifyZodProvider["services"]>("services", {
login: loginService, login: loginService,

View File

@ -16,7 +16,7 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
schema: { schema: {
response: { response: {
200: z.object({ 200: z.object({
config: SuperAdminSchema config: SuperAdminSchema.omit({ createdAt: true, updatedAt: true })
}) })
} }
}, },
@ -90,7 +90,7 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
userAgent: req.headers["user-agent"] || "" userAgent: req.headers["user-agent"] || ""
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.AdminInit, event: PostHogEventTypes.AdminInit,
distinctId: user.user.email, distinctId: user.user.email,
properties: { properties: {

View File

@ -51,7 +51,7 @@ export const registerIdentityRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.MachineIdentityCreated, event: PostHogEventTypes.MachineIdentityCreated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {

View File

@ -82,7 +82,7 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.IntegrationCreated, event: PostHogEventTypes.IntegrationCreated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {

View File

@ -32,7 +32,7 @@ export const registerInviteOrgRouter = async (server: FastifyZodProvider) => {
actorOrgId: req.permission.orgId actorOrgId: req.permission.orgId
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.UserOrgInvitation, event: PostHogEventTypes.UserOrgInvitation,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {

View File

@ -154,7 +154,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
slug: req.body.slug slug: req.body.slug
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.ProjectCreated, event: PostHogEventTypes.ProjectCreated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {

View File

@ -95,7 +95,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled, event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -185,7 +185,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled, event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -261,7 +261,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated, event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -336,7 +336,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated, event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -406,7 +406,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted, event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -512,7 +512,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
(req.headers["user-agent"] !== "k8-operator" || shouldRecordK8Event); (req.headers["user-agent"] !== "k8-operator" || shouldRecordK8Event);
const approximateNumberTotalSecrets = secrets.length * 20; const approximateNumberTotalSecrets = secrets.length * 20;
if (shouldCapture) { if (shouldCapture) {
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled, event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -589,7 +589,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled, event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -752,7 +752,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated, event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -934,7 +934,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated, event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -1052,7 +1052,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted, event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -1172,7 +1172,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated, event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -1292,7 +1292,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated, event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {
@ -1400,7 +1400,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
} }
}); });
server.services.telemetry.sendPostHogEvents({ await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted, event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req), distinctId: getTelemetryDistinctId(req),
properties: { properties: {

View File

@ -238,6 +238,8 @@ export const projectMembershipServiceFactory = ({
if (orgMembers.length !== emails.length) throw new BadRequestError({ message: "Some users are not part of org" }); if (orgMembers.length !== emails.length) throw new BadRequestError({ message: "Some users are not part of org" });
if (!orgMembers.length) return [];
const existingMembers = await projectMembershipDAL.find({ const existingMembers = await projectMembershipDAL.find({
projectId, projectId,
$in: { userId: orgMembers.map(({ user }) => user.id).filter(Boolean) } $in: { userId: orgMembers.map(({ user }) => user.id).filter(Boolean) }

View File

@ -1,4 +1,5 @@
import { TSuperAdmin, TSuperAdminUpdate } from "@app/db/schemas"; import { TSuperAdmin, TSuperAdminUpdate } from "@app/db/schemas";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env"; import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors"; import { BadRequestError } from "@app/lib/errors";
@ -14,6 +15,7 @@ type TSuperAdminServiceFactoryDep = {
userDAL: TUserDALFactory; userDAL: TUserDALFactory;
authService: Pick<TAuthLoginFactory, "generateUserTokens">; authService: Pick<TAuthLoginFactory, "generateUserTokens">;
orgService: Pick<TOrgServiceFactory, "createOrganization">; orgService: Pick<TOrgServiceFactory, "createOrganization">;
keyStore: Pick<TKeyStoreFactory, "getItem" | "setItemWithExpiry" | "deleteItem">;
}; };
export type TSuperAdminServiceFactory = ReturnType<typeof superAdminServiceFactory>; export type TSuperAdminServiceFactory = ReturnType<typeof superAdminServiceFactory>;
@ -21,26 +23,53 @@ export type TSuperAdminServiceFactory = ReturnType<typeof superAdminServiceFacto
// eslint-disable-next-line // eslint-disable-next-line
export let getServerCfg: () => Promise<TSuperAdmin>; export let getServerCfg: () => Promise<TSuperAdmin>;
const ADMIN_CONFIG_KEY = "infisical-admin-cfg";
const ADMIN_CONFIG_KEY_EXP = 60; // 60s
const ADMIN_CONFIG_DB_UUID = "00000000-0000-0000-0000-000000000000";
export const superAdminServiceFactory = ({ export const superAdminServiceFactory = ({
serverCfgDAL, serverCfgDAL,
userDAL, userDAL,
authService, authService,
orgService orgService,
keyStore
}: TSuperAdminServiceFactoryDep) => { }: TSuperAdminServiceFactoryDep) => {
const initServerCfg = async () => { const initServerCfg = async () => {
// TODO(akhilmhdh): bad pattern time less change this later to me itself // TODO(akhilmhdh): bad pattern time less change this later to me itself
getServerCfg = () => serverCfgDAL.findOne({}); getServerCfg = async () => {
const config = await keyStore.getItem(ADMIN_CONFIG_KEY);
// missing in keystore means fetch from db
if (!config) {
const serverCfg = await serverCfgDAL.findById(ADMIN_CONFIG_DB_UUID);
if (serverCfg) {
await keyStore.setItemWithExpiry(ADMIN_CONFIG_KEY, ADMIN_CONFIG_KEY_EXP, JSON.stringify(serverCfg)); // insert it back to keystore
}
return serverCfg;
}
const serverCfg = await serverCfgDAL.findOne({}); const keyStoreServerCfg = JSON.parse(config) as TSuperAdmin;
return {
...keyStoreServerCfg,
// this is to allow admin router to work
createdAt: new Date(keyStoreServerCfg.createdAt),
updatedAt: new Date(keyStoreServerCfg.updatedAt)
};
};
// reset on initialized
await keyStore.deleteItem(ADMIN_CONFIG_KEY);
const serverCfg = await serverCfgDAL.findById(ADMIN_CONFIG_DB_UUID);
if (serverCfg) return; if (serverCfg) return;
const newCfg = await serverCfgDAL.create({ initialized: false, allowSignUp: true });
// @ts-expect-error id is kept as fixed for idempotence and to avoid race condition
const newCfg = await serverCfgDAL.create({ initialized: false, allowSignUp: true, id: ADMIN_CONFIG_DB_UUID });
return newCfg; return newCfg;
}; };
const updateServerCfg = async (data: TSuperAdminUpdate) => { const updateServerCfg = async (data: TSuperAdminUpdate) => {
const serverCfg = await getServerCfg(); const updatedServerCfg = await serverCfgDAL.updateById(ADMIN_CONFIG_DB_UUID, data);
const cfg = await serverCfgDAL.updateById(serverCfg.id, data); await keyStore.setItemWithExpiry(ADMIN_CONFIG_KEY, ADMIN_CONFIG_KEY_EXP, JSON.stringify(updatedServerCfg));
return cfg; return updatedServerCfg;
}; };
const adminSignUp = async ({ const adminSignUp = async ({
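The rewritten `getServerCfg` is a read-through cache: check the Redis keystore first, fall back to the database row with the fixed UUID, and re-prime the cache with a 60-second TTL. A minimal sketch of the same pattern, where `loadFromDb`, `CACHE_KEY`, and the trimmed config shape are hypothetical stand-ins for `serverCfgDAL.findById` and the real schema:

```ts
// Minimal read-through cache sketch mirroring the getServerCfg rewrite above.
import { keyStoreFactory } from "@app/keystore/keystore";

type TServerCfg = { id: string; initialized: boolean; allowSignUp: boolean; updatedAt: Date };

const CACHE_KEY = "demo-admin-cfg";
const CACHE_TTL_SECONDS = 60;

const keyStore = keyStoreFactory(process.env.REDIS_URL as string);

export const getCachedServerCfg = async (loadFromDb: () => Promise<TServerCfg>) => {
  const cached = await keyStore.getItem(CACHE_KEY);
  if (cached) {
    const cfg = JSON.parse(cached) as TServerCfg;
    // JSON round-tripping turns Date fields into strings, so revive them
    return { ...cfg, updatedAt: new Date(cfg.updatedAt) };
  }
  // cache miss: load from the database and prime the cache for the next reader
  const cfg = await loadFromDb();
  await keyStore.setItemWithExpiry(CACHE_KEY, CACHE_TTL_SECONDS, JSON.stringify(cfg));
  return cfg;
};
```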

View File

@ -0,0 +1,39 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
export type TTelemetryDALFactory = ReturnType<typeof telemetryDALFactory>;
export const telemetryDALFactory = (db: TDbClient) => {
const getTelemetryInstanceStats = async () => {
try {
const userCount = (await db(TableName.Users).where({ isGhost: false }).count().first())?.count as string;
const users = parseInt(userCount || "0", 10);
const identityCount = (await db(TableName.Identity).count().first())?.count as string;
const identities = parseInt(identityCount || "0", 10);
const projectCount = (await db(TableName.Project).count().first())?.count as string;
const projects = parseInt(projectCount || "0", 10);
const secretCount = (await db(TableName.Secret).count().first())?.count as string;
const secrets = parseInt(secretCount || "0", 10);
const organizationNames = await db(TableName.Organization).select("name");
const organizations = organizationNames.length;
return {
users,
identities,
projects,
secrets,
organizations,
organizationNames: organizationNames.map(({ name }) => name)
};
} catch (error) {
throw new DatabaseError({ error, name: "TelemtryInstanceStats" });
}
};
return { getTelemetryInstanceStats };
};
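The `parseInt` calls in this DAL exist because the pg driver returns `COUNT(*)` as a string (Postgres `bigint`). A small sketch of the pattern in isolation, assuming knex with the `pg` client and the `DB_CONNECTION_URI` variable used elsewhere in this diff; the table name is illustrative:

```ts
// With the pg client, COUNT(*) comes back as a string, e.g. { count: "42" }.
import knex from "knex";

const db = knex({ client: "pg", connection: process.env.DB_CONNECTION_URI });

const countRows = async (table: string) => {
  const row = await db(table).count().first();
  return parseInt((row?.count as string) || "0", 10);
};

// usage (table name is illustrative)
countRows("users").then((n) => console.log(`${n} rows`));
```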

View File

@ -0,0 +1,78 @@
import { PostHog } from "posthog-node";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { logger } from "@app/lib/logger";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { getServerCfg } from "../super-admin/super-admin-service";
import { TTelemetryDALFactory } from "./telemetry-dal";
import { TELEMETRY_SECRET_OPERATIONS_KEY, TELEMETRY_SECRET_PROCESSED_KEY } from "./telemetry-service";
import { PostHogEventTypes } from "./telemetry-types";
type TTelemetryQueueServiceFactoryDep = {
queueService: TQueueServiceFactory;
keyStore: Pick<TKeyStoreFactory, "getItem" | "deleteItem">;
telemetryDAL: TTelemetryDALFactory;
};
export type TTelemetryQueueServiceFactory = ReturnType<typeof telemetryQueueServiceFactory>;
export const telemetryQueueServiceFactory = ({
queueService,
keyStore,
telemetryDAL
}: TTelemetryQueueServiceFactoryDep) => {
const appCfg = getConfig();
const postHog =
appCfg.isProductionMode && appCfg.TELEMETRY_ENABLED
? new PostHog(appCfg.POSTHOG_PROJECT_API_KEY, { host: appCfg.POSTHOG_HOST, flushAt: 1, flushInterval: 0 })
: undefined;
queueService.start(QueueName.TelemetryInstanceStats, async () => {
const { instanceId } = await getServerCfg();
const telemtryStats = await telemetryDAL.getTelemetryInstanceStats();
// parse the redis values into integer
const numberOfSecretOperationsMade = parseInt((await keyStore.getItem(TELEMETRY_SECRET_OPERATIONS_KEY)) || "0", 10);
const numberOfSecretProcessed = parseInt((await keyStore.getItem(TELEMETRY_SECRET_PROCESSED_KEY)) || "0", 10);
const stats = { ...telemtryStats, numberOfSecretProcessed, numberOfSecretOperationsMade };
// send to postHog
postHog?.capture({
event: PostHogEventTypes.TelemetryInstanceStats,
distinctId: instanceId,
properties: stats
});
// reset the stats
await keyStore.deleteItem(TELEMETRY_SECRET_PROCESSED_KEY);
await keyStore.deleteItem(TELEMETRY_SECRET_OPERATIONS_KEY);
});
// every day at midnight a telemetry job executes on self hosted
// this sends some telemetry information like instance id secrets operated etc
const startTelemetryCheck = async () => {
// this is a fast way to check its cloud or not
if (appCfg.INFISICAL_CLOUD) return;
// clear previous job
await queueService.stopRepeatableJob(
QueueName.TelemetryInstanceStats,
QueueJobs.TelemetryInstanceStats,
{ pattern: "0 0 * * *", utc: true },
QueueName.TelemetryInstanceStats // just a job id
);
if (postHog) {
await queueService.queue(QueueName.TelemetryInstanceStats, QueueJobs.TelemetryInstanceStats, undefined, {
jobId: QueueName.TelemetryInstanceStats,
repeat: { pattern: "0 0 * * *", utc: true }
});
}
};
queueService.listen(QueueName.TelemetryInstanceStats, "failed", (err) => {
logger.error(err?.failedReason, `${QueueName.TelemetryInstanceStats}: failed`);
});
return {
startTelemetryCheck
};
};
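`startTelemetryCheck` clears any previously registered repeatable job, then schedules a daily midnight run keyed by a fixed `jobId` so restarts stay idempotent. The in-house `queueService` appears to wrap BullMQ, so the hedged sketch below shows the same idea against BullMQ directly; the queue/job names and Redis connection details are assumptions:

```ts
// Sketch of a daily repeatable job with BullMQ (assumes bullmq and a local Redis).
import { Queue, Worker } from "bullmq";

const connection = { host: "localhost", port: 6379 };
const QUEUE = "telemetry-instance-stats";
const JOB = "telemetry-instance-stats-job";

const queue = new Queue(QUEUE, { connection });

// the worker runs once a day at midnight UTC
const worker = new Worker(
  QUEUE,
  async () => {
    // collect and report aggregated instance stats here
  },
  { connection }
);

worker.on("failed", (_job, err) => {
  console.error(`${QUEUE}: failed`, err);
});

const startTelemetryCheck = async () => {
  // drop any previously registered schedule before re-registering
  await queue.removeRepeatable(JOB, { pattern: "0 0 * * *", utc: true }, QUEUE);
  // the fixed jobId keeps the repeatable job from being duplicated across restarts
  await queue.add(JOB, undefined, {
    jobId: QUEUE,
    repeat: { pattern: "0 0 * * *", utc: true }
  });
};

void startTelemetryCheck();
```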

View File

@ -1,15 +1,24 @@
import { PostHog } from "posthog-node"; import { PostHog } from "posthog-node";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { InstanceType } from "@app/ee/services/license/license-types";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env"; import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request"; import { request } from "@app/lib/config/request";
import { logger } from "@app/lib/logger"; import { logger } from "@app/lib/logger";
import { TPostHogEvent } from "./telemetry-types"; import { PostHogEventTypes, TPostHogEvent, TSecretModifiedEvent } from "./telemetry-types";
export const TELEMETRY_SECRET_PROCESSED_KEY = "telemetry-secret-processed";
export const TELEMETRY_SECRET_OPERATIONS_KEY = "telemetry-secret-operations";
export type TTelemetryServiceFactory = ReturnType<typeof telemetryServiceFactory>; export type TTelemetryServiceFactory = ReturnType<typeof telemetryServiceFactory>;
export type TTelemetryServiceFactoryDep = {
keyStore: Pick<TKeyStoreFactory, "getItem" | "incrementBy">;
licenseService: Pick<TLicenseServiceFactory, "getInstanceType">;
};
// type TTelemetryServiceFactoryDep = {}; export const telemetryServiceFactory = ({ keyStore, licenseService }: TTelemetryServiceFactoryDep) => {
export const telemetryServiceFactory = () => {
const appCfg = getConfig(); const appCfg = getConfig();
if (appCfg.isProductionMode && !appCfg.TELEMETRY_ENABLED) { if (appCfg.isProductionMode && !appCfg.TELEMETRY_ENABLED) {
@ -21,10 +30,9 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
`); `);
} }
const postHog = const postHog = appCfg.TELEMETRY_ENABLED
appCfg.isProductionMode && appCfg.TELEMETRY_ENABLED ? new PostHog(appCfg.POSTHOG_PROJECT_API_KEY, { host: appCfg.POSTHOG_HOST })
? new PostHog(appCfg.POSTHOG_PROJECT_API_KEY, { host: appCfg.POSTHOG_HOST }) : undefined;
: undefined;
// used for email marketting email sending purpose // used for email marketting email sending purpose
const sendLoopsEvent = async (email: string, firstName?: string, lastName?: string) => { const sendLoopsEvent = async (email: string, firstName?: string, lastName?: string) => {
@ -51,13 +59,33 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
} }
}; };
const sendPostHogEvents = (event: TPostHogEvent) => { const sendPostHogEvents = async (event: TPostHogEvent) => {
if (postHog) { if (postHog) {
postHog.capture({ const instanceType = licenseService.getInstanceType();
event: event.event, // capture posthog only when its cloud or signup event happens in self hosted
distinctId: event.distinctId, if (instanceType === InstanceType.Cloud || event.event === PostHogEventTypes.UserSignedUp) {
properties: event.properties postHog.capture({
}); event: event.event,
distinctId: event.distinctId,
properties: event.properties
});
return;
}
if (
[
PostHogEventTypes.SecretPulled,
PostHogEventTypes.SecretCreated,
PostHogEventTypes.SecretDeleted,
PostHogEventTypes.SecretUpdated
].includes(event.event)
) {
await keyStore.incrementBy(
TELEMETRY_SECRET_PROCESSED_KEY,
(event as TSecretModifiedEvent).properties.numberOfSecrets
);
await keyStore.incrementBy(TELEMETRY_SECRET_OPERATIONS_KEY, 1);
}
} }
}; };

View File

@ -12,7 +12,8 @@ export enum PostHogEventTypes {
ProjectCreated = "Project Created", ProjectCreated = "Project Created",
IntegrationCreated = "Integration Created", IntegrationCreated = "Integration Created",
MachineIdentityCreated = "Machine Identity Created", MachineIdentityCreated = "Machine Identity Created",
UserOrgInvitation = "User Org Invitation" UserOrgInvitation = "User Org Invitation",
TelemetryInstanceStats = "Self Hosted Instance Stats"
} }
export type TSecretModifiedEvent = { export type TSecretModifiedEvent = {
@ -101,6 +102,20 @@ export type TUserOrgInvitedEvent = {
}; };
}; };
export type TTelemetryInstanceStatsEvent = {
event: PostHogEventTypes.TelemetryInstanceStats;
properties: {
users: number;
identities: number;
projects: number;
secrets: number;
organizations: number;
organizationNames: string[];
numberOfSecretOperationsMade: number;
numberOfSecretProcessed: number;
};
};
export type TPostHogEvent = { distinctId: string } & ( export type TPostHogEvent = { distinctId: string } & (
| TSecretModifiedEvent | TSecretModifiedEvent
| TAdminInitEvent | TAdminInitEvent
@ -110,4 +125,5 @@ export type TPostHogEvent = { distinctId: string } & (
| TMachineIdentityCreatedEvent | TMachineIdentityCreatedEvent
| TIntegrationCreatedEvent | TIntegrationCreatedEvent
| TProjectCreateEvent | TProjectCreateEvent
| TTelemetryInstanceStatsEvent
); );

View File

@ -1,5 +1,5 @@
infisical: infisical:
address: "http://localhost:8080" address: "https://app.infisical.com/"
auth: auth:
type: "universal-auth" type: "universal-auth"
config: config:
@ -13,3 +13,12 @@ sinks:
templates: templates:
- source-path: my-dot-ev-secret-template - source-path: my-dot-ev-secret-template
destination-path: my-dot-env.env destination-path: my-dot-env.env
config:
polling-interval: 60s
execute:
command: docker-compose -f docker-compose.prod.yml down && docker-compose -f docker-compose.prod.yml up -d
- source-path: my-dot-ev-secret-template1
destination-path: my-dot-env-1.env
config:
exec:
command: mkdir hello-world1

View File

@ -490,5 +490,7 @@ func CallGetRawSecretsV3(httpClient *resty.Client, request GetRawSecretsV3Reques
return GetRawSecretsV3Response{}, fmt.Errorf("CallGetRawSecretsV3: Unsuccessful response [%v %v] [status-code=%v] [response=%v]", response.Request.Method, response.Request.URL, response.StatusCode(), response.String()) return GetRawSecretsV3Response{}, fmt.Errorf("CallGetRawSecretsV3: Unsuccessful response [%v %v] [status-code=%v] [response=%v]", response.Request.Method, response.Request.URL, response.StatusCode(), response.String())
} }
getRawSecretsV3Response.ETag = response.Header().Get(("etag"))
return getRawSecretsV3Response, nil return getRawSecretsV3Response, nil
} }

View File

@ -505,4 +505,5 @@ type GetRawSecretsV3Response struct {
SecretComment string `json:"secretComment"` SecretComment string `json:"secretComment"`
} `json:"secrets"` } `json:"secrets"`
Imports []any `json:"imports"` Imports []any `json:"imports"`
ETag string
} }

View File

@ -5,12 +5,15 @@ package cmd
import ( import (
"bytes" "bytes"
"context"
"encoding/base64" "encoding/base64"
"fmt" "fmt"
"io/ioutil" "io/ioutil"
"os" "os"
"os/exec"
"os/signal" "os/signal"
"path" "path"
"runtime"
"strings" "strings"
"sync" "sync"
"syscall" "syscall"
@ -71,12 +74,56 @@ type Template struct {
SourcePath string `yaml:"source-path"` SourcePath string `yaml:"source-path"`
Base64TemplateContent string `yaml:"base64-template-content"` Base64TemplateContent string `yaml:"base64-template-content"`
DestinationPath string `yaml:"destination-path"` DestinationPath string `yaml:"destination-path"`
Config struct { // Configurations for the template
PollingInterval string `yaml:"polling-interval"` // How often to poll for changes in the secret
Execute struct {
Command string `yaml:"command"` // Command to execute once the template has been rendered
Timeout int64 `yaml:"timeout"` // Timeout for the command
} `yaml:"execute"` // Command to execute once the template has been rendered
} `yaml:"config"`
} }
func ReadFile(filePath string) ([]byte, error) { func ReadFile(filePath string) ([]byte, error) {
return ioutil.ReadFile(filePath) return ioutil.ReadFile(filePath)
} }
func ExecuteCommandWithTimeout(command string, timeout int64) error {
shell := [2]string{"sh", "-c"}
if runtime.GOOS == "windows" {
shell = [2]string{"cmd", "/C"}
} else {
currentShell := os.Getenv("SHELL")
if currentShell != "" {
shell[0] = currentShell
}
}
ctx := context.Background()
if timeout > 0 {
var cancel context.CancelFunc
ctx, cancel = context.WithTimeout(context.Background(), time.Duration(timeout)*time.Second)
defer cancel()
}
cmd := exec.CommandContext(ctx, shell[0], shell[1], command)
cmd.Stdin = os.Stdin
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
if err := cmd.Run(); err != nil {
if exitError, ok := err.(*exec.ExitError); ok { // type assertion
if exitError.ProcessState.ExitCode() == -1 {
return fmt.Errorf("command timed out")
}
}
return err
} else {
return nil
}
}
func FileExists(filepath string) bool { func FileExists(filepath string) bool {
info, err := os.Stat(filepath) info, err := os.Stat(filepath)
if os.IsNotExist(err) { if os.IsNotExist(err) {
@ -170,20 +217,24 @@ func ParseAgentConfig(configFile []byte) (*Config, error) {
return config, nil return config, nil
} }
func secretTemplateFunction(accessToken string) func(string, string, string) ([]models.SingleEnvironmentVariable, error) { func secretTemplateFunction(accessToken string, existingEtag string, currentEtag *string) func(string, string, string) ([]models.SingleEnvironmentVariable, error) {
return func(projectID, envSlug, secretPath string) ([]models.SingleEnvironmentVariable, error) { return func(projectID, envSlug, secretPath string) ([]models.SingleEnvironmentVariable, error) {
secrets, err := util.GetPlainTextSecretsViaMachineIdentity(accessToken, projectID, envSlug, secretPath, false) res, err := util.GetPlainTextSecretsViaMachineIdentity(accessToken, projectID, envSlug, secretPath, false)
if err != nil { if err != nil {
return nil, err return nil, err
} }
return secrets, nil if existingEtag != res.Etag {
*currentEtag = res.Etag
}
return res.Secrets, nil
} }
} }
func ProcessTemplate(templatePath string, data interface{}, accessToken string) (*bytes.Buffer, error) { func ProcessTemplate(templatePath string, data interface{}, accessToken string, existingEtag string, currentEtag *string) (*bytes.Buffer, error) {
// custom template function to fetch secrets from Infisical // custom template function to fetch secrets from Infisical
secretFunction := secretTemplateFunction(accessToken) secretFunction := secretTemplateFunction(accessToken, existingEtag, currentEtag)
funcs := template.FuncMap{ funcs := template.FuncMap{
"secret": secretFunction, "secret": secretFunction,
} }
@ -203,7 +254,7 @@ func ProcessTemplate(templatePath string, data interface{}, accessToken string)
return &buf, nil return &buf, nil
} }
func ProcessBase64Template(encodedTemplate string, data interface{}, accessToken string) (*bytes.Buffer, error) { func ProcessBase64Template(encodedTemplate string, data interface{}, accessToken string, existingEtag string, currentEtag *string) (*bytes.Buffer, error) {
// custom template function to fetch secrets from Infisical // custom template function to fetch secrets from Infisical
decoded, err := base64.StdEncoding.DecodeString(encodedTemplate) decoded, err := base64.StdEncoding.DecodeString(encodedTemplate)
if err != nil { if err != nil {
@ -212,7 +263,7 @@ func ProcessBase64Template(encodedTemplate string, data interface{}, accessToken
templateString := string(decoded) templateString := string(decoded)
secretFunction := secretTemplateFunction(accessToken) secretFunction := secretTemplateFunction(accessToken, existingEtag, currentEtag) // TODO: Fix this
funcs := template.FuncMap{ funcs := template.FuncMap{
"secret": secretFunction, "secret": secretFunction,
} }
@ -250,7 +301,16 @@ type TokenManager struct {
} }
func NewTokenManager(fileDeposits []Sink, templates []Template, clientIdPath string, clientSecretPath string, newAccessTokenNotificationChan chan bool, removeClientSecretOnRead bool, exitAfterAuth bool) *TokenManager { func NewTokenManager(fileDeposits []Sink, templates []Template, clientIdPath string, clientSecretPath string, newAccessTokenNotificationChan chan bool, removeClientSecretOnRead bool, exitAfterAuth bool) *TokenManager {
return &TokenManager{filePaths: fileDeposits, templates: templates, clientIdPath: clientIdPath, clientSecretPath: clientSecretPath, newAccessTokenNotificationChan: newAccessTokenNotificationChan, removeClientSecretOnRead: removeClientSecretOnRead, exitAfterAuth: exitAfterAuth} return &TokenManager{
filePaths: fileDeposits,
templates: templates,
clientIdPath: clientIdPath,
clientSecretPath: clientSecretPath,
newAccessTokenNotificationChan: newAccessTokenNotificationChan,
removeClientSecretOnRead: removeClientSecretOnRead,
exitAfterAuth: exitAfterAuth,
}
} }
func (tm *TokenManager) SetToken(token string, accessTokenTTL time.Duration, accessTokenMaxTTL time.Duration) { func (tm *TokenManager) SetToken(token string, accessTokenTTL time.Duration, accessTokenMaxTTL time.Duration) {
@ -428,38 +488,80 @@ func (tm *TokenManager) WriteTokenToFiles() {
} }
} }
func (tm *TokenManager) FetchSecrets() { func (tm *TokenManager) WriteTemplateToFile(bytes *bytes.Buffer, template *Template) {
log.Info().Msgf("template engine started...") if err := WriteBytesToFile(bytes, template.DestinationPath); err != nil {
log.Error().Msgf("template engine: unable to write secrets to path because %s. Will try again on next cycle", err)
return
}
log.Info().Msgf("template engine: secret template at path %s has been rendered and saved to path %s", template.SourcePath, template.DestinationPath)
}
func (tm *TokenManager) MonitorSecretChanges(secretTemplate Template, sigChan chan os.Signal) {
pollingInterval := time.Duration(5 * time.Minute)
if secretTemplate.Config.PollingInterval != "" {
interval, err := util.ConvertPollingIntervalToTime(secretTemplate.Config.PollingInterval)
if err != nil {
log.Error().Msgf("unable to convert polling interval to time because %v", err)
sigChan <- syscall.SIGINT
return
} else {
pollingInterval = interval
}
}
var existingEtag string
var currentEtag string
var firstRun = true
execTimeout := secretTemplate.Config.Execute.Timeout
execCommand := secretTemplate.Config.Execute.Command
for { for {
token := tm.GetToken() token := tm.GetToken()
if token != "" { if token != "" {
for _, secretTemplate := range tm.templates {
var processedTemplate *bytes.Buffer
var err error
if secretTemplate.SourcePath != "" {
processedTemplate, err = ProcessTemplate(secretTemplate.SourcePath, nil, token)
} else {
processedTemplate, err = ProcessBase64Template(secretTemplate.Base64TemplateContent, nil, token)
}
if err != nil { var processedTemplate *bytes.Buffer
log.Error().Msgf("template engine: unable to render secrets because %s. Will try again on next cycle", err) var err error
continue if secretTemplate.SourcePath != "" {
} processedTemplate, err = ProcessTemplate(secretTemplate.SourcePath, nil, token, existingEtag, &currentEtag)
} else {
if err := WriteBytesToFile(processedTemplate, secretTemplate.DestinationPath); err != nil { processedTemplate, err = ProcessBase64Template(secretTemplate.Base64TemplateContent, nil, token, existingEtag, &currentEtag)
log.Error().Msgf("template engine: unable to write secrets to path because %s. Will try again on next cycle", err)
continue
}
log.Info().Msgf("template engine: secret template at path %s has been rendered and saved to path %s", secretTemplate.SourcePath, secretTemplate.DestinationPath)
} }
// fetch new secrets every 5 minutes (TODO: add PubSub in the future ) if err != nil {
time.Sleep(5 * time.Minute) log.Error().Msgf("unable to process template because %v", err)
} else {
if (existingEtag != currentEtag) || firstRun {
tm.WriteTemplateToFile(processedTemplate, &secretTemplate)
existingEtag = currentEtag
if !firstRun && execCommand != "" {
log.Info().Msgf("executing command: %s", execCommand)
err := ExecuteCommandWithTimeout(execCommand, execTimeout)
if err != nil {
log.Error().Msgf("unable to execute command because %v", err)
}
}
if firstRun {
firstRun = false
}
}
}
time.Sleep(pollingInterval)
} else {
// It fails to get the access token. So we will re-try in 3 seconds. We do this because if we don't, the user will have to wait for the next polling interval to get the first secret render.
time.Sleep(3 * time.Second)
} }
} }
} }
@ -544,7 +646,11 @@ var agentCmd = &cobra.Command{
tm := NewTokenManager(filePaths, agentConfig.Templates, configUniversalAuthType.ClientIDPath, configUniversalAuthType.ClientSecretPath, tokenRefreshNotifier, configUniversalAuthType.RemoveClientSecretOnRead, agentConfig.Infisical.ExitAfterAuth) tm := NewTokenManager(filePaths, agentConfig.Templates, configUniversalAuthType.ClientIDPath, configUniversalAuthType.ClientSecretPath, tokenRefreshNotifier, configUniversalAuthType.RemoveClientSecretOnRead, agentConfig.Infisical.ExitAfterAuth)
go tm.ManageTokenLifecycle() go tm.ManageTokenLifecycle()
go tm.FetchSecrets()
for i, template := range agentConfig.Templates {
log.Info().Msgf("template engine started for template %v...", i+1)
go tm.MonitorSecretChanges(template, sigChan)
}
for { for {
select { select {

View File

@ -87,16 +87,12 @@ var exportCmd = &cobra.Command{
var output string var output string
if shouldExpandSecrets { if shouldExpandSecrets {
substitutions := util.ExpandSecrets(secrets, infisicalToken, "") secrets = util.ExpandSecrets(secrets, infisicalToken, "")
output, err = formatEnvs(substitutions, format) }
if err != nil { secrets = util.FilterSecretsByTag(secrets, tagSlugs)
util.HandleError(err) output, err = formatEnvs(secrets, format)
} if err != nil {
} else { util.HandleError(err)
output, err = formatEnvs(secrets, format)
if err != nil {
util.HandleError(err)
}
} }
fmt.Print(output) fmt.Print(output)

View File

@ -34,6 +34,11 @@ type SingleEnvironmentVariable struct {
Comment string `json:"comment"` Comment string `json:"comment"`
} }
type PlaintextSecretResult struct {
Secrets []SingleEnvironmentVariable
Etag string
}
type SingleFolder struct { type SingleFolder struct {
ID string `json:"_id"` ID string `json:"_id"`
Name string `json:"name"` Name string `json:"name"`

View File

@ -0,0 +1,41 @@
package util
import (
"fmt"
"strconv"
"time"
)
// ConvertPollingIntervalToTime converts a string representation of a polling interval to a time.Duration
func ConvertPollingIntervalToTime(pollingInterval string) (time.Duration, error) {
length := len(pollingInterval)
if length < 2 {
return 0, fmt.Errorf("invalid format")
}
unit := pollingInterval[length-1:]
numberPart := pollingInterval[:length-1]
number, err := strconv.Atoi(numberPart)
if err != nil {
return 0, err
}
switch unit {
case "s":
if number < 60 {
return 0, fmt.Errorf("polling interval should be at least 60 seconds")
}
return time.Duration(number) * time.Second, nil
case "m":
return time.Duration(number) * time.Minute, nil
case "h":
return time.Duration(number) * time.Hour, nil
case "d":
return time.Duration(number) * 24 * time.Hour, nil
case "w":
return time.Duration(number) * 7 * 24 * time.Hour, nil
default:
return 0, fmt.Errorf("invalid time unit")
}
}

View File

@ -152,7 +152,7 @@ func GetPlainTextSecretsViaJTW(JTWToken string, receiversPrivateKey string, work
return plainTextSecrets, nil return plainTextSecrets, nil
} }
func GetPlainTextSecretsViaMachineIdentity(accessToken string, workspaceId string, environmentName string, secretsPath string, includeImports bool) ([]models.SingleEnvironmentVariable, error) { func GetPlainTextSecretsViaMachineIdentity(accessToken string, workspaceId string, environmentName string, secretsPath string, includeImports bool) (models.PlaintextSecretResult, error) {
httpClient := resty.New() httpClient := resty.New()
httpClient.SetAuthToken(accessToken). httpClient.SetAuthToken(accessToken).
SetHeader("Accept", "application/json") SetHeader("Accept", "application/json")
@ -170,12 +170,12 @@ func GetPlainTextSecretsViaMachineIdentity(accessToken string, workspaceId strin
rawSecrets, err := api.CallGetRawSecretsV3(httpClient, api.GetRawSecretsV3Request{WorkspaceId: workspaceId, SecretPath: secretsPath, Environment: environmentName}) rawSecrets, err := api.CallGetRawSecretsV3(httpClient, api.GetRawSecretsV3Request{WorkspaceId: workspaceId, SecretPath: secretsPath, Environment: environmentName})
if err != nil { if err != nil {
return nil, err return models.PlaintextSecretResult{}, err
} }
plainTextSecrets := []models.SingleEnvironmentVariable{} plainTextSecrets := []models.SingleEnvironmentVariable{}
if err != nil { if err != nil {
return nil, fmt.Errorf("unable to decrypt your secrets [err=%v]", err) return models.PlaintextSecretResult{}, fmt.Errorf("unable to decrypt your secrets [err=%v]", err)
} }
for _, secret := range rawSecrets.Secrets { for _, secret := range rawSecrets.Secrets {
@ -189,7 +189,10 @@ func GetPlainTextSecretsViaMachineIdentity(accessToken string, workspaceId strin
// } // }
// } // }
return plainTextSecrets, nil return models.PlaintextSecretResult{
Secrets: plainTextSecrets,
Etag: rawSecrets.ETag,
}, nil
} }
func InjectImportedSecret(plainTextWorkspaceKey []byte, secrets []models.SingleEnvironmentVariable, importedSecrets []api.ImportedSecretV3) ([]models.SingleEnvironmentVariable, error) { func InjectImportedSecret(plainTextWorkspaceKey []byte, secrets []models.SingleEnvironmentVariable, importedSecrets []api.ImportedSecretV3) ([]models.SingleEnvironmentVariable, error) {
@ -220,6 +223,30 @@ func InjectImportedSecret(plainTextWorkspaceKey []byte, secrets []models.SingleE
return secrets, nil return secrets, nil
} }
func FilterSecretsByTag(plainTextSecrets []models.SingleEnvironmentVariable, tagSlugs string) []models.SingleEnvironmentVariable {
if tagSlugs == "" {
return plainTextSecrets
}
tagSlugsMap := make(map[string]bool)
tagSlugsList := strings.Split(tagSlugs, ",")
for _, slug := range tagSlugsList {
tagSlugsMap[slug] = true
}
filteredSecrets := []models.SingleEnvironmentVariable{}
for _, secret := range plainTextSecrets {
for _, tag := range secret.Tags {
if tagSlugsMap[tag.Slug] {
filteredSecrets = append(filteredSecrets, secret)
break
}
}
}
return filteredSecrets
}
func GetAllEnvironmentVariables(params models.GetAllSecretsParameters, projectConfigFilePath string) ([]models.SingleEnvironmentVariable, error) { func GetAllEnvironmentVariables(params models.GetAllSecretsParameters, projectConfigFilePath string) ([]models.SingleEnvironmentVariable, error) {
var infisicalToken string var infisicalToken string
if params.InfisicalToken == "" { if params.InfisicalToken == "" {
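
Returning the ETag alongside the decrypted secrets presumably lets a polling caller detect whether anything changed between fetches without comparing every value. Below is a minimal, hypothetical sketch of that pattern; the `plaintextSecretResult` stand-in and `fetchSecrets` helper are simplified placeholders, not the CLI's actual types or agent code.

```go
package main

import (
	"fmt"
	"time"
)

// Simplified stand-in for models.PlaintextSecretResult: the secrets plus the
// ETag ("Hash") returned by the raw-secrets endpoint.
type plaintextSecretResult struct {
	Secrets []string // the real type is []models.SingleEnvironmentVariable
	Hash    string   // ETag from the secrets response
}

// fetchSecrets is a hypothetical fetcher standing in for
// GetPlainTextSecretsViaMachineIdentity.
func fetchSecrets() plaintextSecretResult {
	return plaintextSecretResult{Secrets: []string{"DB_URL=..."}, Hash: `W/"abc123"`}
}

// A minimal polling loop: only act when the ETag differs from the last one.
func main() {
	var lastHash string
	ticker := time.NewTicker(60 * time.Second)
	defer ticker.Stop()

	for range ticker.C {
		res := fetchSecrets()
		if res.Hash == lastHash {
			continue // secrets unchanged; nothing to re-render
		}
		lastHash = res.Hash
		fmt.Printf("secrets changed (%d entries); re-rendering templates\n", len(res.Secrets))
	}
}
```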

View File

@@ -86,6 +86,7 @@ services:
     environment:
       - NODE_ENV=development
       - DB_CONNECTION_URI=postgres://infisical:infisical@db/infisical?sslmode=disable
+      - TELEMETRY_ENABLED=false
     volumes:
       - ./backend/src:/app/src

View File

@@ -52,7 +52,7 @@ services:
     restart: always
     env_file: .env
     volumes:
-      - pg_data:/data/db
+      - pg_data:/var/lib/postgresql/data
     networks:
       - infisical
     healthcheck:

View File

@@ -16,49 +16,7 @@ git checkout -b MY_BRANCH_NAME
 
 ## Set up environment variables
 
-Start by creating a .env file at the root of the Infisical directory then copy the contents of the file below into the .env file.
-
-<Accordion title=".env file content">
-```env
-# Keys
-# Required key for platform encryption/decryption ops
-ENCRYPTION_KEY=6c1fe4e407b8911c104518103505b218
-
-# JWT
-# Required secrets to sign JWT tokens
-JWT_SIGNUP_SECRET=3679e04ca949f914c03332aaaeba805a
-JWT_REFRESH_SECRET=5f2f3c8f0159068dc2bbb3a652a716ff
-JWT_AUTH_SECRET=4be6ba5602e0fa0ac6ac05c3cd4d247f
-JWT_SERVICE_SECRET=f32f716d70a42c5703f4656015e76200
-
-# MongoDB
-# Backend will connect to the MongoDB instance at connection string MONGO_URL which can either be a ref
-# to the MongoDB container instance or Mongo Cloud
-# Required
-MONGO_URL=mongodb://root:example@mongo:27017/?authSource=admin
-
-# Optional credentials for MongoDB container instance and Mongo-Express
-MONGO_USERNAME=root
-MONGO_PASSWORD=example
-
-# Website URL
-# Required
-SITE_URL=http://localhost:8080
-
-# Mail/SMTP
-SMTP_HOST='smtp-server'
-SMTP_PORT='1025'
-SMTP_NAME='local'
-SMTP_USERNAME='team@infisical.com'
-SMTP_PASSWORD=
-```
-</Accordion>
-
-<Warning>
-The pre-populated environment variable values above are meant to be used in development only. They should never be used in production.
-</Warning>
-
-View all available [environment variables](https://infisical.com/docs/self-hosting/configuration/envars) and guidance for each.
+Start by creating a .env file at the root of the Infisical directory then copy the contents of the file linked [here](https://github.com/Infisical/infisical/blob/main/.env.example). View all available [environment variables](https://infisical.com/docs/self-hosting/configuration/envars) and guidance for each.
 
 ## Starting Infisical for development
 
@@ -72,10 +30,7 @@ docker-compose -f docker-compose.dev.yml up --build --force-recreate
 ```
 
 #### Access local server
 
-Once all the services have spun up, browse to http://localhost:8080. To sign in, you may use the default credentials listed below.
-
-Email: `test@localhost.local`
-Password: `testInfisical1`
+Once all the services have spun up, browse to http://localhost:8080.
 
 #### Shutdown local server

View File

@@ -2,7 +2,7 @@
 title: "Introduction"
 ---
 
-Infisical is an [open-source](https://opensource.com/resources/what-open-source), [end-to-end encrypted](https://en.wikipedia.org/wiki/End-to-end_encryption) secret management platform for storing, managing, and syncing
+Infisical is an [open-source](https://opensource.com/resources/what-open-source), [end-to-end encrypted](https://en.wikipedia.org/wiki/End-to-end_encryption) secrets management platform for storing, managing, and syncing
 application configuration and secrets like API keys, database credentials, and environment variables across applications and infrastructure.
 
 Start syncing environment variables with [Infisical Cloud](https://app.infisical.com) or learn how to [host Infisical](/self-hosting/overview) yourself.

View File

@ -0,0 +1,36 @@
---
title: "LDAP"
description: "Log in to Infisical with LDAP"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact team@infisical.com to purchase an enterprise license to use it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol); this includes support for Active Directory.
<Steps>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
Next, input your LDAP server settings.
![LDAP configuration](/images/platform/ldap/ldap-config.png)
Here's some guidance for each field:
- URL: The LDAP server to connect to, such as `ldap://ldap.your-org.com` or `ldaps://ldap.myorg.com:636` (for connection over SSL/TLS).
- Bind DN: The distinguished name of the object to bind when performing the user search, such as `cn=infisical,ou=Users,dc=acme,dc=com`.
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: The base DN under which to perform the user search, such as `ou=Users,dc=example,dc=com`.
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate.
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>
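
For readers unfamiliar with the fields above, the sketch below shows roughly how they map onto a standard LDAP bind-and-search flow. It is an illustrative example using the go-ldap library, not Infisical's implementation; the hostnames, DNs, password, and the `(uid=jane)` filter are placeholder values.

```go
package main

import (
	"fmt"
	"log"

	"github.com/go-ldap/ldap/v3"
)

func main() {
	// "URL": the LDAP server to connect to.
	conn, err := ldap.DialURL("ldaps://ldap.myorg.com:636")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// "Bind DN" + "Bind Pass": the account used to perform the user search.
	if err := conn.Bind("cn=infisical,ou=Users,dc=acme,dc=com", "bind-password"); err != nil {
		log.Fatal(err)
	}

	// "Search Base / User DN": where to look for the user who is logging in.
	req := ldap.NewSearchRequest(
		"ou=Users,dc=acme,dc=com",
		ldap.ScopeWholeSubtree, ldap.NeverDerefAliases, 0, 0, false,
		"(uid=jane)", // hypothetical username filter
		[]string{"dn"},
		nil,
	)
	res, err := conn.Search(req)
	if err != nil {
		log.Fatal(err)
	}
	for _, entry := range res.Entries {
		fmt.Println("found:", entry.DN)
	}
}
```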

View File

@ -0,0 +1,36 @@
---
title: "General LDAP"
description: "Log in to Infisical with LDAP"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact team@infisical.com to purchase an enterprise license to use it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol).
<Steps>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
Next, input your LDAP server settings.
![LDAP configuration](/images/platform/ldap/ldap-config.png)
Here's some guidance for each field:
- URL: The LDAP server to connect to, such as `ldap://ldap.your-org.com` or `ldaps://ldap.myorg.com:636` (for connection over SSL/TLS).
- Bind DN: The distinguished name of the object to bind when performing the user search, such as `cn=infisical,ou=Users,dc=acme,dc=com`.
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: The base DN under which to perform the user search, such as `ou=Users,dc=example,dc=com`.
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate.
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>

Some files were not shown because too many files have changed in this diff.