Compare commits

81 Commits

Author SHA1 Message Date
9476594978 Merge pull request #1494 from akhilmhdh/fix/migration-admin-bug
fix(pg-migrator): added uuid 0000 for admin config
2024-02-29 10:57:26 -05:00
02be9ebd5e Merge pull request #1492 from akhilmhdh/fix/create-tag 2024-02-29 09:03:18 -05:00
eb29d1dc28 fix(pg-migrator): added uuid 0000 for admin config 2024-02-29 15:38:45 +05:30
114a4b1412 fix(server): resolved broken create tag scoped to project 2024-02-29 13:02:09 +05:30
cde8cef8b0 Merge pull request #1490 from 24601/patch-1
fix(helm-charts): standalone chart rbac fix for jobs
2024-02-28 22:41:27 -05:00
7207997cea update chart version 2024-02-28 22:40:15 -05:00
aaabfb7870 fix(helm-charts): standalone chart rbac fix for jobs 2024-02-28 19:31:16 -07:00
40cb5c4394 Merge pull request #1326 from quinton11/feat/cli-export-with-tag-slugs
feat: cli export allow filtering with tags
2024-02-28 18:32:22 -05:00
60b73879df Update postgres.mdx 2024-02-28 14:40:04 -08:00
4339ef4737 Merge pull request #1485 from Infisical/snyk-upgrade-14579311c8ea1dfb5d579851318fccc5
[Snyk] Upgrade posthog-js from 1.104.4 to 1.105.4
2024-02-28 16:59:28 -05:00
d98669700d Merge pull request #1487 from nhedger/docs/docker
docs: improve docker page
2024-02-28 16:59:18 -05:00
162f339149 Merge pull request #1489 from Infisical/snyk-upgrade-978758f53696a9f0d6b71883b9614b0a
[Snyk] Upgrade aws-sdk from 2.1549.0 to 2.1553.0
2024-02-28 16:48:35 -05:00
d3eb0c4cc9 Merge pull request #1488 from Kiskadee-dev/patch-1
fix postgresql volume path on docker-compose.prod.yml
2024-02-28 16:35:15 -05:00
4b4295f53d fix: upgrade aws-sdk from 2.1549.0 to 2.1553.0
Snyk has created this PR to upgrade aws-sdk from 2.1549.0 to 2.1553.0.

See this package in npm:
https://www.npmjs.com/package/aws-sdk

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-28 21:11:42 +00:00
6c4d193b12 Update docker-compose.prod.yml
/data/db doesn't seem to exist, so data would never persist otherwise
2024-02-28 16:27:16 -03:00
d08d412f54 improve more 2024-02-28 20:21:43 +01:00
bb4810470f docs: rewording 2024-02-28 20:20:39 +01:00
24e9c0a39f Merge pull request #1486 from nhedger/docs/secrets
docs: fix typo
2024-02-28 14:07:00 -05:00
3161d0ee67 docs: fix typo 2024-02-28 20:01:57 +01:00
8a7e18dc7c fix: upgrade posthog-js from 1.104.4 to 1.105.4
Snyk has created this PR to upgrade posthog-js from 1.104.4 to 1.105.4.

See this package in npm:
https://www.npmjs.com/package/posthog-js

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/53d4ecb6-6cc1-4918-aa73-bf9cae4ffd13?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-28 18:49:06 +00:00
0497c3b49e Merge pull request #1484 from akhilmhdh/feat/i18n-removal
feat(ui): secret blind index banner on the secret main page and removed i18n to keep only English for now
2024-02-28 11:09:35 -05:00
e6a89fb9d0 feat(ui): secret blind index banner on the secret main page and removed i18n translations for now, keeping en as the only option 2024-02-28 14:47:50 +05:30
d9828db2ec update gamma helm values 2024-02-27 18:51:36 -05:00
f11efc9e3f Merge pull request #1461 from Infisical/snyk-upgrade-430437d73f24d5cfbeaa8f5f8f1fa7dc
[Snyk] Upgrade posthog-js from 1.103.0 to 1.104.4
2024-02-27 18:43:09 -05:00
32bad10c0e Merge branch 'main' into snyk-upgrade-430437d73f24d5cfbeaa8f5f8f1fa7dc 2024-02-27 18:43:03 -05:00
41064920f7 Merge pull request #1465 from Infisical/snyk-upgrade-29c4ba3e253755510159e658916d2c3f
[Snyk] Upgrade @fastify/cookie from 9.2.0 to 9.3.1
2024-02-27 18:41:33 -05:00
8d8e23add2 Merge pull request #1471 from akhilmhdh/feat/telemetry-aggregation
Telemetry stats event for self-hosted instance at midnight
2024-02-27 18:36:29 -05:00
a2a959cc32 disable telemetry for local dev by default 2024-02-27 18:26:15 -05:00
d6cde48181 set posthog flush to zero and fix typos 2024-02-27 18:23:24 -05:00
23966c12e2 Merge pull request #1482 from Infisical/daniel/fix-invite-all-members
Fix: Invite all members to project when there are no members to invite
2024-02-27 17:38:52 -05:00
2a233ea43c Fix: Inviting all members when there's only 1 user in the organization 2024-02-27 23:15:40 +01:00
fe497d87c0 add INFISICAL_CLOUD env back from old backend 2024-02-27 16:38:18 -05:00
0c3060e1c6 Merge pull request #1477 from Infisical/daniel/upgrade-transparency
Chore: Project upgrade notice
2024-02-27 13:47:36 -05:00
5d64398e58 add more clarity to e2ee notice 2024-02-27 13:45:31 -05:00
2f6f713c98 Better phrasing 2024-02-27 19:19:17 +01:00
4f47d43801 Merge pull request #1479 from Infisical/snyk-fix-46ae40ea09c96f3a158e662824e76ed8
[Snyk] Security upgrade bullmq from 5.1.6 to 5.3.3
2024-02-27 12:17:35 -05:00
6cf9a83c16 fix: backend/package.json & backend/package-lock.json to reduce vulnerabilities
The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-JS-INFLIGHT-6095116
2024-02-27 08:51:09 +00:00
c3adc8b188 Update overview.mdx 2024-02-26 22:31:07 -08:00
a723c456aa Update Chart.yaml 2024-02-27 01:26:47 -05:00
c455ef7ced Update values.yaml 2024-02-27 01:26:28 -05:00
f9d0680dc3 Update Chart.yaml 2024-02-27 01:24:06 -05:00
7a4e8b8c32 Update values.yaml 2024-02-27 01:23:35 -05:00
8e83b0f2dd npm install backend 2024-02-27 01:13:00 -05:00
59c6837071 Update faq.mdx 2024-02-27 00:48:32 -05:00
d4d23e06a8 Merge pull request #1478 from Infisical/mongo-to-postgres-guide
Mongo to postgres guide
2024-02-27 00:43:58 -05:00
9d202e8501 add additional discussion 2024-02-27 00:43:36 -05:00
1f9f15136e mongo to postgres guide 2024-02-27 00:35:41 -05:00
5d71b02f8d Fix: Add learn more to both alerts 2024-02-27 06:01:22 +01:00
9d2a0f1d54 Chore: Add notice link 2024-02-27 05:55:11 +01:00
0f4da61aaa Docs: Upgrade notice 2024-02-27 05:54:56 +01:00
26abb7d89f Merge pull request #1476 from Infisical/ldap-docs
Update docs for LDAP
2024-02-26 20:48:21 -08:00
892a25edfe Update docs for LDAP 2024-02-26 20:47:20 -08:00
082a533cfa Update Chart.yaml 2024-02-26 17:19:48 -05:00
d71a8a35e5 increase resource limits more 2024-02-26 17:19:38 -05:00
59585dfea9 Merge pull request #1474 from Infisical/daniel/failed-decryption-log
Fix: Add detailed decryption error logging
2024-02-26 16:49:52 -05:00
514304eed0 Fix: Add detailed decryption error logging 2024-02-26 22:19:54 +01:00
a0fc9e534c Update Chart.yaml 2024-02-26 16:10:02 -05:00
73323c0343 update resource limits 2024-02-26 16:09:21 -05:00
98cd71d421 Merge pull request #1473 from Infisical/ldap-docs
Add docs for LDAP
2024-02-26 10:51:48 -08:00
ae6157dd78 Add docs for LDAP 2024-02-26 10:49:30 -08:00
4bf7e8bbd1 add ingress back to helm 2024-02-26 13:01:57 -05:00
6891d309da Merge pull request #1467 from Trugamr/fix/1422-verify-email-loading
fix(signup): set send verification email button loading state
2024-02-26 19:56:51 +05:30
3b9ceff21c refactor(server): updated all telemetry send events to be awaited now that they are async 2024-02-26 19:52:38 +05:30
d64d935d7d feat(server): added telemetry queue for self-hosted instances to upload instance stats to posthog at midnight 2024-02-26 19:52:38 +05:30
8aaed739d5 feat(server): resolved a possible race condition on replication on fresh first boot up and fixed the generate-schema script to make values optional on create rows 2024-02-26 19:52:38 +05:30
7d8b399102 feat(server): added keystore and made server cfg fetch from the keystore to avoid db calls 2024-02-26 19:52:38 +05:30
1cccbca0c5 Merge pull request #1466 from Trugamr/fix/contributing-guide-link
Fix broken contributing guide link
2024-02-26 08:54:53 -05:00
2c2e1f5d2e Merge pull request #1470 from Infisical/scroll-rotation-fix
fix scrolling issue in rotation modal
2024-02-26 13:43:48 +05:30
6946f3901c fix scrolling issue in rotation modal 2024-02-26 00:03:41 -08:00
82a7010e29 Update envars.mdx 2024-02-25 14:47:04 -05:00
a1e763fa28 Update kubernetes-helm.mdx 2024-02-25 14:42:10 -05:00
0992117173 add pull policy to docker compose 2024-02-25 14:12:19 -05:00
9419884a26 Merge pull request #1468 from radhakrisri/main
Wait for db service to be healthy before kicking off db-migration and backend services
2024-02-25 14:10:44 -05:00
850f3a347c Wait for db service to be healthy before kicking off db-migration and backend services 2024-02-25 12:33:42 -06:00
4c9101d18d fix(signup): set send verification email button loading state
Set loading state for button based on send verification email mutation state

fix #1422
2024-02-25 14:58:08 +05:30
06e8e90ad5 Fix broken contributing guide link 2024-02-25 13:52:40 +05:30
1594165768 fix: upgrade @fastify/cookie from 9.2.0 to 9.3.1
Snyk has created this PR to upgrade @fastify/cookie from 9.2.0 to 9.3.1.

See this package in npm:
https://www.npmjs.com/package/@fastify/cookie

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/35057e82-ed7d-4e19-ba4d-719a42135cd6?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-25 03:14:18 +00:00
29d91d83ab fix: upgrade posthog-js from 1.103.0 to 1.104.4
Snyk has created this PR to upgrade posthog-js from 1.103.0 to 1.104.4.

See this package in npm:
https://www.npmjs.com/package/posthog-js

See this project in Snyk:
https://app.snyk.io/org/maidul98/project/53d4ecb6-6cc1-4918-aa73-bf9cae4ffd13?utm_source=github&utm_medium=referral&page=upgrade-pr
2024-02-24 04:51:43 +00:00
fdd79c0568 Update kubernetes-helm.mdx 2024-02-23 21:41:48 -05:00
4ef8abdb00 Merge pull request #1460 from Infisical/postgres-helm
Add helm chart with postgres + many other docs changes
2024-02-23 21:25:12 -05:00
4057e2c6ab feat: cli export allow filtering with tags 2024-01-24 19:05:16 +00:00
126 changed files with 1151 additions and 363 deletions

View File

@ -19,10 +19,6 @@ POSTGRES_DB=infisical
# Redis
REDIS_URL=redis://redis:6379
# Optional credentials for MongoDB container instance and Mongo-Express
MONGO_USERNAME=root
MONGO_PASSWORD=example
# Website URL
# Required
SITE_URL=http://localhost:8080

13
.github/values.yaml vendored
View File

@ -13,11 +13,10 @@ fullnameOverride: ""
##
infisical:
## @param backend.enabled Enable backend
##
autoDatabaseSchemaMigration: false
enabled: false
## @param backend.name Backend name
##
name: infisical
replicaCount: 3
image:
@ -50,3 +49,9 @@ ingress:
- secretName: letsencrypt-prod
hosts:
- gamma.infisical.com
postgresql:
enabled: false
redis:
enabled: false

View File

@ -2,6 +2,6 @@
Thanks for taking the time to contribute! 😃 🚀
Please refer to our [Contributing Guide](https://infisical.com/docs/contributing/overview) for instructions on how to contribute.
Please refer to our [Contributing Guide](https://infisical.com/docs/contributing/getting-started/overview) for instructions on how to contribute.
We also have some 🔥amazing🔥 merch for our contributors. Please reach out to tony@infisical.com for more info 👀

View File

@ -0,0 +1,30 @@
import { TKeyStoreFactory } from "@app/keystore/keystore";
export const mockKeyStore = (): TKeyStoreFactory => {
const store: Record<string, string | number | Buffer> = {};
return {
setItem: async (key, value) => {
store[key] = value;
return "OK";
},
setItemWithExpiry: async (key, value) => {
store[key] = value;
return "OK";
},
deleteItem: async (key) => {
delete store[key];
return 1;
},
getItem: async (key) => {
const value = store[key];
if (typeof value === "string") {
return value;
}
return null;
},
incrementBy: async () => {
return 1;
}
};
};

View File

@ -14,6 +14,7 @@ import { AuthTokenType } from "@app/services/auth/auth-type";
import { mockQueue } from "./mocks/queue";
import { mockSmtpServer } from "./mocks/smtp";
import { mockKeyStore } from "./mocks/keystore";
dotenv.config({ path: path.join(__dirname, "../../.env.test"), debug: true });
export default {
@ -41,7 +42,8 @@ export default {
await db.seed.run();
const smtp = mockSmtpServer();
const queue = mockQueue();
const server = await main({ db, smtp, logger, queue });
const keyStore = mockKeyStore();
const server = await main({ db, smtp, logger, queue, keyStore });
// @ts-expect-error type
globalThis.testServer = server;
// @ts-expect-error type

View File

@ -11,7 +11,7 @@
"dependencies": {
"@aws-sdk/client-secrets-manager": "^3.504.0",
"@casl/ability": "^6.5.0",
"@fastify/cookie": "^9.2.0",
"@fastify/cookie": "^9.3.1",
"@fastify/cors": "^8.5.0",
"@fastify/etag": "^5.1.0",
"@fastify/formbody": "^7.4.0",
@ -29,11 +29,11 @@
"@ucast/mongo2js": "^1.3.4",
"ajv": "^8.12.0",
"argon2": "^0.31.2",
"aws-sdk": "^2.1549.0",
"aws-sdk": "^2.1553.0",
"axios": "^1.6.7",
"axios-retry": "^4.0.0",
"bcrypt": "^5.1.1",
"bullmq": "^5.1.6",
"bullmq": "^5.3.3",
"dotenv": "^16.4.1",
"fastify": "^4.26.0",
"fastify-plugin": "^4.5.1",
@ -1687,9 +1687,9 @@
}
},
"node_modules/@fastify/cookie": {
"version": "9.2.0",
"resolved": "https://registry.npmjs.org/@fastify/cookie/-/cookie-9.2.0.tgz",
"integrity": "sha512-fkg1yjjQRHPFAxSHeLC8CqYuNzvR6Lwlj/KjrzQcGjNBK+K82nW+UfCjfN71g1GkoVoc1GTOgIWkFJpcMfMkHQ==",
"version": "9.3.1",
"resolved": "https://registry.npmjs.org/@fastify/cookie/-/cookie-9.3.1.tgz",
"integrity": "sha512-h1NAEhB266+ZbZ0e9qUE6NnNR07i7DnNXWG9VbbZ8uC6O/hxHpl+Zoe5sw1yfdZ2U6XhToUGDnzQtWJdCaPwfg==",
"dependencies": {
"cookie-signature": "^1.1.0",
"fastify-plugin": "^4.0.0"
@ -2193,7 +2193,6 @@
"version": "2.1.5",
"resolved": "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz",
"integrity": "sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==",
"dev": true,
"dependencies": {
"@nodelib/fs.stat": "2.0.5",
"run-parallel": "^1.1.9"
@ -2206,7 +2205,6 @@
"version": "2.0.5",
"resolved": "https://registry.npmjs.org/@nodelib/fs.stat/-/fs.stat-2.0.5.tgz",
"integrity": "sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==",
"dev": true,
"engines": {
"node": ">= 8"
}
@ -2215,7 +2213,6 @@
"version": "1.2.8",
"resolved": "https://registry.npmjs.org/@nodelib/fs.walk/-/fs.walk-1.2.8.tgz",
"integrity": "sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==",
"dev": true,
"dependencies": {
"@nodelib/fs.scandir": "2.1.5",
"fastq": "^1.6.0"
@ -5189,9 +5186,9 @@
"integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
},
"node_modules/aws-sdk": {
"version": "2.1549.0",
"resolved": "https://registry.npmjs.org/aws-sdk/-/aws-sdk-2.1549.0.tgz",
"integrity": "sha512-SoVfrrV3A2mxH+NV2tA0eMtG301glhewvhL3Ob4107qLWjvwjy/CoWLclMLmfXniTGxbI8tsgN0r5mLZUKey3Q==",
"version": "2.1553.0",
"resolved": "https://registry.npmjs.org/aws-sdk/-/aws-sdk-2.1553.0.tgz",
"integrity": "sha512-CfZaw8dR9e642aBOeFhkFL7KoQApeLR15uH2IQqfL/12snWYayAAesYh0tEaU+XbhrH0CUsf2Zro5IraEXEZMg==",
"dependencies": {
"buffer": "4.9.2",
"events": "1.1.1",
@ -5442,7 +5439,6 @@
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
"integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
"dev": true,
"dependencies": {
"fill-range": "^7.0.1"
},
@ -5492,14 +5488,15 @@
}
},
"node_modules/bullmq": {
"version": "5.1.6",
"resolved": "https://registry.npmjs.org/bullmq/-/bullmq-5.1.6.tgz",
"integrity": "sha512-VkLfig+xm4U3hc4QChzuuAy0NGQ9dfPB8o54hmcZHCX9ofp0Zn6bEY+W3Ytkk76eYwPAgXfywDBlAb2Unjl1Rg==",
"version": "5.3.3",
"resolved": "https://registry.npmjs.org/bullmq/-/bullmq-5.3.3.tgz",
"integrity": "sha512-Gc/68HxiCHLMPBiGIqtINxcf8HER/5wvBYMY/6x3tFejlvldUBFaAErMTLDv4TnPsTyzNPrfBKmFCEM58uVnJg==",
"dependencies": {
"cron-parser": "^4.6.0",
"glob": "^8.0.3",
"fast-glob": "^3.3.2",
"ioredis": "^5.3.2",
"lodash": "^4.17.21",
"minimatch": "^9.0.3",
"msgpackr": "^1.10.1",
"node-abort-controller": "^3.1.1",
"semver": "^7.5.4",
@ -5507,6 +5504,28 @@
"uuid": "^9.0.0"
}
},
"node_modules/bullmq/node_modules/brace-expansion": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.1.tgz",
"integrity": "sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==",
"dependencies": {
"balanced-match": "^1.0.0"
}
},
"node_modules/bullmq/node_modules/minimatch": {
"version": "9.0.3",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.3.tgz",
"integrity": "sha512-RHiac9mvaRw0x3AYRgDC1CxAP7HTcNrrECeA8YYJeWnpo+2Q5CegtZjaotWTWxDG3UeGA1coE05iH1mPjT/2mg==",
"dependencies": {
"brace-expansion": "^2.0.1"
},
"engines": {
"node": ">=16 || 14 >=14.17"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/bundle-require": {
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/bundle-require/-/bundle-require-4.0.2.tgz",
@ -6906,7 +6925,6 @@
"version": "3.3.2",
"resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.3.2.tgz",
"integrity": "sha512-oX2ruAFQwf/Orj8m737Y5adxDQO0LAB7/S5MnxCdTNDd4p6BsyIVsv9JQsATbTSq8KHRpLwIHbVlUNatxd+1Ow==",
"dev": true,
"dependencies": {
"@nodelib/fs.stat": "^2.0.2",
"@nodelib/fs.walk": "^1.2.3",
@ -7058,7 +7076,6 @@
"version": "7.0.1",
"resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
"integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
"dev": true,
"dependencies": {
"to-regex-range": "^5.0.1"
},
@ -7510,7 +7527,6 @@
"version": "5.1.2",
"resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz",
"integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==",
"dev": true,
"dependencies": {
"is-glob": "^4.0.1"
},
@ -8111,7 +8127,6 @@
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
"integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==",
"dev": true,
"engines": {
"node": ">=0.10.0"
}
@ -8142,7 +8157,6 @@
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz",
"integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==",
"dev": true,
"dependencies": {
"is-extglob": "^2.1.1"
},
@ -8177,7 +8191,6 @@
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz",
"integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==",
"dev": true,
"engines": {
"node": ">=0.12.0"
}
@ -8934,7 +8947,6 @@
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz",
"integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==",
"dev": true,
"engines": {
"node": ">= 8"
}
@ -8951,7 +8963,6 @@
"version": "4.0.5",
"resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.5.tgz",
"integrity": "sha512-DMy+ERcEW2q8Z2Po+WNXuw3c5YaUSFjAO5GsJqfEl7UjvtIuFKO6ZrKvcItdy98dwFI2N1tg3zNIdKaQT+aNdA==",
"dev": true,
"dependencies": {
"braces": "^3.0.2",
"picomatch": "^2.3.1"
@ -8964,7 +8975,6 @@
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
"integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
"dev": true,
"engines": {
"node": ">=8.6"
},
@ -10557,7 +10567,6 @@
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/queue-microtask/-/queue-microtask-1.2.3.tgz",
"integrity": "sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A==",
"dev": true,
"funding": [
{
"type": "github",
@ -10904,7 +10913,6 @@
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/run-parallel/-/run-parallel-1.2.0.tgz",
"integrity": "sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA==",
"dev": true,
"funding": [
{
"type": "github",
@ -11705,7 +11713,6 @@
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
"integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==",
"dev": true,
"dependencies": {
"is-number": "^7.0.0"
},

View File

@ -72,7 +72,7 @@
"dependencies": {
"@aws-sdk/client-secrets-manager": "^3.504.0",
"@casl/ability": "^6.5.0",
"@fastify/cookie": "^9.2.0",
"@fastify/cookie": "^9.3.1",
"@fastify/cors": "^8.5.0",
"@fastify/etag": "^5.1.0",
"@fastify/formbody": "^7.4.0",
@ -90,11 +90,11 @@
"@ucast/mongo2js": "^1.3.4",
"ajv": "^8.12.0",
"argon2": "^0.31.2",
"aws-sdk": "^2.1549.0",
"aws-sdk": "^2.1553.0",
"axios": "^1.6.7",
"axios-retry": "^4.0.0",
"bcrypt": "^5.1.1",
"bullmq": "^5.1.6",
"bullmq": "^5.3.3",
"dotenv": "^16.4.1",
"fastify": "^4.26.0",
"fastify-plugin": "^4.5.1",

View File

@ -44,7 +44,7 @@ const getZodDefaultValue = (type: unknown, value: string | number | boolean | Ob
if (!value || value === "null") return;
switch (type) {
case "uuid":
return;
return `.default("00000000-0000-0000-0000-000000000000")`;
case "character varying": {
if (value === "gen_random_uuid()") return;
if (typeof value === "string" && value.includes("::")) {
@ -100,7 +100,8 @@ const main = async () => {
const columnName = columnNames[colNum];
const colInfo = columns[columnName];
let ztype = getZodPrimitiveType(colInfo.type);
if (colInfo.defaultValue) {
// don't put optional on id
if (colInfo.defaultValue && columnName !== "id") {
const { defaultValue } = colInfo;
const zSchema = getZodDefaultValue(colInfo.type, defaultValue);
if (zSchema) {
@ -120,6 +121,7 @@ const main = async () => {
.split("_")
.reduce((prev, curr) => prev + `${curr.at(0)?.toUpperCase()}${curr.slice(1).toLowerCase()}`, "");
// the insert and update are changed to zod input type to use default cases
writeFileSync(
path.join(__dirname, "../src/db/schemas", `${dashcase}.ts`),
`// Code generated by automation script, DO NOT EDIT.
@ -134,8 +136,8 @@ import { TImmutableDBKeys } from "./models";
export const ${pascalCase}Schema = z.object({${schema}});
export type T${pascalCase} = z.infer<typeof ${pascalCase}Schema>;
export type T${pascalCase}Insert = Omit<T${pascalCase}, TImmutableDBKeys>;
export type T${pascalCase}Update = Partial<Omit<T${pascalCase}, TImmutableDBKeys>>;
export type T${pascalCase}Insert = Omit<z.input<typeof ${pascalCase}Schema>, TImmutableDBKeys>;
export type T${pascalCase}Update = Partial<Omit<z.input<typeof ${pascalCase}Schema>, TImmutableDBKeys>>;
`
);
}
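
The switch from z.infer to z.input in the generated Insert/Update types (repeated across every schema file below) matters because zod treats fields with .default() differently before and after parsing. A minimal illustration with a hypothetical schema:

import { z } from "zod";

const ExampleSchema = z.object({
  // mirrors the uuid default the generator now emits
  id: z.string().uuid().default("00000000-0000-0000-0000-000000000000"),
  name: z.string()
});

// Output type (after parsing): the default has been applied, so `id` is required.
type ExampleOutput = z.infer<typeof ExampleSchema>; // { id: string; name: string }

// Input type (before parsing): fields with defaults are optional, which is what
// insert/update payloads need so callers may omit `id` and let the default apply.
type ExampleInput = z.input<typeof ExampleSchema>; // { id?: string; name: string }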

View File

@ -1,6 +0,0 @@
import Redis from "ioredis";
export const initRedisConnection = (redisUrl: string) => {
const redis = new Redis(redisUrl);
return redis;
};

View File

@ -0,0 +1,21 @@
import { Knex } from "knex";
import { TableName } from "../schemas";
const ADMIN_CONFIG_UUID = "00000000-0000-0000-0000-000000000000";
export async function up(knex: Knex): Promise<void> {
await knex.schema.alterTable(TableName.SuperAdmin, (t) => {
t.uuid("instanceId").notNullable().defaultTo(knex.fn.uuid());
});
// this is updated to avoid race condition on replication
// eslint-disable-next-line
// @ts-ignore
await knex(TableName.SuperAdmin).update({ id: ADMIN_CONFIG_UUID }).whereNotNull("id").limit(1);
}
export async function down(knex: Knex): Promise<void> {
await knex.schema.alterTable(TableName.SuperAdmin, (t) => {
t.dropColumn("instanceId");
});
}

View File

@ -19,5 +19,5 @@ export const ApiKeysSchema = z.object({
});
export type TApiKeys = z.infer<typeof ApiKeysSchema>;
export type TApiKeysInsert = Omit<TApiKeys, TImmutableDBKeys>;
export type TApiKeysUpdate = Partial<Omit<TApiKeys, TImmutableDBKeys>>;
export type TApiKeysInsert = Omit<z.input<typeof ApiKeysSchema>, TImmutableDBKeys>;
export type TApiKeysUpdate = Partial<Omit<z.input<typeof ApiKeysSchema>, TImmutableDBKeys>>;

View File

@ -24,5 +24,5 @@ export const AuditLogsSchema = z.object({
});
export type TAuditLogs = z.infer<typeof AuditLogsSchema>;
export type TAuditLogsInsert = Omit<TAuditLogs, TImmutableDBKeys>;
export type TAuditLogsUpdate = Partial<Omit<TAuditLogs, TImmutableDBKeys>>;
export type TAuditLogsInsert = Omit<z.input<typeof AuditLogsSchema>, TImmutableDBKeys>;
export type TAuditLogsUpdate = Partial<Omit<z.input<typeof AuditLogsSchema>, TImmutableDBKeys>>;

View File

@ -20,5 +20,5 @@ export const AuthTokenSessionsSchema = z.object({
});
export type TAuthTokenSessions = z.infer<typeof AuthTokenSessionsSchema>;
export type TAuthTokenSessionsInsert = Omit<TAuthTokenSessions, TImmutableDBKeys>;
export type TAuthTokenSessionsUpdate = Partial<Omit<TAuthTokenSessions, TImmutableDBKeys>>;
export type TAuthTokenSessionsInsert = Omit<z.input<typeof AuthTokenSessionsSchema>, TImmutableDBKeys>;
export type TAuthTokenSessionsUpdate = Partial<Omit<z.input<typeof AuthTokenSessionsSchema>, TImmutableDBKeys>>;

View File

@ -21,5 +21,5 @@ export const AuthTokensSchema = z.object({
});
export type TAuthTokens = z.infer<typeof AuthTokensSchema>;
export type TAuthTokensInsert = Omit<TAuthTokens, TImmutableDBKeys>;
export type TAuthTokensUpdate = Partial<Omit<TAuthTokens, TImmutableDBKeys>>;
export type TAuthTokensInsert = Omit<z.input<typeof AuthTokensSchema>, TImmutableDBKeys>;
export type TAuthTokensUpdate = Partial<Omit<z.input<typeof AuthTokensSchema>, TImmutableDBKeys>>;

View File

@ -22,5 +22,5 @@ export const BackupPrivateKeySchema = z.object({
});
export type TBackupPrivateKey = z.infer<typeof BackupPrivateKeySchema>;
export type TBackupPrivateKeyInsert = Omit<TBackupPrivateKey, TImmutableDBKeys>;
export type TBackupPrivateKeyUpdate = Partial<Omit<TBackupPrivateKey, TImmutableDBKeys>>;
export type TBackupPrivateKeyInsert = Omit<z.input<typeof BackupPrivateKeySchema>, TImmutableDBKeys>;
export type TBackupPrivateKeyUpdate = Partial<Omit<z.input<typeof BackupPrivateKeySchema>, TImmutableDBKeys>>;

View File

@ -17,5 +17,5 @@ export const GitAppInstallSessionsSchema = z.object({
});
export type TGitAppInstallSessions = z.infer<typeof GitAppInstallSessionsSchema>;
export type TGitAppInstallSessionsInsert = Omit<TGitAppInstallSessions, TImmutableDBKeys>;
export type TGitAppInstallSessionsUpdate = Partial<Omit<TGitAppInstallSessions, TImmutableDBKeys>>;
export type TGitAppInstallSessionsInsert = Omit<z.input<typeof GitAppInstallSessionsSchema>, TImmutableDBKeys>;
export type TGitAppInstallSessionsUpdate = Partial<Omit<z.input<typeof GitAppInstallSessionsSchema>, TImmutableDBKeys>>;

View File

@ -17,5 +17,5 @@ export const GitAppOrgSchema = z.object({
});
export type TGitAppOrg = z.infer<typeof GitAppOrgSchema>;
export type TGitAppOrgInsert = Omit<TGitAppOrg, TImmutableDBKeys>;
export type TGitAppOrgUpdate = Partial<Omit<TGitAppOrg, TImmutableDBKeys>>;
export type TGitAppOrgInsert = Omit<z.input<typeof GitAppOrgSchema>, TImmutableDBKeys>;
export type TGitAppOrgUpdate = Partial<Omit<z.input<typeof GitAppOrgSchema>, TImmutableDBKeys>>;

View File

@ -16,5 +16,5 @@ export const IdentitiesSchema = z.object({
});
export type TIdentities = z.infer<typeof IdentitiesSchema>;
export type TIdentitiesInsert = Omit<TIdentities, TImmutableDBKeys>;
export type TIdentitiesUpdate = Partial<Omit<TIdentities, TImmutableDBKeys>>;
export type TIdentitiesInsert = Omit<z.input<typeof IdentitiesSchema>, TImmutableDBKeys>;
export type TIdentitiesUpdate = Partial<Omit<z.input<typeof IdentitiesSchema>, TImmutableDBKeys>>;

View File

@ -23,5 +23,5 @@ export const IdentityAccessTokensSchema = z.object({
});
export type TIdentityAccessTokens = z.infer<typeof IdentityAccessTokensSchema>;
export type TIdentityAccessTokensInsert = Omit<TIdentityAccessTokens, TImmutableDBKeys>;
export type TIdentityAccessTokensUpdate = Partial<Omit<TIdentityAccessTokens, TImmutableDBKeys>>;
export type TIdentityAccessTokensInsert = Omit<z.input<typeof IdentityAccessTokensSchema>, TImmutableDBKeys>;
export type TIdentityAccessTokensUpdate = Partial<Omit<z.input<typeof IdentityAccessTokensSchema>, TImmutableDBKeys>>;

View File

@ -18,5 +18,7 @@ export const IdentityOrgMembershipsSchema = z.object({
});
export type TIdentityOrgMemberships = z.infer<typeof IdentityOrgMembershipsSchema>;
export type TIdentityOrgMembershipsInsert = Omit<TIdentityOrgMemberships, TImmutableDBKeys>;
export type TIdentityOrgMembershipsUpdate = Partial<Omit<TIdentityOrgMemberships, TImmutableDBKeys>>;
export type TIdentityOrgMembershipsInsert = Omit<z.input<typeof IdentityOrgMembershipsSchema>, TImmutableDBKeys>;
export type TIdentityOrgMembershipsUpdate = Partial<
Omit<z.input<typeof IdentityOrgMembershipsSchema>, TImmutableDBKeys>
>;

View File

@ -18,5 +18,10 @@ export const IdentityProjectMembershipsSchema = z.object({
});
export type TIdentityProjectMemberships = z.infer<typeof IdentityProjectMembershipsSchema>;
export type TIdentityProjectMembershipsInsert = Omit<TIdentityProjectMemberships, TImmutableDBKeys>;
export type TIdentityProjectMembershipsUpdate = Partial<Omit<TIdentityProjectMemberships, TImmutableDBKeys>>;
export type TIdentityProjectMembershipsInsert = Omit<
z.input<typeof IdentityProjectMembershipsSchema>,
TImmutableDBKeys
>;
export type TIdentityProjectMembershipsUpdate = Partial<
Omit<z.input<typeof IdentityProjectMembershipsSchema>, TImmutableDBKeys>
>;

View File

@ -23,5 +23,7 @@ export const IdentityUaClientSecretsSchema = z.object({
});
export type TIdentityUaClientSecrets = z.infer<typeof IdentityUaClientSecretsSchema>;
export type TIdentityUaClientSecretsInsert = Omit<TIdentityUaClientSecrets, TImmutableDBKeys>;
export type TIdentityUaClientSecretsUpdate = Partial<Omit<TIdentityUaClientSecrets, TImmutableDBKeys>>;
export type TIdentityUaClientSecretsInsert = Omit<z.input<typeof IdentityUaClientSecretsSchema>, TImmutableDBKeys>;
export type TIdentityUaClientSecretsUpdate = Partial<
Omit<z.input<typeof IdentityUaClientSecretsSchema>, TImmutableDBKeys>
>;

View File

@ -21,5 +21,7 @@ export const IdentityUniversalAuthsSchema = z.object({
});
export type TIdentityUniversalAuths = z.infer<typeof IdentityUniversalAuthsSchema>;
export type TIdentityUniversalAuthsInsert = Omit<TIdentityUniversalAuths, TImmutableDBKeys>;
export type TIdentityUniversalAuthsUpdate = Partial<Omit<TIdentityUniversalAuths, TImmutableDBKeys>>;
export type TIdentityUniversalAuthsInsert = Omit<z.input<typeof IdentityUniversalAuthsSchema>, TImmutableDBKeys>;
export type TIdentityUniversalAuthsUpdate = Partial<
Omit<z.input<typeof IdentityUniversalAuthsSchema>, TImmutableDBKeys>
>;

View File

@ -16,5 +16,5 @@ export const IncidentContactsSchema = z.object({
});
export type TIncidentContacts = z.infer<typeof IncidentContactsSchema>;
export type TIncidentContactsInsert = Omit<TIncidentContacts, TImmutableDBKeys>;
export type TIncidentContactsUpdate = Partial<Omit<TIncidentContacts, TImmutableDBKeys>>;
export type TIncidentContactsInsert = Omit<z.input<typeof IncidentContactsSchema>, TImmutableDBKeys>;
export type TIncidentContactsUpdate = Partial<Omit<z.input<typeof IncidentContactsSchema>, TImmutableDBKeys>>;

View File

@ -33,5 +33,5 @@ export const IntegrationAuthsSchema = z.object({
});
export type TIntegrationAuths = z.infer<typeof IntegrationAuthsSchema>;
export type TIntegrationAuthsInsert = Omit<TIntegrationAuths, TImmutableDBKeys>;
export type TIntegrationAuthsUpdate = Partial<Omit<TIntegrationAuths, TImmutableDBKeys>>;
export type TIntegrationAuthsInsert = Omit<z.input<typeof IntegrationAuthsSchema>, TImmutableDBKeys>;
export type TIntegrationAuthsUpdate = Partial<Omit<z.input<typeof IntegrationAuthsSchema>, TImmutableDBKeys>>;

View File

@ -31,5 +31,5 @@ export const IntegrationsSchema = z.object({
});
export type TIntegrations = z.infer<typeof IntegrationsSchema>;
export type TIntegrationsInsert = Omit<TIntegrations, TImmutableDBKeys>;
export type TIntegrationsUpdate = Partial<Omit<TIntegrations, TImmutableDBKeys>>;
export type TIntegrationsInsert = Omit<z.input<typeof IntegrationsSchema>, TImmutableDBKeys>;
export type TIntegrationsUpdate = Partial<Omit<z.input<typeof IntegrationsSchema>, TImmutableDBKeys>>;

View File

@ -27,5 +27,5 @@ export const OrgBotsSchema = z.object({
});
export type TOrgBots = z.infer<typeof OrgBotsSchema>;
export type TOrgBotsInsert = Omit<TOrgBots, TImmutableDBKeys>;
export type TOrgBotsUpdate = Partial<Omit<TOrgBots, TImmutableDBKeys>>;
export type TOrgBotsInsert = Omit<z.input<typeof OrgBotsSchema>, TImmutableDBKeys>;
export type TOrgBotsUpdate = Partial<Omit<z.input<typeof OrgBotsSchema>, TImmutableDBKeys>>;

View File

@ -20,5 +20,5 @@ export const OrgMembershipsSchema = z.object({
});
export type TOrgMemberships = z.infer<typeof OrgMembershipsSchema>;
export type TOrgMembershipsInsert = Omit<TOrgMemberships, TImmutableDBKeys>;
export type TOrgMembershipsUpdate = Partial<Omit<TOrgMemberships, TImmutableDBKeys>>;
export type TOrgMembershipsInsert = Omit<z.input<typeof OrgMembershipsSchema>, TImmutableDBKeys>;
export type TOrgMembershipsUpdate = Partial<Omit<z.input<typeof OrgMembershipsSchema>, TImmutableDBKeys>>;

View File

@ -19,5 +19,5 @@ export const OrgRolesSchema = z.object({
});
export type TOrgRoles = z.infer<typeof OrgRolesSchema>;
export type TOrgRolesInsert = Omit<TOrgRoles, TImmutableDBKeys>;
export type TOrgRolesUpdate = Partial<Omit<TOrgRoles, TImmutableDBKeys>>;
export type TOrgRolesInsert = Omit<z.input<typeof OrgRolesSchema>, TImmutableDBKeys>;
export type TOrgRolesUpdate = Partial<Omit<z.input<typeof OrgRolesSchema>, TImmutableDBKeys>>;

View File

@ -19,5 +19,5 @@ export const OrganizationsSchema = z.object({
});
export type TOrganizations = z.infer<typeof OrganizationsSchema>;
export type TOrganizationsInsert = Omit<TOrganizations, TImmutableDBKeys>;
export type TOrganizationsUpdate = Partial<Omit<TOrganizations, TImmutableDBKeys>>;
export type TOrganizationsInsert = Omit<z.input<typeof OrganizationsSchema>, TImmutableDBKeys>;
export type TOrganizationsUpdate = Partial<Omit<z.input<typeof OrganizationsSchema>, TImmutableDBKeys>>;

View File

@ -26,5 +26,5 @@ export const ProjectBotsSchema = z.object({
});
export type TProjectBots = z.infer<typeof ProjectBotsSchema>;
export type TProjectBotsInsert = Omit<TProjectBots, TImmutableDBKeys>;
export type TProjectBotsUpdate = Partial<Omit<TProjectBots, TImmutableDBKeys>>;
export type TProjectBotsInsert = Omit<z.input<typeof ProjectBotsSchema>, TImmutableDBKeys>;
export type TProjectBotsUpdate = Partial<Omit<z.input<typeof ProjectBotsSchema>, TImmutableDBKeys>>;

View File

@ -18,5 +18,5 @@ export const ProjectEnvironmentsSchema = z.object({
});
export type TProjectEnvironments = z.infer<typeof ProjectEnvironmentsSchema>;
export type TProjectEnvironmentsInsert = Omit<TProjectEnvironments, TImmutableDBKeys>;
export type TProjectEnvironmentsUpdate = Partial<Omit<TProjectEnvironments, TImmutableDBKeys>>;
export type TProjectEnvironmentsInsert = Omit<z.input<typeof ProjectEnvironmentsSchema>, TImmutableDBKeys>;
export type TProjectEnvironmentsUpdate = Partial<Omit<z.input<typeof ProjectEnvironmentsSchema>, TImmutableDBKeys>>;

View File

@ -19,5 +19,5 @@ export const ProjectKeysSchema = z.object({
});
export type TProjectKeys = z.infer<typeof ProjectKeysSchema>;
export type TProjectKeysInsert = Omit<TProjectKeys, TImmutableDBKeys>;
export type TProjectKeysUpdate = Partial<Omit<TProjectKeys, TImmutableDBKeys>>;
export type TProjectKeysInsert = Omit<z.input<typeof ProjectKeysSchema>, TImmutableDBKeys>;
export type TProjectKeysUpdate = Partial<Omit<z.input<typeof ProjectKeysSchema>, TImmutableDBKeys>>;

View File

@ -18,5 +18,5 @@ export const ProjectMembershipsSchema = z.object({
});
export type TProjectMemberships = z.infer<typeof ProjectMembershipsSchema>;
export type TProjectMembershipsInsert = Omit<TProjectMemberships, TImmutableDBKeys>;
export type TProjectMembershipsUpdate = Partial<Omit<TProjectMemberships, TImmutableDBKeys>>;
export type TProjectMembershipsInsert = Omit<z.input<typeof ProjectMembershipsSchema>, TImmutableDBKeys>;
export type TProjectMembershipsUpdate = Partial<Omit<z.input<typeof ProjectMembershipsSchema>, TImmutableDBKeys>>;

View File

@ -19,5 +19,5 @@ export const ProjectRolesSchema = z.object({
});
export type TProjectRoles = z.infer<typeof ProjectRolesSchema>;
export type TProjectRolesInsert = Omit<TProjectRoles, TImmutableDBKeys>;
export type TProjectRolesUpdate = Partial<Omit<TProjectRoles, TImmutableDBKeys>>;
export type TProjectRolesInsert = Omit<z.input<typeof ProjectRolesSchema>, TImmutableDBKeys>;
export type TProjectRolesUpdate = Partial<Omit<z.input<typeof ProjectRolesSchema>, TImmutableDBKeys>>;

View File

@ -20,5 +20,5 @@ export const ProjectsSchema = z.object({
});
export type TProjects = z.infer<typeof ProjectsSchema>;
export type TProjectsInsert = Omit<TProjects, TImmutableDBKeys>;
export type TProjectsUpdate = Partial<Omit<TProjects, TImmutableDBKeys>>;
export type TProjectsInsert = Omit<z.input<typeof ProjectsSchema>, TImmutableDBKeys>;
export type TProjectsUpdate = Partial<Omit<z.input<typeof ProjectsSchema>, TImmutableDBKeys>>;

View File

@ -27,5 +27,5 @@ export const SamlConfigsSchema = z.object({
});
export type TSamlConfigs = z.infer<typeof SamlConfigsSchema>;
export type TSamlConfigsInsert = Omit<TSamlConfigs, TImmutableDBKeys>;
export type TSamlConfigsUpdate = Partial<Omit<TSamlConfigs, TImmutableDBKeys>>;
export type TSamlConfigsInsert = Omit<z.input<typeof SamlConfigsSchema>, TImmutableDBKeys>;
export type TSamlConfigsUpdate = Partial<Omit<z.input<typeof SamlConfigsSchema>, TImmutableDBKeys>>;

View File

@ -17,5 +17,5 @@ export const ScimTokensSchema = z.object({
});
export type TScimTokens = z.infer<typeof ScimTokensSchema>;
export type TScimTokensInsert = Omit<TScimTokens, TImmutableDBKeys>;
export type TScimTokensUpdate = Partial<Omit<TScimTokens, TImmutableDBKeys>>;
export type TScimTokensInsert = Omit<z.input<typeof ScimTokensSchema>, TImmutableDBKeys>;
export type TScimTokensUpdate = Partial<Omit<z.input<typeof ScimTokensSchema>, TImmutableDBKeys>>;

View File

@ -16,5 +16,10 @@ export const SecretApprovalPoliciesApproversSchema = z.object({
});
export type TSecretApprovalPoliciesApprovers = z.infer<typeof SecretApprovalPoliciesApproversSchema>;
export type TSecretApprovalPoliciesApproversInsert = Omit<TSecretApprovalPoliciesApprovers, TImmutableDBKeys>;
export type TSecretApprovalPoliciesApproversUpdate = Partial<Omit<TSecretApprovalPoliciesApprovers, TImmutableDBKeys>>;
export type TSecretApprovalPoliciesApproversInsert = Omit<
z.input<typeof SecretApprovalPoliciesApproversSchema>,
TImmutableDBKeys
>;
export type TSecretApprovalPoliciesApproversUpdate = Partial<
Omit<z.input<typeof SecretApprovalPoliciesApproversSchema>, TImmutableDBKeys>
>;

View File

@ -18,5 +18,7 @@ export const SecretApprovalPoliciesSchema = z.object({
});
export type TSecretApprovalPolicies = z.infer<typeof SecretApprovalPoliciesSchema>;
export type TSecretApprovalPoliciesInsert = Omit<TSecretApprovalPolicies, TImmutableDBKeys>;
export type TSecretApprovalPoliciesUpdate = Partial<Omit<TSecretApprovalPolicies, TImmutableDBKeys>>;
export type TSecretApprovalPoliciesInsert = Omit<z.input<typeof SecretApprovalPoliciesSchema>, TImmutableDBKeys>;
export type TSecretApprovalPoliciesUpdate = Partial<
Omit<z.input<typeof SecretApprovalPoliciesSchema>, TImmutableDBKeys>
>;

View File

@ -16,5 +16,10 @@ export const SecretApprovalRequestSecretTagsSchema = z.object({
});
export type TSecretApprovalRequestSecretTags = z.infer<typeof SecretApprovalRequestSecretTagsSchema>;
export type TSecretApprovalRequestSecretTagsInsert = Omit<TSecretApprovalRequestSecretTags, TImmutableDBKeys>;
export type TSecretApprovalRequestSecretTagsUpdate = Partial<Omit<TSecretApprovalRequestSecretTags, TImmutableDBKeys>>;
export type TSecretApprovalRequestSecretTagsInsert = Omit<
z.input<typeof SecretApprovalRequestSecretTagsSchema>,
TImmutableDBKeys
>;
export type TSecretApprovalRequestSecretTagsUpdate = Partial<
Omit<z.input<typeof SecretApprovalRequestSecretTagsSchema>, TImmutableDBKeys>
>;

View File

@ -17,5 +17,10 @@ export const SecretApprovalRequestsReviewersSchema = z.object({
});
export type TSecretApprovalRequestsReviewers = z.infer<typeof SecretApprovalRequestsReviewersSchema>;
export type TSecretApprovalRequestsReviewersInsert = Omit<TSecretApprovalRequestsReviewers, TImmutableDBKeys>;
export type TSecretApprovalRequestsReviewersUpdate = Partial<Omit<TSecretApprovalRequestsReviewers, TImmutableDBKeys>>;
export type TSecretApprovalRequestsReviewersInsert = Omit<
z.input<typeof SecretApprovalRequestsReviewersSchema>,
TImmutableDBKeys
>;
export type TSecretApprovalRequestsReviewersUpdate = Partial<
Omit<z.input<typeof SecretApprovalRequestsReviewersSchema>, TImmutableDBKeys>
>;

View File

@ -35,5 +35,10 @@ export const SecretApprovalRequestsSecretsSchema = z.object({
});
export type TSecretApprovalRequestsSecrets = z.infer<typeof SecretApprovalRequestsSecretsSchema>;
export type TSecretApprovalRequestsSecretsInsert = Omit<TSecretApprovalRequestsSecrets, TImmutableDBKeys>;
export type TSecretApprovalRequestsSecretsUpdate = Partial<Omit<TSecretApprovalRequestsSecrets, TImmutableDBKeys>>;
export type TSecretApprovalRequestsSecretsInsert = Omit<
z.input<typeof SecretApprovalRequestsSecretsSchema>,
TImmutableDBKeys
>;
export type TSecretApprovalRequestsSecretsUpdate = Partial<
Omit<z.input<typeof SecretApprovalRequestsSecretsSchema>, TImmutableDBKeys>
>;

View File

@ -22,5 +22,7 @@ export const SecretApprovalRequestsSchema = z.object({
});
export type TSecretApprovalRequests = z.infer<typeof SecretApprovalRequestsSchema>;
export type TSecretApprovalRequestsInsert = Omit<TSecretApprovalRequests, TImmutableDBKeys>;
export type TSecretApprovalRequestsUpdate = Partial<Omit<TSecretApprovalRequests, TImmutableDBKeys>>;
export type TSecretApprovalRequestsInsert = Omit<z.input<typeof SecretApprovalRequestsSchema>, TImmutableDBKeys>;
export type TSecretApprovalRequestsUpdate = Partial<
Omit<z.input<typeof SecretApprovalRequestsSchema>, TImmutableDBKeys>
>;

View File

@ -20,5 +20,5 @@ export const SecretBlindIndexesSchema = z.object({
});
export type TSecretBlindIndexes = z.infer<typeof SecretBlindIndexesSchema>;
export type TSecretBlindIndexesInsert = Omit<TSecretBlindIndexes, TImmutableDBKeys>;
export type TSecretBlindIndexesUpdate = Partial<Omit<TSecretBlindIndexes, TImmutableDBKeys>>;
export type TSecretBlindIndexesInsert = Omit<z.input<typeof SecretBlindIndexesSchema>, TImmutableDBKeys>;
export type TSecretBlindIndexesUpdate = Partial<Omit<z.input<typeof SecretBlindIndexesSchema>, TImmutableDBKeys>>;

View File

@ -18,5 +18,5 @@ export const SecretFolderVersionsSchema = z.object({
});
export type TSecretFolderVersions = z.infer<typeof SecretFolderVersionsSchema>;
export type TSecretFolderVersionsInsert = Omit<TSecretFolderVersions, TImmutableDBKeys>;
export type TSecretFolderVersionsUpdate = Partial<Omit<TSecretFolderVersions, TImmutableDBKeys>>;
export type TSecretFolderVersionsInsert = Omit<z.input<typeof SecretFolderVersionsSchema>, TImmutableDBKeys>;
export type TSecretFolderVersionsUpdate = Partial<Omit<z.input<typeof SecretFolderVersionsSchema>, TImmutableDBKeys>>;

View File

@ -18,5 +18,5 @@ export const SecretFoldersSchema = z.object({
});
export type TSecretFolders = z.infer<typeof SecretFoldersSchema>;
export type TSecretFoldersInsert = Omit<TSecretFolders, TImmutableDBKeys>;
export type TSecretFoldersUpdate = Partial<Omit<TSecretFolders, TImmutableDBKeys>>;
export type TSecretFoldersInsert = Omit<z.input<typeof SecretFoldersSchema>, TImmutableDBKeys>;
export type TSecretFoldersUpdate = Partial<Omit<z.input<typeof SecretFoldersSchema>, TImmutableDBKeys>>;

View File

@ -19,5 +19,5 @@ export const SecretImportsSchema = z.object({
});
export type TSecretImports = z.infer<typeof SecretImportsSchema>;
export type TSecretImportsInsert = Omit<TSecretImports, TImmutableDBKeys>;
export type TSecretImportsUpdate = Partial<Omit<TSecretImports, TImmutableDBKeys>>;
export type TSecretImportsInsert = Omit<z.input<typeof SecretImportsSchema>, TImmutableDBKeys>;
export type TSecretImportsUpdate = Partial<Omit<z.input<typeof SecretImportsSchema>, TImmutableDBKeys>>;

View File

@ -15,5 +15,5 @@ export const SecretRotationOutputsSchema = z.object({
});
export type TSecretRotationOutputs = z.infer<typeof SecretRotationOutputsSchema>;
export type TSecretRotationOutputsInsert = Omit<TSecretRotationOutputs, TImmutableDBKeys>;
export type TSecretRotationOutputsUpdate = Partial<Omit<TSecretRotationOutputs, TImmutableDBKeys>>;
export type TSecretRotationOutputsInsert = Omit<z.input<typeof SecretRotationOutputsSchema>, TImmutableDBKeys>;
export type TSecretRotationOutputsUpdate = Partial<Omit<z.input<typeof SecretRotationOutputsSchema>, TImmutableDBKeys>>;

View File

@ -26,5 +26,5 @@ export const SecretRotationsSchema = z.object({
});
export type TSecretRotations = z.infer<typeof SecretRotationsSchema>;
export type TSecretRotationsInsert = Omit<TSecretRotations, TImmutableDBKeys>;
export type TSecretRotationsUpdate = Partial<Omit<TSecretRotations, TImmutableDBKeys>>;
export type TSecretRotationsInsert = Omit<z.input<typeof SecretRotationsSchema>, TImmutableDBKeys>;
export type TSecretRotationsUpdate = Partial<Omit<z.input<typeof SecretRotationsSchema>, TImmutableDBKeys>>;

View File

@ -42,5 +42,7 @@ export const SecretScanningGitRisksSchema = z.object({
});
export type TSecretScanningGitRisks = z.infer<typeof SecretScanningGitRisksSchema>;
export type TSecretScanningGitRisksInsert = Omit<TSecretScanningGitRisks, TImmutableDBKeys>;
export type TSecretScanningGitRisksUpdate = Partial<Omit<TSecretScanningGitRisks, TImmutableDBKeys>>;
export type TSecretScanningGitRisksInsert = Omit<z.input<typeof SecretScanningGitRisksSchema>, TImmutableDBKeys>;
export type TSecretScanningGitRisksUpdate = Partial<
Omit<z.input<typeof SecretScanningGitRisksSchema>, TImmutableDBKeys>
>;

View File

@ -17,5 +17,5 @@ export const SecretSnapshotFoldersSchema = z.object({
});
export type TSecretSnapshotFolders = z.infer<typeof SecretSnapshotFoldersSchema>;
export type TSecretSnapshotFoldersInsert = Omit<TSecretSnapshotFolders, TImmutableDBKeys>;
export type TSecretSnapshotFoldersUpdate = Partial<Omit<TSecretSnapshotFolders, TImmutableDBKeys>>;
export type TSecretSnapshotFoldersInsert = Omit<z.input<typeof SecretSnapshotFoldersSchema>, TImmutableDBKeys>;
export type TSecretSnapshotFoldersUpdate = Partial<Omit<z.input<typeof SecretSnapshotFoldersSchema>, TImmutableDBKeys>>;

View File

@ -17,5 +17,5 @@ export const SecretSnapshotSecretsSchema = z.object({
});
export type TSecretSnapshotSecrets = z.infer<typeof SecretSnapshotSecretsSchema>;
export type TSecretSnapshotSecretsInsert = Omit<TSecretSnapshotSecrets, TImmutableDBKeys>;
export type TSecretSnapshotSecretsUpdate = Partial<Omit<TSecretSnapshotSecrets, TImmutableDBKeys>>;
export type TSecretSnapshotSecretsInsert = Omit<z.input<typeof SecretSnapshotSecretsSchema>, TImmutableDBKeys>;
export type TSecretSnapshotSecretsUpdate = Partial<Omit<z.input<typeof SecretSnapshotSecretsSchema>, TImmutableDBKeys>>;

View File

@ -17,5 +17,5 @@ export const SecretSnapshotsSchema = z.object({
});
export type TSecretSnapshots = z.infer<typeof SecretSnapshotsSchema>;
export type TSecretSnapshotsInsert = Omit<TSecretSnapshots, TImmutableDBKeys>;
export type TSecretSnapshotsUpdate = Partial<Omit<TSecretSnapshots, TImmutableDBKeys>>;
export type TSecretSnapshotsInsert = Omit<z.input<typeof SecretSnapshotsSchema>, TImmutableDBKeys>;
export type TSecretSnapshotsUpdate = Partial<Omit<z.input<typeof SecretSnapshotsSchema>, TImmutableDBKeys>>;

View File

@ -14,5 +14,5 @@ export const SecretTagJunctionSchema = z.object({
});
export type TSecretTagJunction = z.infer<typeof SecretTagJunctionSchema>;
export type TSecretTagJunctionInsert = Omit<TSecretTagJunction, TImmutableDBKeys>;
export type TSecretTagJunctionUpdate = Partial<Omit<TSecretTagJunction, TImmutableDBKeys>>;
export type TSecretTagJunctionInsert = Omit<z.input<typeof SecretTagJunctionSchema>, TImmutableDBKeys>;
export type TSecretTagJunctionUpdate = Partial<Omit<z.input<typeof SecretTagJunctionSchema>, TImmutableDBKeys>>;

View File

@ -19,5 +19,5 @@ export const SecretTagsSchema = z.object({
});
export type TSecretTags = z.infer<typeof SecretTagsSchema>;
export type TSecretTagsInsert = Omit<TSecretTags, TImmutableDBKeys>;
export type TSecretTagsUpdate = Partial<Omit<TSecretTags, TImmutableDBKeys>>;
export type TSecretTagsInsert = Omit<z.input<typeof SecretTagsSchema>, TImmutableDBKeys>;
export type TSecretTagsUpdate = Partial<Omit<z.input<typeof SecretTagsSchema>, TImmutableDBKeys>>;

View File

@ -14,5 +14,7 @@ export const SecretVersionTagJunctionSchema = z.object({
});
export type TSecretVersionTagJunction = z.infer<typeof SecretVersionTagJunctionSchema>;
export type TSecretVersionTagJunctionInsert = Omit<TSecretVersionTagJunction, TImmutableDBKeys>;
export type TSecretVersionTagJunctionUpdate = Partial<Omit<TSecretVersionTagJunction, TImmutableDBKeys>>;
export type TSecretVersionTagJunctionInsert = Omit<z.input<typeof SecretVersionTagJunctionSchema>, TImmutableDBKeys>;
export type TSecretVersionTagJunctionUpdate = Partial<
Omit<z.input<typeof SecretVersionTagJunctionSchema>, TImmutableDBKeys>
>;

View File

@ -36,5 +36,5 @@ export const SecretVersionsSchema = z.object({
});
export type TSecretVersions = z.infer<typeof SecretVersionsSchema>;
export type TSecretVersionsInsert = Omit<TSecretVersions, TImmutableDBKeys>;
export type TSecretVersionsUpdate = Partial<Omit<TSecretVersions, TImmutableDBKeys>>;
export type TSecretVersionsInsert = Omit<z.input<typeof SecretVersionsSchema>, TImmutableDBKeys>;
export type TSecretVersionsUpdate = Partial<Omit<z.input<typeof SecretVersionsSchema>, TImmutableDBKeys>>;

View File

@ -34,5 +34,5 @@ export const SecretsSchema = z.object({
});
export type TSecrets = z.infer<typeof SecretsSchema>;
export type TSecretsInsert = Omit<TSecrets, TImmutableDBKeys>;
export type TSecretsUpdate = Partial<Omit<TSecrets, TImmutableDBKeys>>;
export type TSecretsInsert = Omit<z.input<typeof SecretsSchema>, TImmutableDBKeys>;
export type TSecretsUpdate = Partial<Omit<z.input<typeof SecretsSchema>, TImmutableDBKeys>>;

View File

@ -25,5 +25,5 @@ export const ServiceTokensSchema = z.object({
});
export type TServiceTokens = z.infer<typeof ServiceTokensSchema>;
export type TServiceTokensInsert = Omit<TServiceTokens, TImmutableDBKeys>;
export type TServiceTokensUpdate = Partial<Omit<TServiceTokens, TImmutableDBKeys>>;
export type TServiceTokensInsert = Omit<z.input<typeof ServiceTokensSchema>, TImmutableDBKeys>;
export type TServiceTokensUpdate = Partial<Omit<z.input<typeof ServiceTokensSchema>, TImmutableDBKeys>>;

View File

@ -13,9 +13,10 @@ export const SuperAdminSchema = z.object({
allowSignUp: z.boolean().default(true).nullable().optional(),
createdAt: z.date(),
updatedAt: z.date(),
allowedSignUpDomain: z.string().nullable().optional()
allowedSignUpDomain: z.string().nullable().optional(),
instanceId: z.string().uuid().default("00000000-0000-0000-0000-000000000000")
});
export type TSuperAdmin = z.infer<typeof SuperAdminSchema>;
export type TSuperAdminInsert = Omit<TSuperAdmin, TImmutableDBKeys>;
export type TSuperAdminUpdate = Partial<Omit<TSuperAdmin, TImmutableDBKeys>>;
export type TSuperAdminInsert = Omit<z.input<typeof SuperAdminSchema>, TImmutableDBKeys>;
export type TSuperAdminUpdate = Partial<Omit<z.input<typeof SuperAdminSchema>, TImmutableDBKeys>>;

View File

@ -20,5 +20,5 @@ export const TrustedIpsSchema = z.object({
});
export type TTrustedIps = z.infer<typeof TrustedIpsSchema>;
export type TTrustedIpsInsert = Omit<TTrustedIps, TImmutableDBKeys>;
export type TTrustedIpsUpdate = Partial<Omit<TTrustedIps, TImmutableDBKeys>>;
export type TTrustedIpsInsert = Omit<z.input<typeof TrustedIpsSchema>, TImmutableDBKeys>;
export type TTrustedIpsUpdate = Partial<Omit<z.input<typeof TrustedIpsSchema>, TImmutableDBKeys>>;

View File

@ -16,5 +16,5 @@ export const UserActionsSchema = z.object({
});
export type TUserActions = z.infer<typeof UserActionsSchema>;
export type TUserActionsInsert = Omit<TUserActions, TImmutableDBKeys>;
export type TUserActionsUpdate = Partial<Omit<TUserActions, TImmutableDBKeys>>;
export type TUserActionsInsert = Omit<z.input<typeof UserActionsSchema>, TImmutableDBKeys>;
export type TUserActionsUpdate = Partial<Omit<z.input<typeof UserActionsSchema>, TImmutableDBKeys>>;

View File

@ -25,5 +25,5 @@ export const UserEncryptionKeysSchema = z.object({
});
export type TUserEncryptionKeys = z.infer<typeof UserEncryptionKeysSchema>;
export type TUserEncryptionKeysInsert = Omit<TUserEncryptionKeys, TImmutableDBKeys>;
export type TUserEncryptionKeysUpdate = Partial<Omit<TUserEncryptionKeys, TImmutableDBKeys>>;
export type TUserEncryptionKeysInsert = Omit<z.input<typeof UserEncryptionKeysSchema>, TImmutableDBKeys>;
export type TUserEncryptionKeysUpdate = Partial<Omit<z.input<typeof UserEncryptionKeysSchema>, TImmutableDBKeys>>;

View File

@ -24,5 +24,5 @@ export const UsersSchema = z.object({
});
export type TUsers = z.infer<typeof UsersSchema>;
export type TUsersInsert = Omit<TUsers, TImmutableDBKeys>;
export type TUsersUpdate = Partial<Omit<TUsers, TImmutableDBKeys>>;
export type TUsersInsert = Omit<z.input<typeof UsersSchema>, TImmutableDBKeys>;
export type TUsersUpdate = Partial<Omit<z.input<typeof UsersSchema>, TImmutableDBKeys>>;

View File

@ -25,5 +25,5 @@ export const WebhooksSchema = z.object({
});
export type TWebhooks = z.infer<typeof WebhooksSchema>;
export type TWebhooksInsert = Omit<TWebhooks, TImmutableDBKeys>;
export type TWebhooksUpdate = Partial<Omit<TWebhooks, TImmutableDBKeys>>;
export type TWebhooksInsert = Omit<z.input<typeof WebhooksSchema>, TImmutableDBKeys>;
export type TWebhooksUpdate = Partial<Omit<z.input<typeof WebhooksSchema>, TImmutableDBKeys>>;

View File

@ -505,6 +505,9 @@ export const licenseServiceFactory = ({ orgDAL, permissionService, licenseDAL }:
get isValidLicense() {
return isValidLicense;
},
getInstanceType() {
return instanceType;
},
getPlan,
updateSubscriptionOrgMemberCount,
refreshPlan,

View File

@ -240,7 +240,7 @@ export const secretRotationQueueFactory = ({
);
});
telemetryService.sendPostHogEvents({
await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretRotated,
distinctId: "",
properties: {

View File

@ -158,7 +158,7 @@ export const secretScanningQueueFactory = ({
});
}
telemetryService.sendPostHogEvents({
await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretScannerPush,
distinctId: repository.fullName,
properties: {
@ -228,7 +228,7 @@ export const secretScanningQueueFactory = ({
});
}
telemetryService.sendPostHogEvents({
await telemetryService.sendPostHogEvents({
event: PostHogEventTypes.SecretScannerFull,
distinctId: repository.fullName,
properties: {

View File

@ -0,0 +1,20 @@
import { Redis } from "ioredis";
export type TKeyStoreFactory = ReturnType<typeof keyStoreFactory>;
export const keyStoreFactory = (redisUrl: string) => {
const redis = new Redis(redisUrl);
const setItem = async (key: string, value: string | number | Buffer) => redis.set(key, value);
const getItem = async (key: string) => redis.get(key);
const setItemWithExpiry = async (key: string, exp: number | string, value: string | number | Buffer) =>
redis.setex(key, exp, value);
const deleteItem = async (key: string) => redis.del(key);
const incrementBy = async (key: string, value: number) => redis.incrby(key, value);
return { setItem, getItem, setItemWithExpiry, deleteItem, incrementBy };
};
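
Commit 7d8b399102 above notes that server config is now fetched through this keystore to avoid repeated DB calls. A minimal usage sketch under that assumption — the cache key, TTL, and loader function are illustrative, not taken from the diff:

import { TKeyStoreFactory } from "@app/keystore/keystore";

// Hypothetical cache key and TTL; the real service may use different values.
const ADMIN_CFG_KEY = "SUPER_ADMIN_CONFIG";
const ADMIN_CFG_TTL_SECONDS = 60;

export const getCachedServerCfg = async (
  keyStore: TKeyStoreFactory,
  loadFromDb: () => Promise<{ id: string; allowSignUp: boolean | null }>
) => {
  const cached = await keyStore.getItem(ADMIN_CFG_KEY);
  if (cached) return JSON.parse(cached) as { id: string; allowSignUp: boolean | null };
  const cfg = await loadFromDb();
  // setItemWithExpiry(key, exp, value) matches the factory signature above
  await keyStore.setItemWithExpiry(ADMIN_CFG_KEY, ADMIN_CFG_TTL_SECONDS, JSON.stringify(cfg));
  return cfg;
};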

View File

@ -94,14 +94,17 @@ const envSchema = z
SECRET_SCANNING_WEBHOOK_SECRET: zpStr(z.string().optional()),
SECRET_SCANNING_GIT_APP_ID: zpStr(z.string().optional()),
SECRET_SCANNING_PRIVATE_KEY: zpStr(z.string().optional()),
// LICENCE
// LICENSE
LICENSE_SERVER_URL: zpStr(z.string().optional().default("https://portal.infisical.com")),
LICENSE_SERVER_KEY: zpStr(z.string().optional()),
LICENSE_KEY: zpStr(z.string().optional()),
// GENERIC
STANDALONE_MODE: z
.enum(["true", "false"])
.transform((val) => val === "true")
.optional()
.optional(),
INFISICAL_CLOUD: zodStrBool.default("false")
})
.transform((data) => ({
...data,
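
INFISICAL_CLOUD is validated with a zodStrBool helper that is not shown in this diff; presumably it follows the same string-to-boolean pattern as the inline STANDALONE_MODE transform above. A hedged sketch of such a helper:

import { z } from "zod";

// Assumed shape of a zodStrBool-style helper: parse "true"/"false" env strings into booleans.
// With .default("false") applied (as in the diff), an unset variable parses to false.
const zodStrBool = z.enum(["true", "false"]).transform((val) => val === "true");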

View File

@ -1,6 +1,7 @@
import dotenv from "dotenv";
import { initDbConnection } from "./db";
import { keyStoreFactory } from "./keystore/keystore";
import { formatSmtpConfig, initEnvConfig } from "./lib/config/env";
import { initLogger } from "./lib/logger";
import { queueServiceFactory } from "./queue";
@ -19,8 +20,9 @@ const run = async () => {
const smtp = smtpServiceFactory(formatSmtpConfig());
const queue = queueServiceFactory(appCfg.REDIS_URL);
const keyStore = keyStoreFactory(appCfg.REDIS_URL);
const server = await main({ db, smtp, logger, queue });
const server = await main({ db, smtp, logger, queue, keyStore });
const bootstrap = await bootstrapCheck({ db });
// eslint-disable-next-line
process.on("SIGINT", async () => {

View File

@ -13,6 +13,7 @@ export enum QueueName {
SecretReminder = "secret-reminder",
AuditLog = "audit-log",
AuditLogPrune = "audit-log-prune",
TelemetryInstanceStats = "telemtry-self-hosted-stats",
IntegrationSync = "sync-integrations",
SecretWebhook = "secret-webhook",
SecretFullRepoScan = "secret-full-repo-scan",
@ -26,6 +27,7 @@ export enum QueueJobs {
AuditLog = "audit-log-job",
AuditLogPrune = "audit-log-prune-job",
SecWebhook = "secret-webhook-trigger",
TelemetryInstanceStats = "telemetry-self-hosted-stats",
IntegrationSync = "secret-integration-pull",
SecretScan = "secret-scan",
UpgradeProjectToGhost = "upgrade-project-to-ghost-job"
@ -67,7 +69,6 @@ export type TQueueJobTypes = {
payload: TScanFullRepoEventPayload;
};
[QueueName.SecretPushEventScan]: { name: QueueJobs.SecretScan; payload: TScanPushEventPayload };
[QueueName.UpgradeProjectToGhost]: {
name: QueueJobs.UpgradeProjectToGhost;
payload: {
@ -81,6 +82,10 @@ export type TQueueJobTypes = {
};
};
};
[QueueName.TelemetryInstanceStats]: {
name: QueueJobs.TelemetryInstanceStats;
payload: undefined;
};
};
export type TQueueServiceFactory = ReturnType<typeof queueServiceFactory>;

View File

@ -14,6 +14,7 @@ import fasitfy from "fastify";
import { Knex } from "knex";
import { Logger } from "pino";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { TQueueServiceFactory } from "@app/queue";
import { TSmtpService } from "@app/services/smtp/smtp-service";
@ -31,10 +32,11 @@ type TMain = {
smtp: TSmtpService;
logger?: Logger;
queue: TQueueServiceFactory;
keyStore: TKeyStoreFactory;
};
// Run the server!
export const main = async ({ db, smtp, logger, queue }: TMain) => {
export const main = async ({ db, smtp, logger, queue, keyStore }: TMain) => {
const appCfg = getConfig();
const server = fasitfy({
logger: appCfg.NODE_ENV === "test" ? false : logger,
@ -70,7 +72,7 @@ export const main = async ({ db, smtp, logger, queue }: TMain) => {
}
await server.register(helmet, { contentSecurityPolicy: false });
await server.register(registerRoutes, { smtp, queue, db });
await server.register(registerRoutes, { smtp, queue, db, keyStore });
if (appCfg.isProductionMode) {
await server.register(registerExternalNextjs, {

View File

@ -34,6 +34,7 @@ import { snapshotFolderDALFactory } from "@app/ee/services/secret-snapshot/snaps
import { snapshotSecretDALFactory } from "@app/ee/services/secret-snapshot/snapshot-secret-dal";
import { trustedIpDALFactory } from "@app/ee/services/trusted-ip/trusted-ip-dal";
import { trustedIpServiceFactory } from "@app/ee/services/trusted-ip/trusted-ip-service";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { TQueueServiceFactory } from "@app/queue";
import { apiKeyDALFactory } from "@app/services/api-key/api-key-dal";
@ -96,6 +97,8 @@ import { serviceTokenServiceFactory } from "@app/services/service-token/service-
import { TSmtpService } from "@app/services/smtp/smtp-service";
import { superAdminDALFactory } from "@app/services/super-admin/super-admin-dal";
import { getServerCfg, superAdminServiceFactory } from "@app/services/super-admin/super-admin-service";
import { telemetryDALFactory } from "@app/services/telemetry/telemetry-dal";
import { telemetryQueueServiceFactory } from "@app/services/telemetry/telemetry-queue";
import { telemetryServiceFactory } from "@app/services/telemetry/telemetry-service";
import { userDALFactory } from "@app/services/user/user-dal";
import { userServiceFactory } from "@app/services/user/user-service";
@ -112,7 +115,12 @@ import { registerV3Routes } from "./v3";
export const registerRoutes = async (
server: FastifyZodProvider,
{ db, smtp: smtpService, queue: queueService }: { db: Knex; smtp: TSmtpService; queue: TQueueServiceFactory }
{
db,
smtp: smtpService,
queue: queueService,
keyStore
}: { db: Knex; smtp: TSmtpService; queue: TQueueServiceFactory; keyStore: TKeyStoreFactory }
) => {
await server.register(registerSecretScannerGhApp, { prefix: "/ss-webhook" });
@ -159,6 +167,7 @@ export const registerRoutes = async (
const auditLogDAL = auditLogDALFactory(db);
const trustedIpDAL = trustedIpDALFactory(db);
const scimDAL = scimDALFactory(db);
const telemetryDAL = telemetryDALFactory(db);
// ee db layer ops
const permissionDAL = permissionDALFactory(db);
@ -226,7 +235,16 @@ export const registerRoutes = async (
smtpService
});
const telemetryService = telemetryServiceFactory();
const telemetryService = telemetryServiceFactory({
keyStore,
licenseService
});
const telemetryQueue = telemetryQueueServiceFactory({
keyStore,
telemetryDAL,
queueService
});
const tokenService = tokenServiceFactory({ tokenDAL: authTokenDAL, userDAL });
const userService = userServiceFactory({ userDAL });
const loginService = authLoginServiceFactory({ userDAL, smtpService, tokenService });
@ -263,7 +281,8 @@ export const registerRoutes = async (
userDAL,
authService: loginService,
serverCfgDAL: superAdminDAL,
orgService
orgService,
keyStore
});
const apiKeyService = apiKeyServiceFactory({ apiKeyDAL, userDAL });
@ -491,9 +510,13 @@ export const registerRoutes = async (
});
await superAdminService.initServerCfg();
await auditLogQueue.startAuditLogPruneJob();
//
// setup the communication with license key server
await licenseService.init();
await auditLogQueue.startAuditLogPruneJob();
await telemetryQueue.startTelemetryCheck();
// inject all services
server.decorate<FastifyZodProvider["services"]>("services", {
login: loginService,

View File

@ -16,7 +16,7 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
schema: {
response: {
200: z.object({
config: SuperAdminSchema
config: SuperAdminSchema.omit({ createdAt: true, updatedAt: true })
})
}
},
@ -90,7 +90,7 @@ export const registerAdminRouter = async (server: FastifyZodProvider) => {
userAgent: req.headers["user-agent"] || ""
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.AdminInit,
distinctId: user.user.email,
properties: {

View File

@ -51,7 +51,7 @@ export const registerIdentityRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.MachineIdentityCreated,
distinctId: getTelemetryDistinctId(req),
properties: {

View File

@ -82,7 +82,7 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.IntegrationCreated,
distinctId: getTelemetryDistinctId(req),
properties: {

View File

@ -32,7 +32,7 @@ export const registerInviteOrgRouter = async (server: FastifyZodProvider) => {
actorOrgId: req.permission.orgId
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.UserOrgInvitation,
distinctId: getTelemetryDistinctId(req),
properties: {

View File

@ -154,7 +154,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
slug: req.body.slug
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.ProjectCreated,
distinctId: getTelemetryDistinctId(req),
properties: {

View File

@ -95,7 +95,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -185,7 +185,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -261,7 +261,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -336,7 +336,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -406,7 +406,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -512,7 +512,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
(req.headers["user-agent"] !== "k8-operator" || shouldRecordK8Event);
const approximateNumberTotalSecrets = secrets.length * 20;
if (shouldCapture) {
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -589,7 +589,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretPulled,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -752,7 +752,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -934,7 +934,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -1052,7 +1052,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -1172,7 +1172,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretCreated,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -1292,7 +1292,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretUpdated,
distinctId: getTelemetryDistinctId(req),
properties: {
@ -1400,7 +1400,7 @@ export const registerSecretRouter = async (server: FastifyZodProvider) => {
}
});
server.services.telemetry.sendPostHogEvents({
await server.services.telemetry.sendPostHogEvents({
event: PostHogEventTypes.SecretDeleted,
distinctId: getTelemetryDistinctId(req),
properties: {

View File

@ -238,6 +238,8 @@ export const projectMembershipServiceFactory = ({
if (orgMembers.length !== emails.length) throw new BadRequestError({ message: "Some users are not part of org" });
if (!orgMembers.length) return [];
const existingMembers = await projectMembershipDAL.find({
projectId,
$in: { userId: orgMembers.map(({ user }) => user.id).filter(Boolean) }

View File

@ -19,7 +19,7 @@ export const secretTagServiceFactory = ({ secretTagDAL, permissionService }: TSe
const { permission } = await permissionService.getProjectPermission(actor, actorId, projectId, actorOrgId);
ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Create, ProjectPermissionSub.Tags);
const existingTag = await secretTagDAL.findOne({ slug });
const existingTag = await secretTagDAL.findOne({ slug, projectId });
if (existingTag) throw new BadRequestError({ message: "Tag already exist" });
const newTag = await secretTagDAL.create({

View File

@ -1,4 +1,5 @@
import { TSuperAdmin, TSuperAdminUpdate } from "@app/db/schemas";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { BadRequestError } from "@app/lib/errors";
@ -14,6 +15,7 @@ type TSuperAdminServiceFactoryDep = {
userDAL: TUserDALFactory;
authService: Pick<TAuthLoginFactory, "generateUserTokens">;
orgService: Pick<TOrgServiceFactory, "createOrganization">;
keyStore: Pick<TKeyStoreFactory, "getItem" | "setItemWithExpiry" | "deleteItem">;
};
export type TSuperAdminServiceFactory = ReturnType<typeof superAdminServiceFactory>;
@ -21,26 +23,53 @@ export type TSuperAdminServiceFactory = ReturnType<typeof superAdminServiceFacto
// eslint-disable-next-line
export let getServerCfg: () => Promise<TSuperAdmin>;
const ADMIN_CONFIG_KEY = "infisical-admin-cfg";
const ADMIN_CONFIG_KEY_EXP = 60; // 60s
const ADMIN_CONFIG_DB_UUID = "00000000-0000-0000-0000-000000000000";
export const superAdminServiceFactory = ({
serverCfgDAL,
userDAL,
authService,
orgService
orgService,
keyStore
}: TSuperAdminServiceFactoryDep) => {
const initServerCfg = async () => {
// TODO(akhilmhdh): bad pattern time less change this later to me itself
getServerCfg = () => serverCfgDAL.findOne({});
getServerCfg = async () => {
const config = await keyStore.getItem(ADMIN_CONFIG_KEY);
// missing in keystore means fetch from db
if (!config) {
const serverCfg = await serverCfgDAL.findById(ADMIN_CONFIG_DB_UUID);
if (serverCfg) {
await keyStore.setItemWithExpiry(ADMIN_CONFIG_KEY, ADMIN_CONFIG_KEY_EXP, JSON.stringify(serverCfg)); // insert it back to keystore
}
return serverCfg;
}
const serverCfg = await serverCfgDAL.findOne({});
const keyStoreServerCfg = JSON.parse(config) as TSuperAdmin;
return {
...keyStoreServerCfg,
// convert the serialized date strings back to Date objects so downstream consumers (e.g. the admin router) still work
createdAt: new Date(keyStoreServerCfg.createdAt),
updatedAt: new Date(keyStoreServerCfg.updatedAt)
};
};
// clear any previously cached config on startup
await keyStore.deleteItem(ADMIN_CONFIG_KEY);
const serverCfg = await serverCfgDAL.findById(ADMIN_CONFIG_DB_UUID);
if (serverCfg) return;
const newCfg = await serverCfgDAL.create({ initialized: false, allowSignUp: true });
// @ts-expect-error id is kept as fixed for idempotence and to avoid race condition
const newCfg = await serverCfgDAL.create({ initialized: false, allowSignUp: true, id: ADMIN_CONFIG_DB_UUID });
return newCfg;
};
const updateServerCfg = async (data: TSuperAdminUpdate) => {
const serverCfg = await getServerCfg();
const cfg = await serverCfgDAL.updateById(serverCfg.id, data);
return cfg;
const updatedServerCfg = await serverCfgDAL.updateById(ADMIN_CONFIG_DB_UUID, data);
await keyStore.setItemWithExpiry(ADMIN_CONFIG_KEY, ADMIN_CONFIG_KEY_EXP, JSON.stringify(updatedServerCfg));
return updatedServerCfg;
};
const adminSignUp = async ({
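The changes above turn `getServerCfg` into a read-through cache: the keystore is checked first, a miss falls back to the fixed-UUID row in the database and is written back with a 60-second TTL, and `updateServerCfg` writes to the database and then refreshes the cache. A generic sketch of that pattern (the keystore shape is an assumption modeled on the factory earlier in this diff):

```ts
type TKeyStore = {
  getItem: (key: string) => Promise<string | null>;
  setItemWithExpiry: (key: string, exp: number, value: string) => Promise<unknown>;
  deleteItem: (key: string) => Promise<unknown>;
};

export const readThroughCache = <T>(keyStore: TKeyStore, key: string, ttlSeconds: number) => {
  const read = async (loadFromDb: () => Promise<T | undefined>) => {
    const cached = await keyStore.getItem(key);
    if (cached) return JSON.parse(cached) as T; // cache hit
    const fresh = await loadFromDb(); // cache miss: fall back to the database
    if (fresh) await keyStore.setItemWithExpiry(key, ttlSeconds, JSON.stringify(fresh));
    return fresh;
  };
  const write = async (persist: () => Promise<T>) => {
    const updated = await persist(); // write to the database first
    await keyStore.setItemWithExpiry(key, ttlSeconds, JSON.stringify(updated)); // then refresh the cache
    return updated;
  };
  const invalidate = () => keyStore.deleteItem(key); // e.g. on startup, as initServerCfg does
  return { read, write, invalidate };
};
```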

View File

@ -0,0 +1,39 @@
import { TDbClient } from "@app/db";
import { TableName } from "@app/db/schemas";
import { DatabaseError } from "@app/lib/errors";
export type TTelemetryDALFactory = ReturnType<typeof telemetryDALFactory>;
export const telemetryDALFactory = (db: TDbClient) => {
const getTelemetryInstanceStats = async () => {
try {
const userCount = (await db(TableName.Users).where({ isGhost: false }).count().first())?.count as string;
const users = parseInt(userCount || "0", 10);
const identityCount = (await db(TableName.Identity).count().first())?.count as string;
const identities = parseInt(identityCount || "0", 10);
const projectCount = (await db(TableName.Project).count().first())?.count as string;
const projects = parseInt(projectCount || "0", 10);
const secretCount = (await db(TableName.Secret).count().first())?.count as string;
const secrets = parseInt(secretCount || "0", 10);
const organizationNames = await db(TableName.Organization).select("name");
const organizations = organizationNames.length;
return {
users,
identities,
projects,
secrets,
organizations,
organizationNames: organizationNames.map(({ name }) => name)
};
} catch (error) {
throw new DatabaseError({ error, name: "TelemtryInstanceStats" });
}
};
return { getTelemetryInstanceStats };
};
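One detail worth noting in the DAL above: with Knex on Postgres, `count()` returns bigint values as strings, which is why each result is cast and run through `parseInt`. A standalone sketch of the same pattern (table name and connection string are assumptions):

```ts
import knex from "knex";

const db = knex({ client: "pg", connection: process.env.DB_CONNECTION_URI });

const countUsers = async () => {
  // on Postgres the row comes back as { count: "42" } (bigint serialized as a string)
  const row = await db("users").where({ isGhost: false }).count().first();
  return parseInt((row?.count as string) || "0", 10);
};
```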

View File

@ -0,0 +1,78 @@
import { PostHog } from "posthog-node";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { logger } from "@app/lib/logger";
import { QueueJobs, QueueName, TQueueServiceFactory } from "@app/queue";
import { getServerCfg } from "../super-admin/super-admin-service";
import { TTelemetryDALFactory } from "./telemetry-dal";
import { TELEMETRY_SECRET_OPERATIONS_KEY, TELEMETRY_SECRET_PROCESSED_KEY } from "./telemetry-service";
import { PostHogEventTypes } from "./telemetry-types";
type TTelemetryQueueServiceFactoryDep = {
queueService: TQueueServiceFactory;
keyStore: Pick<TKeyStoreFactory, "getItem" | "deleteItem">;
telemetryDAL: TTelemetryDALFactory;
};
export type TTelemetryQueueServiceFactory = ReturnType<typeof telemetryQueueServiceFactory>;
export const telemetryQueueServiceFactory = ({
queueService,
keyStore,
telemetryDAL
}: TTelemetryQueueServiceFactoryDep) => {
const appCfg = getConfig();
const postHog =
appCfg.isProductionMode && appCfg.TELEMETRY_ENABLED
? new PostHog(appCfg.POSTHOG_PROJECT_API_KEY, { host: appCfg.POSTHOG_HOST, flushAt: 1, flushInterval: 0 })
: undefined;
queueService.start(QueueName.TelemetryInstanceStats, async () => {
const { instanceId } = await getServerCfg();
const telemtryStats = await telemetryDAL.getTelemetryInstanceStats();
// parse the redis values into integers
const numberOfSecretOperationsMade = parseInt((await keyStore.getItem(TELEMETRY_SECRET_OPERATIONS_KEY)) || "0", 10);
const numberOfSecretProcessed = parseInt((await keyStore.getItem(TELEMETRY_SECRET_PROCESSED_KEY)) || "0", 10);
const stats = { ...telemtryStats, numberOfSecretProcessed, numberOfSecretOperationsMade };
// send to postHog
postHog?.capture({
event: PostHogEventTypes.TelemetryInstanceStats,
distinctId: instanceId,
properties: stats
});
// reset the stats
await keyStore.deleteItem(TELEMETRY_SECRET_PROCESSED_KEY);
await keyStore.deleteItem(TELEMETRY_SECRET_OPERATIONS_KEY);
});
// every day at midnight a telemetry job runs on self-hosted instances,
// sending information such as the instance id and the number of secret operations performed
const startTelemetryCheck = async () => {
// fast check for whether this instance is Infisical Cloud; if so, skip
if (appCfg.INFISICAL_CLOUD) return;
// clear previous job
await queueService.stopRepeatableJob(
QueueName.TelemetryInstanceStats,
QueueJobs.TelemetryInstanceStats,
{ pattern: "0 0 * * *", utc: true },
QueueName.TelemetryInstanceStats // just a job id
);
if (postHog) {
await queueService.queue(QueueName.TelemetryInstanceStats, QueueJobs.TelemetryInstanceStats, undefined, {
jobId: QueueName.TelemetryInstanceStats,
repeat: { pattern: "0 0 * * *", utc: true }
});
}
};
queueService.listen(QueueName.TelemetryInstanceStats, "failed", (err) => {
logger.error(err?.failedReason, `${QueueName.TelemetryInstanceStats}: failed`);
});
return {
startTelemetryCheck
};
};
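The queue wiring above registers a daily repeatable job keyed by a fixed job id. For readers unfamiliar with the pattern, here is roughly what that scheduling looks like written directly against BullMQ, which the project's `queueService` appears to wrap; the queue name, job name, and connection details are assumptions for illustration:

```ts
import { Queue, Worker } from "bullmq";

const connection = { host: "localhost", port: 6379 };
const statsQueue = new Queue("telemetry-instance-stats", { connection });

export const scheduleDailyStats = async () => {
  // clear any previously registered schedule so it is not duplicated across restarts
  await statsQueue.removeRepeatable(
    "telemetry-instance-stats-job",
    { pattern: "0 0 * * *", utc: true },
    "telemetry-instance-stats"
  );
  // run every day at midnight UTC, keyed by a fixed jobId
  await statsQueue.add("telemetry-instance-stats-job", undefined, {
    jobId: "telemetry-instance-stats",
    repeat: { pattern: "0 0 * * *", utc: true }
  });
};

// worker side: gather the stats, ship them, then reset the counters (as in the factory above)
const worker = new Worker("telemetry-instance-stats", async () => { /* collect & send */ }, { connection });
worker.on("failed", (_job, err) => console.error("telemetry stats job failed", err));
```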

View File

@ -1,15 +1,24 @@
import { PostHog } from "posthog-node";
import { TLicenseServiceFactory } from "@app/ee/services/license/license-service";
import { InstanceType } from "@app/ee/services/license/license-types";
import { TKeyStoreFactory } from "@app/keystore/keystore";
import { getConfig } from "@app/lib/config/env";
import { request } from "@app/lib/config/request";
import { logger } from "@app/lib/logger";
import { TPostHogEvent } from "./telemetry-types";
import { PostHogEventTypes, TPostHogEvent, TSecretModifiedEvent } from "./telemetry-types";
export const TELEMETRY_SECRET_PROCESSED_KEY = "telemetry-secret-processed";
export const TELEMETRY_SECRET_OPERATIONS_KEY = "telemetry-secret-operations";
export type TTelemetryServiceFactory = ReturnType<typeof telemetryServiceFactory>;
export type TTelemetryServiceFactoryDep = {
keyStore: Pick<TKeyStoreFactory, "getItem" | "incrementBy">;
licenseService: Pick<TLicenseServiceFactory, "getInstanceType">;
};
// type TTelemetryServiceFactoryDep = {};
export const telemetryServiceFactory = () => {
export const telemetryServiceFactory = ({ keyStore, licenseService }: TTelemetryServiceFactoryDep) => {
const appCfg = getConfig();
if (appCfg.isProductionMode && !appCfg.TELEMETRY_ENABLED) {
@ -21,10 +30,9 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
`);
}
const postHog =
appCfg.isProductionMode && appCfg.TELEMETRY_ENABLED
? new PostHog(appCfg.POSTHOG_PROJECT_API_KEY, { host: appCfg.POSTHOG_HOST })
: undefined;
const postHog = appCfg.TELEMETRY_ENABLED
? new PostHog(appCfg.POSTHOG_PROJECT_API_KEY, { host: appCfg.POSTHOG_HOST })
: undefined;
// used for sending marketing emails
const sendLoopsEvent = async (email: string, firstName?: string, lastName?: string) => {
@ -51,13 +59,33 @@ To opt into telemetry, you can set "TELEMETRY_ENABLED=true" within the environme
}
};
const sendPostHogEvents = (event: TPostHogEvent) => {
const sendPostHogEvents = async (event: TPostHogEvent) => {
if (postHog) {
postHog.capture({
event: event.event,
distinctId: event.distinctId,
properties: event.properties
});
const instanceType = licenseService.getInstanceType();
// capture PostHog events only on cloud, or when a signup event happens on a self-hosted instance
if (instanceType === InstanceType.Cloud || event.event === PostHogEventTypes.UserSignedUp) {
postHog.capture({
event: event.event,
distinctId: event.distinctId,
properties: event.properties
});
return;
}
if (
[
PostHogEventTypes.SecretPulled,
PostHogEventTypes.SecretCreated,
PostHogEventTypes.SecretDeleted,
PostHogEventTypes.SecretUpdated
].includes(event.event)
) {
await keyStore.incrementBy(
TELEMETRY_SECRET_PROCESSED_KEY,
(event as TSecretModifiedEvent).properties.numberOfSecrets
);
await keyStore.incrementBy(TELEMETRY_SECRET_OPERATIONS_KEY, 1);
}
}
};

View File

@ -12,7 +12,8 @@ export enum PostHogEventTypes {
ProjectCreated = "Project Created",
IntegrationCreated = "Integration Created",
MachineIdentityCreated = "Machine Identity Created",
UserOrgInvitation = "User Org Invitation"
UserOrgInvitation = "User Org Invitation",
TelemetryInstanceStats = "Self Hosted Instance Stats"
}
export type TSecretModifiedEvent = {
@ -101,6 +102,20 @@ export type TUserOrgInvitedEvent = {
};
};
export type TTelemetryInstanceStatsEvent = {
event: PostHogEventTypes.TelemetryInstanceStats;
properties: {
users: number;
identities: number;
projects: number;
secrets: number;
organizations: number;
organizationNames: string[];
numberOfSecretOperationsMade: number;
numberOfSecretProcessed: number;
};
};
export type TPostHogEvent = { distinctId: string } & (
| TSecretModifiedEvent
| TAdminInitEvent
@ -110,4 +125,5 @@ export type TPostHogEvent = { distinctId: string } & (
| TMachineIdentityCreatedEvent
| TIntegrationCreatedEvent
| TProjectCreateEvent
| TTelemetryInstanceStatsEvent
);

View File

@ -87,16 +87,12 @@ var exportCmd = &cobra.Command{
var output string
if shouldExpandSecrets {
substitutions := util.ExpandSecrets(secrets, infisicalToken, "")
output, err = formatEnvs(substitutions, format)
if err != nil {
util.HandleError(err)
}
} else {
output, err = formatEnvs(secrets, format)
if err != nil {
util.HandleError(err)
}
secrets = util.ExpandSecrets(secrets, infisicalToken, "")
}
secrets = util.FilterSecretsByTag(secrets, tagSlugs)
output, err = formatEnvs(secrets, format)
if err != nil {
util.HandleError(err)
}
fmt.Print(output)

View File

@ -220,6 +220,30 @@ func InjectImportedSecret(plainTextWorkspaceKey []byte, secrets []models.SingleE
return secrets, nil
}
func FilterSecretsByTag(plainTextSecrets []models.SingleEnvironmentVariable, tagSlugs string) []models.SingleEnvironmentVariable {
if tagSlugs == "" {
return plainTextSecrets
}
tagSlugsMap := make(map[string]bool)
tagSlugsList := strings.Split(tagSlugs, ",")
for _, slug := range tagSlugsList {
tagSlugsMap[slug] = true
}
filteredSecrets := []models.SingleEnvironmentVariable{}
for _, secret := range plainTextSecrets {
for _, tag := range secret.Tags {
if tagSlugsMap[tag.Slug] {
filteredSecrets = append(filteredSecrets, secret)
break
}
}
}
return filteredSecrets
}
func GetAllEnvironmentVariables(params models.GetAllSecretsParameters, projectConfigFilePath string) ([]models.SingleEnvironmentVariable, error) {
var infisicalToken string
if params.InfisicalToken == "" {

View File

@ -86,6 +86,7 @@ services:
environment:
- NODE_ENV=development
- DB_CONNECTION_URI=postgres://infisical:infisical@db/infisical?sslmode=disable
- TELEMETRY_ENABLED=false
volumes:
- ./backend/src:/app/src

View File

@ -4,10 +4,12 @@ services:
db-migration:
container_name: infisical-db-migration
depends_on:
- db
db:
condition: service_healthy
image: infisical/infisical:latest-postgres
env_file: .env
command: npm run migration:latest
pull_policy: always
networks:
- infisical
@ -16,12 +18,13 @@ services:
restart: unless-stopped
depends_on:
db:
condition: service_started
condition: service_healthy
redis:
condition: service_started
db-migration:
condition: service_completed_successfully
image: infisical/infisical:latest-postgres
pull_policy: always
env_file: .env
ports:
- 80:8080
@ -49,9 +52,14 @@ services:
restart: always
env_file: .env
volumes:
- pg_data:/data/db
- pg_data:/var/lib/postgresql/data
networks:
- infisical
healthcheck:
test: "pg_isready --username=${POSTGRES_USER} && psql --username=${POSTGRES_USER} --list"
interval: 5s
timeout: 10s
retries: 10
volumes:
pg_data:

View File

@ -16,49 +16,7 @@ git checkout -b MY_BRANCH_NAME
## Set up environment variables
Start by creating a .env file at the root of the Infisical directory then copy the contents of the file below into the .env file.
<Accordion title=".env file content">
```env
# Keys
# Required key for platform encryption/decryption ops
ENCRYPTION_KEY=6c1fe4e407b8911c104518103505b218
# JWT
# Required secrets to sign JWT tokens
JWT_SIGNUP_SECRET=3679e04ca949f914c03332aaaeba805a
JWT_REFRESH_SECRET=5f2f3c8f0159068dc2bbb3a652a716ff
JWT_AUTH_SECRET=4be6ba5602e0fa0ac6ac05c3cd4d247f
JWT_SERVICE_SECRET=f32f716d70a42c5703f4656015e76200
# MongoDB
# Backend will connect to the MongoDB instance at connection string MONGO_URL which can either be a ref
# to the MongoDB container instance or Mongo Cloud
# Required
MONGO_URL=mongodb://root:example@mongo:27017/?authSource=admin
# Optional credentials for MongoDB container instance and Mongo-Express
MONGO_USERNAME=root
MONGO_PASSWORD=example
# Website URL
# Required
SITE_URL=http://localhost:8080
# Mail/SMTP
SMTP_HOST='smtp-server'
SMTP_PORT='1025'
SMTP_NAME='local'
SMTP_USERNAME='team@infisical.com'
SMTP_PASSWORD=
```
</Accordion>
<Warning>
The pre-populated environment variable values above are meant to be used in development only. They should never be used in production.
</Warning>
View all available [environment variables](https://infisical.com/docs/self-hosting/configuration/envars) and guidance for each.
Start by creating a .env file at the root of the Infisical directory, then copy the contents of the file linked [here](https://github.com/Infisical/infisical/blob/main/.env.example). View all available [environment variables](https://infisical.com/docs/self-hosting/configuration/envars) and guidance for each.
## Starting Infisical for development
@ -72,10 +30,7 @@ docker-compose -f docker-compose.dev.yml up --build --force-recreate
```
#### Access local server
Once all the services have spun up, browse to http://localhost:8080. To sign in, you may use the default credentials listed below.
Email: `test@localhost.local`
Password: `testInfisical1`
Once all the services have spun up, browse to http://localhost:8080.
#### Shutdown local server

View File

@ -2,7 +2,7 @@
title: "Introduction"
---
Infisical is an [open-source](https://opensource.com/resources/what-open-source), [end-to-end encrypted](https://en.wikipedia.org/wiki/End-to-end_encryption) secret management platform for storing, managing, and syncing
Infisical is an [open-source](https://opensource.com/resources/what-open-source), [end-to-end encrypted](https://en.wikipedia.org/wiki/End-to-end_encryption) secrets management platform for storing, managing, and syncing
application configuration and secrets like API keys, database credentials, and environment variables across applications and infrastructure.
Start syncing environment variables with [Infisical Cloud](https://app.infisical.com) or learn how to [host Infisical](/self-hosting/overview) yourself.

View File

@ -0,0 +1,36 @@
---
title: "LDAP"
description: "Log in to Infisical with LDAP"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact team@infisical.com to purchase an enterprise license to use it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol); this includes support for Active Directory.
<Steps>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
Next, input your LDAP server settings.
![LDAP configuration](/images/platform/ldap/ldap-config.png)
Here's some guidance for each field:
- URL: The LDAP server to connect to, such as `ldap://ldap.your-org.com`, `ldaps://ldap.myorg.com:636` (for connection over SSL/TLS), etc.
- Bind DN: The distinguished name of the object to bind when performing the user search, such as `cn=infisical,ou=Users,dc=acme,dc=com`.
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: The base DN under which to perform the user search, such as `ou=Users,dc=example,dc=com`.
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate.
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>
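Not part of Infisical's implementation, but to make the four fields above concrete, here is a rough sketch of how a typical LDAP client uses them to bind and then search for a user, assuming the `ldapjs` library and placeholder values:

```ts
import ldap from "ldapjs";

const client = ldap.createClient({ url: "ldaps://ldap.myorg.com:636" }); // URL field

// Bind DN + Bind Pass authenticate the service account used for the search
client.bind("cn=infisical,ou=Users,dc=acme,dc=com", "bind-password", (bindErr) => {
  if (bindErr) throw bindErr;

  // Search Base / User DN scopes where user entries are looked up
  client.search(
    "ou=Users,dc=acme,dc=com",
    { scope: "sub", filter: "(mail=jane@acme.com)" },
    (searchErr, res) => {
      if (searchErr) throw searchErr;
      res.on("searchEntry", (entry) => console.log("found a matching user entry:", entry));
      res.on("end", () => client.unbind());
    }
  );
});
```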

View File

@ -0,0 +1,36 @@
---
title: "General LDAP"
description: "Log in to Infisical with LDAP"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact team@infisical.com to purchase an enterprise license to use it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol).
<Steps>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
Next, input your LDAP server settings.
![LDAP configuration](/images/platform/ldap/ldap-config.png)
Here's some guidance for each field:
- URL: The LDAP server to connect to, such as `ldap://ldap.your-org.com`, `ldaps://ldap.myorg.com:636` (for connection over SSL/TLS), etc.
- Bind DN: The distinguished name of the object to bind when performing the user search, such as `cn=infisical,ou=Users,dc=acme,dc=com`.
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: The base DN under which to perform the user search, such as `ou=Users,dc=example,dc=com`.
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate.
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>

View File

@ -0,0 +1,56 @@
---
title: "JumpCloud LDAP"
description: "Configure JumpCloud LDAP for Logging into Infisical"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact team@infisical.com to purchase an enterprise license to use it.
</Info>
<Steps>
<Step title="Prepare LDAP in JumpCloud">
In JumpCloud, head to USER MANAGEMENT > Users and create a new user via the **Manual user entry** option. This user
will be used as a privileged service account to facilitate Infisical's ability to bind/search the LDAP directory.
When creating the user, input their **First Name**, **Last Name**, **Username** (required), **Company Email** (required), and **Description**.
Also, create a password for the user.
Next, under User Security Settings and Permissions > Permission Settings, check the box next to **Enable as LDAP Bind DN**.
![LDAP JumpCloud](/images/platform/ldap/jumpcloud/ldap-jumpcloud-enable-bind-dn.png)
</Step>
<Step title="Prepare the LDAP configuration in Infisical">
In Infisical, head to your Organization Settings > Authentication > LDAP Configuration and select **Set up LDAP**.
Next, input your JumpCloud LDAP server settings.
![LDAP configuration](/images/platform/ldap/ldap-config.png)
Here's some guidance for each field:
- URL: The LDAP server to connect to (`ldaps://ldap.jumpcloud.com:636`).
- Bind DN: The distinguished name of the object to bind when performing the user search (`uid=<ldap-user-username>,ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com`).
- Bind Pass: The password to use along with `Bind DN` when performing the user search.
- Search Base / User DN: The base DN under which to perform the user search (`ou=Users,o=<your-org-id>,dc=jumpcloud,dc=com`).
- CA Certificate: The CA certificate to use when verifying the LDAP server certificate (instructions to obtain the certificate for JumpCloud [here](https://jumpcloud.com/support/connect-to-ldap-with-tls-ssl)).
<Tip>
When filling out the **Bind DN** and **Bind Pass** fields, refer to the username and password of the user created in Step 1.
Also, for the **Bind DN** and **Search Base / User DN** fields, you'll want to use the organization ID that appears
in your LDAP instance **ORG DN**.
</Tip>
</Step>
<Step title="Enable LDAP in Infisical">
Enabling LDAP allows members in your organization to log into Infisical via LDAP.
![LDAP toggle](/images/platform/ldap/ldap-toggle.png)
</Step>
</Steps>
Resources:
- [JumpCloud Cloud LDAP Guide](https://jumpcloud.com/support/use-cloud-ldap)

View File

@ -0,0 +1,23 @@
---
title: "LDAP Overview"
description: "Log in to Infisical with LDAP"
---
<Info>
LDAP is a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical,
then you should contact sales@infisical.com to purchase an enterprise license to use it.
</Info>
You can configure your organization in Infisical to have members authenticate with the platform via [LDAP](https://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol).
Note that configuring LDAP retains the end-to-end encrypted architecture of Infisical because we decouple the authentication and decryption steps; the LDAP server cannot and will not have access to the decryption key needed to decrypt your secrets.
LDAP providers:
- Active Directory
- [JumpCloud LDAP](/documentation/platform/ldap/jumpcloud)
- AWS Directory Service
- Foxpass
Check out the general instructions for configuring LDAP [here](/documentation/platform/ldap/general).

View File

@ -0,0 +1,21 @@
---
title: "Enhancing Security and Usability: Project Upgrades"
---
At Infisical, we're constantly striving to elevate the security and usability standards of our platform to better serve our users.
With this commitment in mind, we're excited to introduce our latest addition, non-E2EE projects, aimed at addressing two significant issues while enhancing how clients interact with Infisical programmatically.
Previously, users encountered a challenge where projects risked becoming inaccessible if the project creator deleted their account.
Additionally, our API lacked the capability to interact with projects without dealing with complex cryptographic operations.
These obstacles made API-driven automation and collaboration a painful experience for a majority of our users.
To overcome these limitations, our upgrade focuses on disabling end-to-end encryption (E2EE) for projects.
While this may raise eyebrows, it's important to understand that this decision is a strategic move to make Infisical easier to use and interact with.
But what does this mean for our users? Essentially nothing; there are no changes required on your end.
Rest assured, all sensitive data remains encrypted at rest according to the latest industry standards.
Our commitment to security remains unwavering, and this upgrade is a testament to our dedication to delivering on our promises in both security and usability when it comes to secrets management.
To increase consistency with existing and future integrations, all projects created on Infisical from now on will have end-to-end encryption (E2EE) disabled by default.
This will not only reduce confusion for end users, but will also make the Infisical API seamless to use.

View File

@ -1,21 +1,17 @@
---
title: "PostgreSQL/CockroachDB"
description: "Rotated database user password of a postgreSQL or cockroach db"
description: "Rotated database user password of a PostgreSQL or Cockroach DB"
---
Infisical will periodically update the provided database user's password.
<Warning>
At present, Infisical requires access to your database. We will soon release Infisical agent-based rotation, which will let you rotate credentials without direct database access from Infisical Cloud.
</Warning>
## How it works
1. User's has to create the two user's for Infisical to rotate and provide them required database access
2. Infisical will connect with your database with admin access
3. If last rotated one was username1, then username2 is chosen to be rotated
5. Update it's password with random value
6. After testing it gets saved to the provided secret mapping
1. The user has to create the two database users for Infisical to rotate and grant them the required database access.
2. Infisical will connect to your database with admin access.
3. If the last rotated user was username1, then username2 is chosen to be rotated next.
4. Its password is updated with a random value.
5. After testing, the new credentials are saved to the provided secret mapping (see the sketch after this list).
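To make the alternation above concrete, here is a minimal illustrative sketch (not Infisical's implementation) using the `pg` client with hypothetical `username1`/`username2` users:

```ts
import { randomBytes } from "crypto";
import { Client } from "pg";

const rotate = async (adminConnectionString: string, lastRotated: "username1" | "username2") => {
  const target = lastRotated === "username1" ? "username2" : "username1"; // pick the other user
  const newPassword = randomBytes(24).toString("hex"); // random value

  const admin = new Client({ connectionString: adminConnectionString });
  await admin.connect();
  // NOTE: utility statements cannot be parameterized; real code must quote/escape properly
  await admin.query(`ALTER USER ${target} WITH PASSWORD '${newPassword}'`);
  await admin.end();

  // test the rotated credentials before handing them over
  const test = new Client({ user: target, password: newPassword }); // host/db as appropriate
  await test.connect();
  await test.query("SELECT 1");
  await test.end();

  // at this point the rotated username/password would be written to the mapped secrets
  return { username: target, password: newPassword };
};
```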
## Rotation Configuration
@ -34,4 +30,4 @@ Infisical will update periodically the provided database user's password.
- Finally, select the secrets in your provided board to replace with the new secret after each rotation.
- You're done and good to go.
Congrats. You have 10x your PostgreSQL/CockroachDB access security.
Congratulations. You have improved your PostgreSQL/CockroachDB access security.

Some files were not shown because too many files have changed in this diff.