Compare commits

...

119 Commits

Author SHA1 Message Date
Scott Wilson
6c4cb5e084 improvements: address feedback 2024-11-28 08:54:27 -08:00
Scott Wilson
18a2547b24 improvement: move user groups to own tab and add pagination/search/sort to groups tables 2024-11-27 20:35:15 -08:00
Scott Wilson
588b3c77f9 improvement: add pagination/sort to org members table 2024-11-27 19:23:54 -08:00
Scott Wilson
a04834c7c9 improvement: add pagination to project members table 2024-11-27 18:41:20 -08:00
Scott Wilson
c3956c60e9 improvement: add pagination, sort and filtering to identity projects table with minor UI adjustments 2024-11-27 12:05:46 -08:00
Scott Wilson
ecea79f040 fix: hide pagination when no search match 2024-11-26 17:20:49 -08:00
Scott Wilson
586b901318 improvement: add pagination, filtering and sort to users projects table with minor UI improvements 2024-11-26 17:17:18 -08:00
Maidul Islam
ad8d247cdc Merge pull request #2801 from Infisical/omar/eng-1952-address-key-vault-integration-failing-due-to-disabled-secret
Fix(Azure Key Vault): Ignore disabled secrets
2024-11-26 18:54:28 -05:00
McPizza0
33411335ed avoid syncing disabled azure keys 2024-11-27 00:15:10 +01:00
McPizza0
728f023263 remove superfolous trycatch 2024-11-26 23:46:23 +01:00
McPizza0
229706f57f improve filtering 2024-11-26 23:35:32 +01:00
McPizza0
6cf2488326 Fix(Azure Key Vault): Ignore disabled secrets 2024-11-26 23:22:07 +01:00
McPizza
92ce05283b feat: Add new tag when creating secret (#2791)
* feat: Add new tag when creating secret
2024-11-26 21:10:14 +01:00
Maidul Islam
39d92ce6ff Merge pull request #2799 from Infisical/misc/finalize-env-default
misc: finalized env schema handling
2024-11-26 15:10:06 -05:00
Sheen Capadngan
44a026446e misc: finalized env schema handling of bool 2024-11-27 04:06:05 +08:00
Scott Wilson
539e5b1907 Merge pull request #2782 from Infisical/fix-remove-payment-method
Fix: Resolve Remove Payment Method Error
2024-11-26 10:54:55 -08:00
Scott Wilson
44b02d5324 Merge pull request #2780 from Infisical/octopus-deploy-integration
Feature: Octopus Deploy Integration
2024-11-26 08:46:25 -08:00
Scott Wilson
71fb6f1d11 Merge branch 'main' into octopus-deploy-integration 2024-11-26 08:36:09 -08:00
Maidul Islam
e64100fab1 Merge pull request #2796 from akhilmhdh/feat/empty-env-stuck
Random patches
2024-11-26 10:54:16 -05:00
=
5bcf07b32b feat: resolved loading screen frozen on no environment and project switch causes forEach undefined error 2024-11-26 20:27:30 +05:30
=
3b0c48052b fix: frontend failing to give token back in cli login 2024-11-26 20:26:14 +05:30
Maidul Islam
df50e3b0f9 Merge pull request #2793 from akhilmhdh/fix/signup-allow-saml
feat: resolved saml failing when signup is disabled
2024-11-26 09:39:52 -05:00
Daniel Hougaard
bdf2ae40b6 Merge pull request #2751 from Infisical/daniel/sap-ase-db
feat(dynamic-secrets): SAP ASE
2024-11-26 15:13:02 +04:00
Scott Wilson
b6c05a2f25 improvements: address feedback/requests 2024-11-25 16:35:56 -08:00
Daniel Hougaard
960efb9cf9 docs(dynamic-secrets): SAP ASE 2024-11-26 01:54:16 +04:00
Daniel Hougaard
aa8d58abad feat: TDS driver docker support 2024-11-26 01:54:16 +04:00
Daniel Hougaard
cfb0cc4fea Update types.ts 2024-11-26 01:54:16 +04:00
Daniel Hougaard
7712df296c feat: SAP ASE Dynamic Secrets 2024-11-26 01:54:16 +04:00
Daniel Hougaard
7c38932121 fix: minor types improvement 2024-11-26 01:54:16 +04:00
Daniel Hougaard
69ad9845e1 improvement: added $ pattern to existing dynamic providers 2024-11-26 01:54:16 +04:00
Daniel Hougaard
7321c237d7 Merge pull request #2792 from Infisical/daniel/dynamic-secret-renewals
fix(dynamic-secrets): renewal 500 error
2024-11-26 01:52:28 +04:00
McPizza
32430a6a16 feat: Add Project Descriptions (#2774)
* feat:  initial backend project description
2024-11-25 21:59:14 +01:00
=
f034adba76 feat: resolved saml failing when signup is disabled 2024-11-25 22:22:54 +05:30
Daniel Hougaard
463eb0014e fix(dynamic-secrets): renewal 500 error 2024-11-25 20:17:50 +04:00
Daniel Hougaard
21403f6fe5 Merge pull request #2761 from Infisical/daniel/cli-login-domains-fix
fix: allow preset domains for `infisical login`
2024-11-25 16:16:08 +04:00
Daniel Hougaard
2f9e542b31 Merge pull request #2760 from Infisical/daniel/request-ids
feat: request ID support
2024-11-25 16:13:19 +04:00
Daniel Hougaard
089d6812fd Update ldap-fns.ts 2024-11-25 16:00:20 +04:00
Maidul Islam
71c9c0fa1e Merge pull request #2781 from Infisical/daniel/project-slug-500-error
fix: improve project DAL error handling
2024-11-24 19:43:26 -05:00
Scott Wilson
46ad1d47a9 fix: correct payment ID to remove payment method and add confirmation/notification for removal 2024-11-22 19:47:52 -08:00
Daniel Hougaard
2b977eeb33 fix: improve project error handling 2024-11-23 03:42:54 +04:00
McPizza
a692148597 feat(integrations): Add AWS Secrets Manager IAM Role + Region (#2778) 2024-11-23 00:04:33 +01:00
Scott Wilson
b762816e66 chore: remove unused lib 2024-11-22 14:58:47 -08:00
Scott Wilson
cf275979ba feature: octopus deploy integration 2024-11-22 14:47:15 -08:00
Maidul Islam
64bfa4f334 Merge pull request #2779 from Infisical/fix-delete-project-role
Fix: Prevent Updating Identity/User Project Role to reserved "Custom" Slug
2024-11-22 16:23:22 -05:00
Scott Wilson
e3eb14bfd9 fix: add custom slug check to user 2024-11-22 13:09:47 -08:00
Scott Wilson
24b50651c9 fix: correct update role mapping for identity/user and prevent updating role slug to "custom" 2024-11-22 13:02:00 -08:00
Daniel Hougaard
1cd459fda7 Merge branch 'heads/main' into daniel/request-ids 2024-11-23 00:14:50 +04:00
Daniel Hougaard
38917327d9 feat: request lifecycle request ID 2024-11-22 23:19:07 +04:00
Maidul Islam
d7b494c6f8 Merge pull request #2775 from akhilmhdh/fix/patches-3
fix: db error on token auth and permission issue
2024-11-22 12:43:20 -05:00
=
93208afb36 fix: db error on token auth and permission issue 2024-11-22 22:41:53 +05:30
Maidul Islam
1a084d8fcf add direct link td provider 2024-11-21 21:26:46 -05:00
Sheen
dd4f133c6c Merge pull request #2769 from Infisical/misc/made-identity-metadata-value-not-nullable-again
misc: made identity metadata value not nullable
2024-11-22 01:59:01 +08:00
Sheen Capadngan
c41d27e1ae misc: made identity metadata value not nullable 2024-11-21 21:27:56 +08:00
Sheen
1866ed8d23 Merge pull request #2742 from Infisical/feat/totp-dynamic-secret
feat: TOTP dynamic secret provider
2024-11-21 12:00:12 +08:00
Scott Wilson
7b3b232dde replace loader with spinner 2024-11-20 14:09:26 -08:00
Scott Wilson
9d618b4ae9 minor text revisions/additions and add colors/icons to totp token expiry countdown 2024-11-20 14:01:40 -08:00
Vlad Matsiiako
5330ab2171 Merge pull request #2768 from BnjmnZmmrmn/k8s_integration_docs_typo
fixing small typo in docs/integrations/platforms/kubernetes
2024-11-20 15:49:35 -05:00
Sheen Capadngan
662e588c22 misc: add handling for lease regen 2024-11-21 04:43:21 +08:00
Akhil Mohan
90057d80ff Merge pull request #2767 from akhilmhdh/feat/permission-error
Detail error when permission validation error occurs
2024-11-21 02:00:51 +05:30
Scott Wilson
1eda7aaaac reverse license 2024-11-20 12:14:14 -08:00
Sheen Capadngan
00dcadbc08 misc: added timer 2024-11-21 04:09:19 +08:00
Benjamin Riley Zimmerman
7a7289ebd0 fixing typo in docs/integrations/platforms/kubernetes 2024-11-20 11:50:13 -08:00
Scott Wilson
e5d4677fd6 improvements: minor UI/labeling adjustments, only show tags loading if can read, and remove rounded bottom on overview table 2024-11-20 11:50:10 -08:00
Sheen Capadngan
bce3f3d676 misc: addressed review comments 2024-11-21 02:37:56 +08:00
=
300372fa98 feat: resolve dependency cycle error 2024-11-20 23:59:49 +05:30
Maidul Islam
47a4f8bae9 Merge pull request #2766 from Infisical/omar/eng-1886-make-terraform-integration-secrets-marked-as-sensitive
Improvement(Terraform Cloud Integration): Synced secrets are hidden from Terraform UI
2024-11-20 13:16:45 -05:00
=
863719f296 feat: added action button for notification toast and one action each for forbidden error and validation error details 2024-11-20 22:55:14 +05:30
=
7317dc1cf5 feat: modified error handler to return possible rules for a validation failed rules 2024-11-20 22:50:21 +05:30
Daniel Hougaard
75df898e78 Merge pull request #2762 from Infisical/daniel/cli-installer-readme
chore(cli-installer): readme improvements
2024-11-20 20:29:23 +04:00
McPizza0
0de6add3f7 set all new and existing secrets to be sensitive: true 2024-11-20 17:28:09 +01:00
Daniel Hougaard
0c008b6393 Update README.md 2024-11-20 20:26:00 +04:00
Sheen Capadngan
0c3894496c feat: added support for configuring totp with secret key 2024-11-20 23:40:36 +08:00
Daniel Hougaard
35fbd5d49d Merge pull request #2764 from Infisical/daniel/pre-commit-cli-check
chore: check for CLI installation before pre-commit
2024-11-20 19:01:54 +04:00
Daniel Hougaard
d03b453e3d Merge pull request #2765 from Infisical/daniel/actor-id-mismatch
fix(audit-logs): actor / actor ID mismatch
2024-11-20 18:58:33 +04:00
Daniel Hougaard
96e331b678 fix(audit-logs): actor / actor ID mismatch 2024-11-20 18:50:29 +04:00
Daniel Hougaard
d4d468660d chore: check for CLI installation before pre-commit 2024-11-20 17:29:36 +04:00
Daniel Hougaard
75a4965928 requested changes 2024-11-20 16:23:59 +04:00
Sheen Capadngan
660c09ded4 Merge branch 'feat/totp-dynamic-secret' of https://github.com/Infisical/infisical into feat/totp-dynamic-secret 2024-11-20 18:56:56 +08:00
Sheen Capadngan
b5287d91c0 misc: addressed comments 2024-11-20 18:56:16 +08:00
Scott Wilson
6a17763237 docs: dynamic secret doc typos addressed 2024-11-19 19:58:01 -08:00
Daniel Hougaard
f2bd3daea2 Update README.md 2024-11-20 03:24:05 +04:00
Daniel Hougaard
7f70f96936 fix: allow preset domains for infisical login 2024-11-20 01:06:18 +04:00
Daniel Hougaard
73e0a54518 feat: request ID support 2024-11-20 00:01:25 +04:00
Daniel Hougaard
0d295a2824 fix: application crash on zod api error 2024-11-20 00:00:30 +04:00
Daniel Hougaard
9a62efea4f Merge pull request #2759 from Infisical/docs-update-note
update docs note
2024-11-19 23:51:44 +04:00
Vladyslav Matsiiako
506c30bcdb update docs note 2024-11-19 14:47:39 -05:00
Maidul Islam
735ad4ff65 Merge pull request #1924 from Infisical/misc/metrics-observability
feat: added setup for production observability (metrics via OTEL)
2024-11-19 13:56:29 -05:00
Daniel Hougaard
421d8578b7 Merge pull request #2756 from Infisical/daniel/access-token-cleanup
fix(identity): remove access tokens when auth method is removed
2024-11-19 22:31:51 +04:00
Daniel Hougaard
6685f8aa0a fix(identity): remove access tokens when auth method is removed 2024-11-19 22:24:17 +04:00
Maidul Islam
54f3f94185 Merge pull request #2741 from phamleduy04/sort-repo-github-intergration-app
Add sort to Github integration dropdown box
2024-11-19 11:46:43 -05:00
Scott Wilson
907537f7c0 Merge pull request #2755 from Infisical/empty-secret-value-fixes
Fix: Handle Empty Secret Values in Update, Bulk Create and Bulk Update Secret(s)
2024-11-19 08:45:38 -08:00
Scott Wilson
61263b9384 fix: unhandle empty value in bulk create/insert secrets 2024-11-19 08:30:58 -08:00
Scott Wilson
b6d8be2105 fix: handle empty string to allow clearing secret on update 2024-11-19 08:16:30 -08:00
Maidul Islam
61d516ef35 Merge pull request #2754 from Infisical/daniel/azure-auth-better-error 2024-11-19 09:00:23 -05:00
Daniel Hougaard
31fc64fb4c Update identity-azure-auth-service.ts 2024-11-19 17:54:31 +04:00
Maidul Islam
8bf7e4c4d1 Merge pull request #2743 from akhilmhdh/fix/auth-method-migration
fix: migration in loop due to cornercase
2024-11-18 16:01:04 -05:00
=
2027d4b44e feat: moved auth method deletion to top 2024-11-19 02:17:25 +05:30
Maidul Islam
d401c9074e Merge pull request #2715 from Infisical/misc/finalize-org-migration-script
misc: finalize org migration script
2024-11-18 14:15:20 -05:00
Sheen
afe35dbbb5 Merge pull request #2747 from Infisical/misc/finalized-design-of-totp-registration
misc: finalized design of totp registration
2024-11-19 02:13:54 +08:00
Maidul Islam
6ff1602fd5 Merge pull request #2708 from Infisical/misc/oidc-setup-extra-handling
misc: added OIDC error and edge-case handling
2024-11-18 10:56:09 -05:00
Maidul Islam
6603364749 Merge pull request #2750 from Infisical/daniel/migrate-unlock-command
fix: add migration unlock command
2024-11-18 10:28:43 -05:00
Daniel Hougaard
53bea22b85 fix: added unlock command 2024-11-18 19:22:43 +04:00
Sheen
d521ee7b7e Merge pull request #2748 from Infisical/misc/address-role-slugs-issue-invite-user-endpoint
misc: address role slug issue in invite user endpoint
2024-11-18 21:58:31 +08:00
Sheen Capadngan
827931e416 misc: addressed comment 2024-11-18 21:52:36 +08:00
Sheen Capadngan
faa83344a7 misc: address role slug issue in invite user endpoint 2024-11-18 21:43:06 +08:00
Sheen Capadngan
3be3d807d2 misc: added URL string validation 2024-11-18 19:32:57 +08:00
Sheen Capadngan
9f7ea3c4e5 doc: added docs for totp dynamic secret 2024-11-18 19:27:45 +08:00
Sheen Capadngan
e67218f170 misc: finalized option setting logic 2024-11-18 18:34:27 +08:00
Sheen Capadngan
269c40c67c Merge remote-tracking branch 'origin/main' into feat/totp-dynamic-secret 2024-11-18 17:31:19 +08:00
Sheen Capadngan
089a7e880b misc: added message for bypass 2024-11-18 17:29:01 +08:00
Sheen Capadngan
64ec741f1a misc: updated documentation totp ui 2024-11-18 17:24:03 +08:00
Sheen Capadngan
c98233ddaf misc: finalized design of totp registration 2024-11-18 17:14:21 +08:00
=
d4cfd0b6ed fix: migration in loop due to cornercase 2024-11-16 00:37:57 +05:30
Sheen Capadngan
ba1fd8a3f7 feat: totp dynamic secret 2024-11-16 02:48:28 +08:00
Duy Pham Le
e8f09d2c7b fix(ui): add sort to github integration dropdown box 2024-11-15 10:26:38 -06:00
Sheen Capadngan
ada63b9e7d misc: finalize org migration script 2024-11-10 11:49:25 +08:00
Sheen Capadngan
3f6a0c77f1 misc: finalized user messages 2024-11-09 01:51:11 +08:00
Sheen Capadngan
9e4b66e215 misc: made users automatically verified 2024-11-09 00:38:45 +08:00
Sheen Capadngan
8a14914bc3 misc: added more error handling 2024-11-08 21:43:25 +08:00
225 changed files with 6531 additions and 1849 deletions

View File

@@ -74,8 +74,8 @@ CAPTCHA_SECRET=
 NEXT_PUBLIC_CAPTCHA_SITE_KEY=
-OTEL_TELEMETRY_COLLECTION_ENABLED=
-OTEL_EXPORT_TYPE=
+OTEL_TELEMETRY_COLLECTION_ENABLED=false
+OTEL_EXPORT_TYPE=prometheus
 OTEL_EXPORT_OTLP_ENDPOINT=
 OTEL_OTLP_PUSH_INTERVAL=
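
The new defaults above pair with the env-schema work in this compare ("misc: finalized env schema handling of bool"): env vars arrive as strings, so boolean-ish flags need explicit coercion. A minimal sketch of that pattern, assuming zod (used throughout the backend); the repo's actual schema may differ:

import { z } from "zod";

// Env vars are strings; coerce "true"/"false" explicitly rather than relying on truthiness.
const zodStrBool = z
  .string()
  .optional()
  .transform((val) => val === "true");

const env = z.object({ OTEL_TELEMETRY_COLLECTION_ENABLED: zodStrBool }).parse(process.env);

console.log(env.OTEL_TELEMETRY_COLLECTION_ENABLED); // false unless the var is exactly "true"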

View File

@@ -1,6 +1,12 @@
 #!/usr/bin/env sh
 . "$(dirname -- "$0")/_/husky.sh"
 
+# Check if infisical is installed
+if ! command -v infisical >/dev/null 2>&1; then
+  echo "\nError: Infisical CLI is not installed. Please install the Infisical CLI before comitting.\n You can refer to the documentation at https://infisical.com/docs/cli/overview\n\n"
+  exit 1
+fi
+
 npx lint-staged
 infisical scan git-changes --staged -v

View File

@@ -69,13 +69,21 @@ RUN groupadd -r -g 1001 nodejs && useradd -r -u 1001 -g nodejs non-root-user
 WORKDIR /app
 
-# Required for pkcs11js
+# Required for pkcs11js and ODBC
 RUN apt-get update && apt-get install -y \
     python3 \
     make \
     g++ \
+    unixodbc \
+    unixodbc-dev \
+    freetds-dev \
+    freetds-bin \
+    tdsodbc \
     && rm -rf /var/lib/apt/lists/*
 
+# Configure ODBC
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\nSetup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so\nFileUsage = 1\n" > /etc/odbcinst.ini
+
 COPY backend/package*.json ./
 RUN npm ci --only-production

@@ -91,13 +99,21 @@ ENV ChrystokiConfigurationPath=/usr/safenet/lunaclient/
 WORKDIR /app
 
-# Required for pkcs11js
+# Required for pkcs11js and ODBC
 RUN apt-get update && apt-get install -y \
     python3 \
     make \
     g++ \
+    unixodbc \
+    unixodbc-dev \
+    freetds-dev \
+    freetds-bin \
+    tdsodbc \
     && rm -rf /var/lib/apt/lists/*
 
+# Configure ODBC
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\nSetup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so\nFileUsage = 1\n" > /etc/odbcinst.ini
+
 COPY backend/package*.json ./
 RUN npm ci --only-production

@@ -108,13 +124,24 @@ RUN mkdir frontend-build
 # Production stage
 FROM base AS production
 
-# Install necessary packages
+# Install necessary packages including ODBC
 RUN apt-get update && apt-get install -y \
     ca-certificates \
     curl \
     git \
+    python3 \
+    make \
+    g++ \
+    unixodbc \
+    unixodbc-dev \
+    freetds-dev \
+    freetds-bin \
+    tdsodbc \
     && rm -rf /var/lib/apt/lists/*
 
+# Configure ODBC in production
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so\nSetup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so\nFileUsage = 1\n" > /etc/odbcinst.ini
+
 # Install Infisical CLI
 RUN curl -1sLf 'https://dl.cloudsmith.io/public/infisical/infisical-cli/setup.deb.sh' | bash \
     && apt-get update && apt-get install -y infisical=0.31.1 \
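
The FreeTDS/unixODBC packages and the odbcinst.ini entry above exist so the backend's new `odbc` dependency (added later in this compare) can reach SAP ASE for dynamic secrets. A hedged sketch of how such a connection might look; server, credentials, and database are illustrative:

import odbc from "odbc";

// "FreeTDS" must match the driver name registered in /etc/odbcinst.ini.
const connection = await odbc.connect(
  "Driver=FreeTDS;Server=sap-ase.internal;Port=5000;UID=app;PWD=secret;Database=master"
);

const rows = await connection.query("SELECT 1 AS ok");
console.log(rows);
await connection.close();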

View File

@@ -72,8 +72,16 @@ RUN addgroup --system --gid 1001 nodejs \
 WORKDIR /app
 
-# Required for pkcs11js
-RUN apk add --no-cache python3 make g++
+# Install all required dependencies for build
+RUN apk --update add \
+    python3 \
+    make \
+    g++ \
+    unixodbc \
+    freetds \
+    unixodbc-dev \
+    libc-dev \
+    freetds-dev
 
 COPY backend/package*.json ./
 RUN npm ci --only-production

@@ -88,8 +96,19 @@ FROM base AS backend-runner
 WORKDIR /app
 
-# Required for pkcs11js
-RUN apk add --no-cache python3 make g++
+# Install all required dependencies for runtime
+RUN apk --update add \
+    python3 \
+    make \
+    g++ \
+    unixodbc \
+    freetds \
+    unixodbc-dev \
+    libc-dev \
+    freetds-dev
+
+# Configure ODBC
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
 
 COPY backend/package*.json ./
 RUN npm ci --only-production

@@ -100,11 +119,32 @@ RUN mkdir frontend-build
 # Production stage
 FROM base AS production
 
 RUN apk add --upgrade --no-cache ca-certificates
 RUN apk add --no-cache bash curl && curl -1sLf \
   'https://dl.cloudsmith.io/public/infisical/infisical-cli/setup.alpine.sh' | bash \
   && apk add infisical=0.31.1 && apk add --no-cache git
 
+WORKDIR /
+
+# Install all required runtime dependencies
+RUN apk --update add \
+    python3 \
+    make \
+    g++ \
+    unixodbc \
+    freetds \
+    unixodbc-dev \
+    libc-dev \
+    freetds-dev \
+    bash \
+    curl \
+    git
+
+# Configure ODBC in production
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
+
+# Setup user permissions
 RUN addgroup --system --gid 1001 nodejs \
     && adduser --system --uid 1001 non-root-user

@@ -127,7 +167,6 @@ ARG CAPTCHA_SITE_KEY
 ENV NEXT_PUBLIC_CAPTCHA_SITE_KEY=$CAPTCHA_SITE_KEY \
     BAKED_NEXT_PUBLIC_CAPTCHA_SITE_KEY=$CAPTCHA_SITE_KEY
 
-WORKDIR /
 
 COPY --from=backend-runner /app /backend

@@ -149,4 +188,4 @@ EXPOSE 443
 
 USER non-root-user
 
 CMD ["./standalone-entrypoint.sh"]

View File

@@ -9,6 +9,15 @@ RUN apk --update add \
     make \
     g++
 
+# install dependencies for TDS driver (required for SAP ASE dynamic secrets)
+RUN apk add --no-cache \
+    unixodbc \
+    freetds \
+    unixodbc-dev \
+    libc-dev \
+    freetds-dev
+
 COPY package*.json ./
 RUN npm ci --only-production

@@ -28,6 +37,17 @@ RUN apk --update add \
     make \
     g++
 
+# install dependencies for TDS driver (required for SAP ASE dynamic secrets)
+RUN apk add --no-cache \
+    unixodbc \
+    freetds \
+    unixodbc-dev \
+    libc-dev \
+    freetds-dev
+
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
+
 RUN npm ci --only-production && npm cache clean --force
 
 COPY --from=build /app .

View File

@@ -7,7 +7,7 @@ ARG SOFTHSM2_VERSION=2.5.0
 ENV SOFTHSM2_VERSION=${SOFTHSM2_VERSION} \
     SOFTHSM2_SOURCES=/tmp/softhsm2
 
-# install build dependencies including python3
+# install build dependencies including python3 (required for pkcs11js and partially TDS driver)
 RUN apk --update add \
     alpine-sdk \
     autoconf \

@@ -19,7 +19,19 @@ RUN apk --update add \
     make \
     g++
 
+# install dependencies for TDS driver (required for SAP ASE dynamic secrets)
+RUN apk add --no-cache \
+    unixodbc \
+    freetds \
+    unixodbc-dev \
+    libc-dev \
+    freetds-dev
+
+RUN printf "[FreeTDS]\nDescription = FreeTDS Driver\nDriver = /usr/lib/libtdsodbc.so\nSetup = /usr/lib/libtdsodbc.so\nFileUsage = 1\n" > /etc/odbcinst.ini
+
 # build and install SoftHSM2
 RUN git clone https://github.com/opendnssec/SoftHSMv2.git ${SOFTHSM2_SOURCES}
 WORKDIR ${SOFTHSM2_SOURCES}

View File

@@ -5,6 +5,9 @@ export const mockSmtpServer = (): TSmtpService => {
   return {
     sendMail: async (data) => {
       storage.push(data);
+    },
+    verify: async () => {
+      return true;
     }
   };
 };
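
A brief usage sketch of the added verify() (the shape of sendMail's payload is defined elsewhere in the repo):

const smtp = mockSmtpServer();
const ok = await smtp.verify(); // the mock always reports a healthy transport
// real implementations can use this to fail fast before queueing mail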

View File

@@ -24,6 +24,7 @@
"@fastify/multipart": "8.3.0", "@fastify/multipart": "8.3.0",
"@fastify/passport": "^2.4.0", "@fastify/passport": "^2.4.0",
"@fastify/rate-limit": "^9.0.0", "@fastify/rate-limit": "^9.0.0",
"@fastify/request-context": "^5.1.0",
"@fastify/session": "^10.7.0", "@fastify/session": "^10.7.0",
"@fastify/swagger": "^8.14.0", "@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^2.1.0", "@fastify/swagger-ui": "^2.1.0",
@@ -32,6 +33,7 @@
"@octokit/plugin-retry": "^5.0.5", "@octokit/plugin-retry": "^5.0.5",
"@octokit/rest": "^20.0.2", "@octokit/rest": "^20.0.2",
"@octokit/webhooks-types": "^7.3.1", "@octokit/webhooks-types": "^7.3.1",
"@octopusdeploy/api-client": "^3.4.1",
"@opentelemetry/api": "^1.9.0", "@opentelemetry/api": "^1.9.0",
"@opentelemetry/auto-instrumentations-node": "^0.53.0", "@opentelemetry/auto-instrumentations-node": "^0.53.0",
"@opentelemetry/exporter-metrics-otlp-proto": "^0.55.0", "@opentelemetry/exporter-metrics-otlp-proto": "^0.55.0",
@@ -80,6 +82,7 @@
"mysql2": "^3.9.8", "mysql2": "^3.9.8",
"nanoid": "^3.3.4", "nanoid": "^3.3.4",
"nodemailer": "^6.9.9", "nodemailer": "^6.9.9",
"odbc": "^2.4.9",
"openid-client": "^5.6.5", "openid-client": "^5.6.5",
"ora": "^7.0.1", "ora": "^7.0.1",
"oracledb": "^6.4.0", "oracledb": "^6.4.0",
@@ -5528,6 +5531,15 @@
"toad-cache": "^3.3.0" "toad-cache": "^3.3.0"
} }
}, },
"node_modules/@fastify/request-context": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/@fastify/request-context/-/request-context-5.1.0.tgz",
"integrity": "sha512-PM7wrLJOEylVDpxabOFLaYsdAiaa0lpDUcP2HMFJ1JzgiWuC6k4r3duf6Pm9YLnzlGmT+Yp4tkQjqsu7V/pSOA==",
"license": "MIT",
"dependencies": {
"fastify-plugin": "^4.0.0"
}
},
"node_modules/@fastify/send": { "node_modules/@fastify/send": {
"version": "2.1.0", "version": "2.1.0",
"resolved": "https://registry.npmjs.org/@fastify/send/-/send-2.1.0.tgz", "resolved": "https://registry.npmjs.org/@fastify/send/-/send-2.1.0.tgz",
@@ -6944,6 +6956,21 @@
"resolved": "https://registry.npmjs.org/@octokit/webhooks-types/-/webhooks-types-7.1.0.tgz", "resolved": "https://registry.npmjs.org/@octokit/webhooks-types/-/webhooks-types-7.1.0.tgz",
"integrity": "sha512-y92CpG4kFFtBBjni8LHoV12IegJ+KFxLgKRengrVjKmGE5XMeCuGvlfRe75lTRrgXaG6XIWJlFpIDTlkoJsU8w==" "integrity": "sha512-y92CpG4kFFtBBjni8LHoV12IegJ+KFxLgKRengrVjKmGE5XMeCuGvlfRe75lTRrgXaG6XIWJlFpIDTlkoJsU8w=="
}, },
"node_modules/@octopusdeploy/api-client": {
"version": "3.4.1",
"resolved": "https://registry.npmjs.org/@octopusdeploy/api-client/-/api-client-3.4.1.tgz",
"integrity": "sha512-j6FRgDNzc6AQoT3CAguYLWxoMR4W5TKCT1BCPpqjEN9mknmdMSKfYORs3djn/Yj/BhqtITTydDpBoREbzKY5+g==",
"license": "Apache-2.0",
"dependencies": {
"adm-zip": "^0.5.9",
"axios": "^1.2.1",
"form-data": "^4.0.0",
"glob": "^8.0.3",
"lodash": "^4.17.21",
"semver": "^7.3.8",
"urijs": "^1.19.11"
}
},
"node_modules/@opentelemetry/api": { "node_modules/@opentelemetry/api": {
"version": "1.9.0", "version": "1.9.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.9.0.tgz", "resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.9.0.tgz",
@@ -17862,6 +17889,27 @@
"jsonwebtoken": "^9.0.2" "jsonwebtoken": "^9.0.2"
} }
}, },
"node_modules/odbc": {
"version": "2.4.9",
"resolved": "https://registry.npmjs.org/odbc/-/odbc-2.4.9.tgz",
"integrity": "sha512-sHFWOKfyj4oFYds7YBlN+fq9ZjC2J6CsCN5CNMABpKLp+NZdb8bnanb57OaoDy1VFXEOTE91S+F900J/aIPu6w==",
"hasInstallScript": true,
"license": "MIT",
"dependencies": {
"@mapbox/node-pre-gyp": "^1.0.5",
"async": "^3.0.1",
"node-addon-api": "^3.0.2"
},
"engines": {
"node": ">=18.0.0"
}
},
"node_modules/odbc/node_modules/node-addon-api": {
"version": "3.2.1",
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-3.2.1.tgz",
"integrity": "sha512-mmcei9JghVNDYydghQmeDX8KoAm0FAiYyIcUt/N4nhyAipB17pllZQDOJD2fotxABnt4Mdz+dKTO7eftLg4d0A==",
"license": "MIT"
},
"node_modules/oidc-token-hash": { "node_modules/oidc-token-hash": {
"version": "5.0.3", "version": "5.0.3",
"resolved": "https://registry.npmjs.org/oidc-token-hash/-/oidc-token-hash-5.0.3.tgz", "resolved": "https://registry.npmjs.org/oidc-token-hash/-/oidc-token-hash-5.0.3.tgz",
@@ -22365,6 +22413,12 @@
"punycode": "^2.1.0" "punycode": "^2.1.0"
} }
}, },
"node_modules/urijs": {
"version": "1.19.11",
"resolved": "https://registry.npmjs.org/urijs/-/urijs-1.19.11.tgz",
"integrity": "sha512-HXgFDgDommxn5/bIv0cnQZsPhHDA90NPHD6+c/v21U5+Sx5hoP8+dP9IZXBU1gIfvdRfhG8cel9QNPeionfcCQ==",
"license": "MIT"
},
"node_modules/url": { "node_modules/url": {
"version": "0.10.3", "version": "0.10.3",
"resolved": "https://registry.npmjs.org/url/-/url-0.10.3.tgz", "resolved": "https://registry.npmjs.org/url/-/url-0.10.3.tgz",

View File

@@ -50,6 +50,7 @@
"auditlog-migration:down": "knex --knexfile ./src/db/auditlog-knexfile.ts --client pg migrate:down", "auditlog-migration:down": "knex --knexfile ./src/db/auditlog-knexfile.ts --client pg migrate:down",
"auditlog-migration:list": "knex --knexfile ./src/db/auditlog-knexfile.ts --client pg migrate:list", "auditlog-migration:list": "knex --knexfile ./src/db/auditlog-knexfile.ts --client pg migrate:list",
"auditlog-migration:status": "knex --knexfile ./src/db/auditlog-knexfile.ts --client pg migrate:status", "auditlog-migration:status": "knex --knexfile ./src/db/auditlog-knexfile.ts --client pg migrate:status",
"auditlog-migration:unlock": "knex --knexfile ./src/db/auditlog-knexfile.ts migrate:unlock",
"auditlog-migration:rollback": "knex --knexfile ./src/db/auditlog-knexfile.ts migrate:rollback", "auditlog-migration:rollback": "knex --knexfile ./src/db/auditlog-knexfile.ts migrate:rollback",
"migration:new": "tsx ./scripts/create-migration.ts", "migration:new": "tsx ./scripts/create-migration.ts",
"migration:up": "npm run auditlog-migration:up && knex --knexfile ./src/db/knexfile.ts --client pg migrate:up", "migration:up": "npm run auditlog-migration:up && knex --knexfile ./src/db/knexfile.ts --client pg migrate:up",
@@ -58,6 +59,7 @@
"migration:latest": "npm run auditlog-migration:latest && knex --knexfile ./src/db/knexfile.ts --client pg migrate:latest", "migration:latest": "npm run auditlog-migration:latest && knex --knexfile ./src/db/knexfile.ts --client pg migrate:latest",
"migration:status": "npm run auditlog-migration:status && knex --knexfile ./src/db/knexfile.ts --client pg migrate:status", "migration:status": "npm run auditlog-migration:status && knex --knexfile ./src/db/knexfile.ts --client pg migrate:status",
"migration:rollback": "npm run auditlog-migration:rollback && knex --knexfile ./src/db/knexfile.ts migrate:rollback", "migration:rollback": "npm run auditlog-migration:rollback && knex --knexfile ./src/db/knexfile.ts migrate:rollback",
"migration:unlock": "npm run auditlog-migration:unlock && knex --knexfile ./src/db/knexfile.ts migrate:unlock",
"migrate:org": "tsx ./scripts/migrate-organization.ts", "migrate:org": "tsx ./scripts/migrate-organization.ts",
"seed:new": "tsx ./scripts/create-seed-file.ts", "seed:new": "tsx ./scripts/create-seed-file.ts",
"seed": "knex --knexfile ./src/db/knexfile.ts --client pg seed:run", "seed": "knex --knexfile ./src/db/knexfile.ts --client pg seed:run",
@@ -130,6 +132,7 @@
"@fastify/multipart": "8.3.0", "@fastify/multipart": "8.3.0",
"@fastify/passport": "^2.4.0", "@fastify/passport": "^2.4.0",
"@fastify/rate-limit": "^9.0.0", "@fastify/rate-limit": "^9.0.0",
"@fastify/request-context": "^5.1.0",
"@fastify/session": "^10.7.0", "@fastify/session": "^10.7.0",
"@fastify/swagger": "^8.14.0", "@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^2.1.0", "@fastify/swagger-ui": "^2.1.0",
@@ -138,6 +141,7 @@
"@octokit/plugin-retry": "^5.0.5", "@octokit/plugin-retry": "^5.0.5",
"@octokit/rest": "^20.0.2", "@octokit/rest": "^20.0.2",
"@octokit/webhooks-types": "^7.3.1", "@octokit/webhooks-types": "^7.3.1",
"@octopusdeploy/api-client": "^3.4.1",
"@opentelemetry/api": "^1.9.0", "@opentelemetry/api": "^1.9.0",
"@opentelemetry/auto-instrumentations-node": "^0.53.0", "@opentelemetry/auto-instrumentations-node": "^0.53.0",
"@opentelemetry/exporter-metrics-otlp-proto": "^0.55.0", "@opentelemetry/exporter-metrics-otlp-proto": "^0.55.0",
@@ -186,6 +190,7 @@
"mysql2": "^3.9.8", "mysql2": "^3.9.8",
"nanoid": "^3.3.4", "nanoid": "^3.3.4",
"nodemailer": "^6.9.9", "nodemailer": "^6.9.9",
"odbc": "^2.4.9",
"openid-client": "^5.6.5", "openid-client": "^5.6.5",
"ora": "^7.0.1", "ora": "^7.0.1",
"oracledb": "^6.4.0", "oracledb": "^6.4.0",

View File

@@ -8,61 +8,80 @@ const prompt = promptSync({
   sigint: true
 });
 
+const sanitizeInputParam = (value: string) => {
+  // Escape double quotes and wrap the entire value in double quotes
+  if (value) {
+    return `"${value.replace(/"/g, '\\"')}"`;
+  }
+  return '""';
+};
+
 const exportDb = () => {
-  const exportHost = prompt("Enter your Postgres Host to migrate from: ");
-  const exportPort = prompt("Enter your Postgres Port to migrate from [Default = 5432]: ") ?? "5432";
-  const exportUser = prompt("Enter your Postgres User to migrate from: [Default = infisical]: ") ?? "infisical";
-  const exportPassword = prompt("Enter your Postgres Password to migrate from: ");
-  const exportDatabase = prompt("Enter your Postgres Database to migrate from [Default = infisical]: ") ?? "infisical";
+  const exportHost = sanitizeInputParam(prompt("Enter your Postgres Host to migrate from: "));
+  const exportPort = sanitizeInputParam(
+    prompt("Enter your Postgres Port to migrate from [Default = 5432]: ") ?? "5432"
+  );
+  const exportUser = sanitizeInputParam(
+    prompt("Enter your Postgres User to migrate from: [Default = infisical]: ") ?? "infisical"
+  );
+  const exportPassword = sanitizeInputParam(prompt("Enter your Postgres Password to migrate from: "));
+  const exportDatabase = sanitizeInputParam(
+    prompt("Enter your Postgres Database to migrate from [Default = infisical]: ") ?? "infisical"
+  );
 
   // we do not include the audit_log and secret_sharing entries
   execSync(
-    `PGDATABASE="${exportDatabase}" PGPASSWORD="${exportPassword}" PGHOST="${exportHost}" PGPORT=${exportPort} PGUSER=${exportUser} pg_dump infisical --exclude-table-data="secret_sharing" --exclude-table-data="audit_log*" > ${path.join(
+    `PGDATABASE=${exportDatabase} PGPASSWORD=${exportPassword} PGHOST=${exportHost} PGPORT=${exportPort} PGUSER=${exportUser} pg_dump -Fc infisical --exclude-table-data="secret_sharing" --exclude-table-data="audit_log*" > ${path.join(
       __dirname,
-      "../src/db/dump.sql"
+      "../src/db/backup.dump"
     )}`,
     { stdio: "inherit" }
   );
 };
 
 const importDbForOrg = () => {
-  const importHost = prompt("Enter your Postgres Host to migrate to: ");
-  const importPort = prompt("Enter your Postgres Port to migrate to [Default = 5432]: ") ?? "5432";
-  const importUser = prompt("Enter your Postgres User to migrate to: [Default = infisical]: ") ?? "infisical";
-  const importPassword = prompt("Enter your Postgres Password to migrate to: ");
-  const importDatabase = prompt("Enter your Postgres Database to migrate to [Default = infisical]: ") ?? "infisical";
-  const orgId = prompt("Enter the organization ID to migrate: ");
+  const importHost = sanitizeInputParam(prompt("Enter your Postgres Host to migrate to: "));
+  const importPort = sanitizeInputParam(prompt("Enter your Postgres Port to migrate to [Default = 5432]: ") ?? "5432");
+  const importUser = sanitizeInputParam(
+    prompt("Enter your Postgres User to migrate to: [Default = infisical]: ") ?? "infisical"
+  );
+  const importPassword = sanitizeInputParam(prompt("Enter your Postgres Password to migrate to: "));
+  const importDatabase = sanitizeInputParam(
+    prompt("Enter your Postgres Database to migrate to [Default = infisical]: ") ?? "infisical"
+  );
+  const orgId = sanitizeInputParam(prompt("Enter the organization ID to migrate: "));
 
-  if (!existsSync(path.join(__dirname, "../src/db/dump.sql"))) {
+  if (!existsSync(path.join(__dirname, "../src/db/backup.dump"))) {
     console.log("File not found, please export the database first.");
     return;
   }
 
   execSync(
-    `PGDATABASE="${importDatabase}" PGPASSWORD="${importPassword}" PGHOST="${importHost}" PGPORT=${importPort} PGUSER=${importUser} psql -f ${path.join(
+    `PGDATABASE=${importDatabase} PGPASSWORD=${importPassword} PGHOST=${importHost} PGPORT=${importPort} PGUSER=${importUser} pg_restore -d ${importDatabase} --verbose ${path.join(
       __dirname,
-      "../src/db/dump.sql"
-    )}`
+      "../src/db/backup.dump"
+    )}`,
+    { maxBuffer: 1024 * 1024 * 4096 }
   );
 
   execSync(
-    `PGDATABASE="${importDatabase}" PGPASSWORD="${importPassword}" PGHOST="${importHost}" PGPORT=${importPort} PGUSER=${importUser} psql -c "DELETE FROM public.organizations WHERE id != '${orgId}'"`
+    `PGDATABASE=${importDatabase} PGPASSWORD=${importPassword} PGHOST=${importHost} PGPORT=${importPort} PGUSER=${importUser} psql -c "DELETE FROM public.organizations WHERE id != '${orgId}'"`
   );
 
   // delete global/instance-level resources not relevant to the organization to migrate
   // users
   execSync(
-    `PGDATABASE="${importDatabase}" PGPASSWORD="${importPassword}" PGHOST="${importHost}" PGPORT=${importPort} PGUSER=${importUser} psql -c 'DELETE FROM users WHERE users.id NOT IN (SELECT org_memberships."userId" FROM org_memberships)'`
+    `PGDATABASE=${importDatabase} PGPASSWORD=${importPassword} PGHOST=${importHost} PGPORT=${importPort} PGUSER=${importUser} psql -c 'DELETE FROM users WHERE users.id NOT IN (SELECT org_memberships."userId" FROM org_memberships)'`
   );
 
   // identities
   execSync(
-    `PGDATABASE="${importDatabase}" PGPASSWORD="${importPassword}" PGHOST="${importHost}" PGPORT=${importPort} PGUSER=${importUser} psql -c 'DELETE FROM identities WHERE id NOT IN (SELECT "identityId" FROM identity_org_memberships)'`
+    `PGDATABASE=${importDatabase} PGPASSWORD=${importPassword} PGHOST=${importHost} PGPORT=${importPort} PGUSER=${importUser} psql -c 'DELETE FROM identities WHERE id NOT IN (SELECT "identityId" FROM identity_org_memberships)'`
   );
 
   // reset slack configuration in superAdmin
   execSync(
-    `PGDATABASE="${importDatabase}" PGPASSWORD="${importPassword}" PGHOST="${importHost}" PGPORT=${importPort} PGUSER=${importUser} psql -c 'UPDATE super_admin SET "encryptedSlackClientId" = null, "encryptedSlackClientSecret" = null'`
+    `PGDATABASE=${importDatabase} PGPASSWORD=${importPassword} PGHOST=${importHost} PGPORT=${importPort} PGUSER=${importUser} psql -c 'UPDATE super_admin SET "encryptedSlackClientId" = null, "encryptedSlackClientSecret" = null'`
   );
 
   console.log("Organization migrated successfully.");

View File

@@ -0,0 +1,7 @@
import "@fastify/request-context";
declare module "@fastify/request-context" {
interface RequestContextData {
requestId: string;
}
}
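
This declaration types the request-scoped store; a hedged sketch of the typical wiring (hook names are standard Fastify — the PR's exact setup isn't shown here):

import Fastify from "fastify";
import { fastifyRequestContext, requestContext } from "@fastify/request-context";

const fastify = Fastify();
await fastify.register(fastifyRequestContext);

fastify.addHook("onRequest", async (request) => {
  requestContext.set("requestId", request.id as string); // seed the store per request
});

fastify.get("/", async () => ({
  requestId: requestContext.get("requestId") // typed as string via the declaration above
}));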

View File

@@ -1,6 +1,6 @@
 import { FastifyInstance, RawReplyDefaultExpression, RawRequestDefaultExpression, RawServerDefault } from "fastify";
-import { Logger } from "pino";
 
+import { CustomLogger } from "@app/lib/logger/logger";
 import { ZodTypeProvider } from "@app/server/plugins/fastify-zod";
 
 declare global {
@@ -8,7 +8,7 @@ declare global {
     RawServerDefault,
     RawRequestDefaultExpression<RawServerDefault>,
     RawReplyDefaultExpression<RawServerDefault>,
-    Readonly<Logger>,
+    Readonly<CustomLogger>,
     ZodTypeProvider
   >;

View File

@@ -2,7 +2,7 @@ import { Knex } from "knex";
 import { TableName } from "../schemas";
 
-const BATCH_SIZE = 30_000;
+const BATCH_SIZE = 10_000;
 
 export async function up(knex: Knex): Promise<void> {
   const hasAuthMethodColumnAccessToken = await knex.schema.hasColumn(TableName.IdentityAccessToken, "authMethod");
@@ -12,7 +12,18 @@ export async function up(knex: Knex): Promise<void> {
       t.string("authMethod").nullable();
     });
 
-    let nullableAccessTokens = await knex(TableName.IdentityAccessToken).whereNull("authMethod").limit(BATCH_SIZE);
+    // first we remove identities without auth method that is unused
+    // ! We delete all access tokens where the identity has no auth method set!
+    // ! Which means un-configured identities that for some reason have access tokens, will have their access tokens deleted.
+    await knex(TableName.IdentityAccessToken)
+      .leftJoin(TableName.Identity, `${TableName.Identity}.id`, `${TableName.IdentityAccessToken}.identityId`)
+      .whereNull(`${TableName.Identity}.authMethod`)
+      .delete();
+
+    let nullableAccessTokens = await knex(TableName.IdentityAccessToken)
+      .whereNull("authMethod")
+      .limit(BATCH_SIZE)
+      .select("id");
 
     let totalUpdated = 0;
     do {
@@ -33,24 +44,15 @@ export async function up(knex: Knex): Promise<void> {
       });
 
       // eslint-disable-next-line no-await-in-loop
-      nullableAccessTokens = await knex(TableName.IdentityAccessToken).whereNull("authMethod").limit(BATCH_SIZE);
+      nullableAccessTokens = await knex(TableName.IdentityAccessToken)
+        .whereNull("authMethod")
+        .limit(BATCH_SIZE)
+        .select("id");
 
       totalUpdated += batchIds.length;
       console.log(`Updated ${batchIds.length} access tokens in batch <> Total updated: ${totalUpdated}`);
     } while (nullableAccessTokens.length > 0);
 
-    // ! We delete all access tokens where the identity has no auth method set!
-    // ! Which means un-configured identities that for some reason have access tokens, will have their access tokens deleted.
-    await knex(TableName.IdentityAccessToken)
-      .whereNotExists((queryBuilder) => {
-        void queryBuilder
-          .select("id")
-          .from(TableName.Identity)
-          .whereRaw(`${TableName.IdentityAccessToken}."identityId" = ${TableName.Identity}.id`)
-          .whereNotNull("authMethod");
-      })
-      .delete();
-
     // Finally we set the authMethod to notNullable after populating the column.
     // This will fail if the data is not populated correctly, so it's safe.
     await knex.schema.alterTable(TableName.IdentityAccessToken, (t) => {

View File

@@ -0,0 +1,21 @@
+import { Knex } from "knex";
+
+import { TableName } from "../schemas";
+
+export async function up(knex: Knex): Promise<void> {
+  if (await knex.schema.hasColumn(TableName.OidcConfig, "orgId")) {
+    await knex.schema.alterTable(TableName.OidcConfig, (t) => {
+      t.dropForeign("orgId");
+      t.foreign("orgId").references("id").inTable(TableName.Organization).onDelete("CASCADE");
+    });
+  }
+}
+
+export async function down(knex: Knex): Promise<void> {
+  if (await knex.schema.hasColumn(TableName.OidcConfig, "orgId")) {
+    await knex.schema.alterTable(TableName.OidcConfig, (t) => {
+      t.dropForeign("orgId");
+      t.foreign("orgId").references("id").inTable(TableName.Organization);
+    });
+  }
+}

View File

@@ -0,0 +1,23 @@
+import { Knex } from "knex";
+
+import { TableName } from "../schemas";
+
+export async function up(knex: Knex): Promise<void> {
+  const hasProjectDescription = await knex.schema.hasColumn(TableName.Project, "description");
+
+  if (!hasProjectDescription) {
+    await knex.schema.alterTable(TableName.Project, (t) => {
+      t.string("description");
+    });
+  }
+}
+
+export async function down(knex: Knex): Promise<void> {
+  const hasProjectDescription = await knex.schema.hasColumn(TableName.Project, "description");
+
+  if (hasProjectDescription) {
+    await knex.schema.alterTable(TableName.Project, (t) => {
+      t.dropColumn("description");
+    });
+  }
+}

View File

@@ -0,0 +1,20 @@
+import { Knex } from "knex";
+
+import { TableName } from "../schemas";
+
+export async function up(knex: Knex): Promise<void> {
+  if (await knex.schema.hasColumn(TableName.IdentityMetadata, "value")) {
+    await knex(TableName.IdentityMetadata).whereNull("value").delete();
+    await knex.schema.alterTable(TableName.IdentityMetadata, (t) => {
+      t.string("value", 1020).notNullable().alter();
+    });
+  }
+}
+
+export async function down(knex: Knex): Promise<void> {
+  if (await knex.schema.hasColumn(TableName.IdentityMetadata, "value")) {
+    await knex.schema.alterTable(TableName.IdentityMetadata, (t) => {
+      t.string("value", 1020).alter();
+    });
+  }
+}

View File

@@ -12,7 +12,7 @@ import { TImmutableDBKeys } from "./models";
 export const KmsRootConfigSchema = z.object({
   id: z.string().uuid(),
   encryptedRootKey: zodBuffer,
-  encryptionStrategy: z.string(),
+  encryptionStrategy: z.string().default("SOFTWARE").nullable().optional(),
   createdAt: z.date(),
   updatedAt: z.date()
 });

View File

@@ -23,7 +23,8 @@ export const ProjectsSchema = z.object({
   kmsCertificateKeyId: z.string().uuid().nullable().optional(),
   auditLogsRetentionDays: z.number().nullable().optional(),
   kmsSecretManagerKeyId: z.string().uuid().nullable().optional(),
-  kmsSecretManagerEncryptedDataKey: zodBuffer.nullable().optional()
+  kmsSecretManagerEncryptedDataKey: zodBuffer.nullable().optional(),
+  description: z.string().nullable().optional()
 });
 
 export type TProjects = z.infer<typeof ProjectsSchema>;

View File

@@ -80,7 +80,7 @@ const ElastiCacheUserManager = (credentials: TBasicAWSCredentials, region: strin
   }
 };
 
-const addUserToInfisicalGroup = async (userId: string) => {
+const $addUserToInfisicalGroup = async (userId: string) => {
   // figure out if the default user is already in the group, if it is, then we shouldn't add it again
 
   const addUserToGroupCommand = new ModifyUserGroupCommand({
@@ -96,7 +96,7 @@ const ElastiCacheUserManager = (credentials: TBasicAWSCredentials, region: strin
     await ensureInfisicalGroupExists(clusterName);
 
     await elastiCache.send(new CreateUserCommand(creationInput)); // First create the user
-    await addUserToInfisicalGroup(creationInput.UserId); // Then add the user to the group. We know the group is already a part of the cluster because of ensureInfisicalGroupExists()
+    await $addUserToInfisicalGroup(creationInput.UserId); // Then add the user to the group. We know the group is already a part of the cluster because of ensureInfisicalGroupExists()
 
     return {
       userId: creationInput.UserId,
@@ -212,7 +212,7 @@ export const AwsElastiCacheDatabaseProvider = (): TDynamicProviderFns => {
   };
 
   const renew = async (inputs: unknown, entityId: string) => {
-    // Do nothing
+    // No renewal necessary
     return { entityId };
   };
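
The $ prefix is the naming convention rolled out across providers in this compare ("added $ pattern to existing dynamic providers"): $-prefixed functions are internal helpers that assume already-validated inputs, while unprefixed functions form the provider's public surface. A self-contained sketch with a hypothetical provider:

import { z } from "zod";

const ExampleInputs = z.object({ host: z.string(), port: z.number() });

const ExampleProvider = () => {
  const validateProviderInputs = async (inputs: unknown) => ExampleInputs.parseAsync(inputs);

  // internal ($): callers must pass inputs that validateProviderInputs has already checked
  const $getClient = async (inputs: z.infer<typeof ExampleInputs>) => ({
    ping: async () => `${inputs.host}:${inputs.port}`
  });

  const validateConnection = async (inputs: unknown) => {
    const providerInputs = await validateProviderInputs(inputs);
    const client = await $getClient(providerInputs);
    return Boolean(await client.ping());
  };

  return { validateProviderInputs, validateConnection };
};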

View File

@@ -33,7 +33,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };
 
-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretAwsIamSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretAwsIamSchema>) => {
     const client = new IAMClient({
       region: providerInputs.region,
       credentials: {
@@ -47,7 +47,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
 
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
 
     const isConnected = await client.send(new GetUserCommand({})).then(() => true);
     return isConnected;
@@ -55,7 +55,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
 
   const create = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
 
     const username = generateUsername();
     const { policyArns, userGroups, policyDocument, awsPath, permissionBoundaryPolicyArn } = providerInputs;
@@ -118,7 +118,7 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
 
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
 
     const username = entityId;
 
@@ -179,9 +179,8 @@ export const AwsIamProvider = (): TDynamicProviderFns => {
   };
 
   const renew = async (_inputs: unknown, entityId: string) => {
-    // do nothing
-    const username = entityId;
-    return { entityId: username };
+    // No renewal necessary
+    return { entityId };
   };
 
   return {

View File

@@ -23,7 +23,7 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
     return providerInputs;
   };
 
-  const getToken = async (
+  const $getToken = async (
     tenantId: string,
     applicationId: string,
     clientSecret: string
@@ -51,18 +51,13 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
 
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const data = await getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
+    const data = await $getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
     return data.success;
   };
 
-  const renew = async (inputs: unknown, entityId: string) => {
-    // Do nothing
-    return { entityId };
-  };
-
   const create = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const data = await getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
+    const data = await $getToken(providerInputs.tenantId, providerInputs.applicationId, providerInputs.clientSecret);
     if (!data.success) {
       throw new BadRequestError({ message: "Failed to authorize to Microsoft Entra ID" });
     }
@@ -98,7 +93,7 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
   };
 
   const fetchAzureEntraIdUsers = async (tenantId: string, applicationId: string, clientSecret: string) => {
-    const data = await getToken(tenantId, applicationId, clientSecret);
+    const data = await $getToken(tenantId, applicationId, clientSecret);
     if (!data.success) {
       throw new BadRequestError({ message: "Failed to authorize to Microsoft Entra ID" });
     }
@@ -127,6 +122,11 @@ export const AzureEntraIDProvider = (): TDynamicProviderFns & {
     return users;
   };
 
+  const renew = async (inputs: unknown, entityId: string) => {
+    // No renewal necessary
+    return { entityId };
+  };
+
   return {
     validateProviderInputs,
     validateConnection,

View File

@@ -27,7 +27,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };
 
-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretCassandraSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretCassandraSchema>) => {
     const sslOptions = providerInputs.ca ? { rejectUnauthorized: false, ca: providerInputs.ca } : undefined;
     const client = new cassandra.Client({
       sslOptions,
@@ -47,7 +47,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
 
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
 
     const isConnected = await client.execute("SELECT * FROM system_schema.keyspaces").then(() => true);
     await client.shutdown();
@@ -56,7 +56,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
 
   const create = async (inputs: unknown, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
 
     const username = generateUsername();
     const password = generatePassword();
@@ -82,7 +82,7 @@ export const CassandraProvider = (): TDynamicProviderFns => {
 
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
 
     const username = entityId;
     const { keyspace } = providerInputs;
@@ -99,20 +99,24 @@ export const CassandraProvider = (): TDynamicProviderFns => {
 
   const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    if (!providerInputs.renewStatement) return { entityId };
+
+    const client = await $getClient(providerInputs);
 
-    const username = entityId;
     const expiration = new Date(expireAt).toISOString();
     const { keyspace } = providerInputs;
 
-    const renewStatement = handlebars.compile(providerInputs.revocationStatement)({ username, keyspace, expiration });
+    const renewStatement = handlebars.compile(providerInputs.renewStatement)({
+      username: entityId,
+      keyspace,
+      expiration
+    });
     const queries = renewStatement.toString().split(";").filter(Boolean);
-    for (const query of queries) {
+    for await (const query of queries) {
+      // eslint-disable-next-line
       await client.execute(query);
     }
     await client.shutdown();
-    return { entityId: username };
+    return { entityId };
   };
 
   return {
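
The renew path above compiles the user-supplied renewStatement with handlebars before splitting it into individual CQL statements. A minimal sketch of that step (the statement text is illustrative, not the provider's default):

import handlebars from "handlebars";

const renewStatement = handlebars.compile(
  "UPDATE {{keyspace}}.leases SET expires_at = '{{expiration}}' WHERE username = '{{username}}';"
)({ username: "inf-abc123", keyspace: "app", expiration: new Date().toISOString() });

// mirrors the provider: one template may contain several ';'-separated statements
const queries = renewStatement.split(";").filter(Boolean);
console.log(queries);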

View File

@@ -24,7 +24,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };
 
-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretElasticSearchSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretElasticSearchSchema>) => {
     const connection = new ElasticSearchClient({
       node: {
         url: new URL(`${providerInputs.host}:${providerInputs.port}`),
@@ -55,7 +55,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
 
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);
 
     const infoResponse = await connection
       .info()
@@ -67,7 +67,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
 
   const create = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);
 
     const username = generateUsername();
     const password = generatePassword();
@@ -85,7 +85,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
 
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);
 
     await connection.security.deleteUser({
       username: entityId
@@ -96,7 +96,7 @@ export const ElasticSearchProvider = (): TDynamicProviderFns => {
   };
 
   const renew = async (inputs: unknown, entityId: string) => {
-    // Do nothing
+    // No renewal necessary
     return { entityId };
   };

View File

@@ -6,15 +6,17 @@ import { AzureEntraIDProvider } from "./azure-entra-id";
 import { CassandraProvider } from "./cassandra";
 import { ElasticSearchProvider } from "./elastic-search";
 import { LdapProvider } from "./ldap";
-import { DynamicSecretProviders } from "./models";
+import { DynamicSecretProviders, TDynamicProviderFns } from "./models";
 import { MongoAtlasProvider } from "./mongo-atlas";
 import { MongoDBProvider } from "./mongo-db";
 import { RabbitMqProvider } from "./rabbit-mq";
 import { RedisDatabaseProvider } from "./redis";
+import { SapAseProvider } from "./sap-ase";
 import { SapHanaProvider } from "./sap-hana";
 import { SqlDatabaseProvider } from "./sql-database";
+import { TotpProvider } from "./totp";

-export const buildDynamicSecretProviders = () => ({
+export const buildDynamicSecretProviders = (): Record<DynamicSecretProviders, TDynamicProviderFns> => ({
   [DynamicSecretProviders.SqlDatabase]: SqlDatabaseProvider(),
   [DynamicSecretProviders.Cassandra]: CassandraProvider(),
   [DynamicSecretProviders.AwsIam]: AwsIamProvider(),
@@ -27,5 +29,7 @@ export const buildDynamicSecretProviders = () => ({
   [DynamicSecretProviders.AzureEntraID]: AzureEntraIDProvider(),
   [DynamicSecretProviders.Ldap]: LdapProvider(),
   [DynamicSecretProviders.SapHana]: SapHanaProvider(),
-  [DynamicSecretProviders.Snowflake]: SnowflakeProvider()
+  [DynamicSecretProviders.Snowflake]: SnowflakeProvider(),
+  [DynamicSecretProviders.Totp]: TotpProvider(),
+  [DynamicSecretProviders.SapAse]: SapAseProvider()
 });
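Typing the factory's return value as `Record<DynamicSecretProviders, TDynamicProviderFns>` makes the registration exhaustive: adding an enum member without wiring up a provider becomes a compile error. A small self-contained sketch of the pattern (names are stand-ins, not the repo's):

```ts
enum Providers {
  SqlDatabase = "sql-database",
  Totp = "totp"
}

type ProviderFns = { create: (inputs: unknown) => Promise<{ entityId: string }> };

// Removing either entry below makes tsc reject the object literal,
// so a newly added enum value cannot be forgotten at registration time.
const buildProviders = (): Record<Providers, ProviderFns> => ({
  [Providers.SqlDatabase]: { create: async () => ({ entityId: "sql" }) },
  [Providers.Totp]: { create: async () => ({ entityId: "totp" }) }
});
```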

View File

@@ -52,7 +52,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof LdapSchema>): Promise<ldapjs.Client> => {
+  const $getClient = async (providerInputs: z.infer<typeof LdapSchema>): Promise<ldapjs.Client> => {
     return new Promise((resolve, reject) => {
       const client = ldapjs.createClient({
         url: providerInputs.url,
@@ -83,7 +83,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
     return client.connected;
   };

@@ -191,7 +191,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     if (providerInputs.credentialType === LdapCredentialType.Static) {
       const dnMatch = providerInputs.rotationLdif.match(/^dn:\s*(.+)/m);
@@ -235,7 +235,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     if (providerInputs.credentialType === LdapCredentialType.Static) {
       const dnMatch = providerInputs.rotationLdif.match(/^dn:\s*(.+)/m);
@@ -268,7 +268,7 @@ export const LdapProvider = (): TDynamicProviderFns => {
   };

   const renew = async (inputs: unknown, entityId: string) => {
-    // Do nothing
+    // No renewal necessary
     return { entityId };
   };

View File

@@ -4,7 +4,8 @@ export enum SqlProviders {
   Postgres = "postgres",
   MySQL = "mysql2",
   Oracle = "oracledb",
-  MsSQL = "mssql"
+  MsSQL = "mssql",
+  SapAse = "sap-ase"
 }

 export enum ElasticSearchAuthTypes {
@@ -17,6 +18,17 @@ export enum LdapCredentialType {
   Static = "static"
 }

+export enum TotpConfigType {
+  URL = "url",
+  MANUAL = "manual"
+}
+
+export enum TotpAlgorithm {
+  SHA1 = "sha1",
+  SHA256 = "sha256",
+  SHA512 = "sha512"
+}
+
 export const DynamicSecretRedisDBSchema = z.object({
   host: z.string().trim().toLowerCase(),
   port: z.number(),
@@ -107,6 +119,16 @@ export const DynamicSecretCassandraSchema = z.object({
   ca: z.string().optional()
 });

+export const DynamicSecretSapAseSchema = z.object({
+  host: z.string().trim().toLowerCase(),
+  port: z.number(),
+  database: z.string().trim(),
+  username: z.string().trim(),
+  password: z.string().trim(),
+  creationStatement: z.string().trim(),
+  revocationStatement: z.string().trim()
+});
+
 export const DynamicSecretAwsIamSchema = z.object({
   accessKey: z.string().trim().min(1),
   secretAccessKey: z.string().trim().min(1),
@@ -221,6 +243,34 @@ export const LdapSchema = z.union([
   })
 ]);

+export const DynamicSecretTotpSchema = z.discriminatedUnion("configType", [
+  z.object({
+    configType: z.literal(TotpConfigType.URL),
+    url: z
+      .string()
+      .url()
+      .trim()
+      .min(1)
+      .refine((val) => {
+        const urlObj = new URL(val);
+        const secret = urlObj.searchParams.get("secret");
+        return Boolean(secret);
+      }, "OTP URL must contain secret field")
+  }),
+  z.object({
+    configType: z.literal(TotpConfigType.MANUAL),
+    secret: z
+      .string()
+      .trim()
+      .min(1)
+      .transform((val) => val.replace(/\s+/g, "")),
+    period: z.number().optional(),
+    algorithm: z.nativeEnum(TotpAlgorithm).optional(),
+    digits: z.number().optional()
+  })
+]);
+
 export enum DynamicSecretProviders {
   SqlDatabase = "sql-database",
   Cassandra = "cassandra",
@@ -234,12 +284,15 @@ export enum DynamicSecretProviders {
   AzureEntraID = "azure-entra-id",
   Ldap = "ldap",
   SapHana = "sap-hana",
-  Snowflake = "snowflake"
+  Snowflake = "snowflake",
+  Totp = "totp",
+  SapAse = "sap-ase"
 }

 export const DynamicSecretProviderSchema = z.discriminatedUnion("type", [
   z.object({ type: z.literal(DynamicSecretProviders.SqlDatabase), inputs: DynamicSecretSqlDBSchema }),
   z.object({ type: z.literal(DynamicSecretProviders.Cassandra), inputs: DynamicSecretCassandraSchema }),
+  z.object({ type: z.literal(DynamicSecretProviders.SapAse), inputs: DynamicSecretSapAseSchema }),
   z.object({ type: z.literal(DynamicSecretProviders.AwsIam), inputs: DynamicSecretAwsIamSchema }),
   z.object({ type: z.literal(DynamicSecretProviders.Redis), inputs: DynamicSecretRedisDBSchema }),
   z.object({ type: z.literal(DynamicSecretProviders.SapHana), inputs: DynamicSecretSapHanaSchema }),
@@ -250,7 +303,8 @@ export const DynamicSecretProviderSchema = z.discriminatedUnion("type", [
   z.object({ type: z.literal(DynamicSecretProviders.RabbitMq), inputs: DynamicSecretRabbitMqSchema }),
   z.object({ type: z.literal(DynamicSecretProviders.AzureEntraID), inputs: AzureEntraIDSchema }),
   z.object({ type: z.literal(DynamicSecretProviders.Ldap), inputs: LdapSchema }),
-  z.object({ type: z.literal(DynamicSecretProviders.Snowflake), inputs: DynamicSecretSnowflakeSchema })
+  z.object({ type: z.literal(DynamicSecretProviders.Snowflake), inputs: DynamicSecretSnowflakeSchema }),
+  z.object({ type: z.literal(DynamicSecretProviders.Totp), inputs: DynamicSecretTotpSchema })
 ]);

 export type TDynamicProviderFns = {
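The TOTP schema accepts either an `otpauth://` URL (which must carry a `secret` query parameter) or a manual configuration whose secret is whitespace-stripped. A standalone zod sketch of how the two variants behave (the secret value is a made-up base32 string):

```ts
import { z } from "zod";

const TotpSchema = z.discriminatedUnion("configType", [
  z.object({
    configType: z.literal("url"),
    url: z
      .string()
      .url()
      .refine((val) => Boolean(new URL(val).searchParams.get("secret")), "OTP URL must contain secret field")
  }),
  z.object({
    configType: z.literal("manual"),
    secret: z.string().min(1).transform((v) => v.replace(/\s+/g, ""))
  })
]);

// URL variant: valid because the secret query parameter is present.
TotpSchema.parse({
  configType: "url",
  url: "otpauth://totp/Example:user@example.com?secret=JBSWY3DPEHPK3PXP&period=30"
});

// Manual variant: whitespace in the secret is stripped by the transform.
const parsed = TotpSchema.parse({ configType: "manual", secret: "JBSW Y3DP" });
console.log(parsed.secret); // "JBSWY3DP"
```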

View File

@@ -22,7 +22,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoAtlasSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoAtlasSchema>) => {
     const client = axios.create({
       baseURL: "https://cloud.mongodb.com/api/atlas",
       headers: {
@@ -40,7 +40,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     const isConnected = await client({
       method: "GET",
@@ -59,7 +59,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     const username = generateUsername();
     const password = generatePassword();
@@ -87,7 +87,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
     const username = entityId;

     const isExisting = await client({
@@ -114,7 +114,7 @@ export const MongoAtlasProvider = (): TDynamicProviderFns => {
   const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
     const username = entityId;

     const expiration = new Date(expireAt).toISOString();

View File

@@ -23,7 +23,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoDBSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretMongoDBSchema>) => {
     const isSrv = !providerInputs.port;
     const uri = isSrv
       ? `mongodb+srv://${providerInputs.host}`
@@ -42,7 +42,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     const isConnected = await client
       .db(providerInputs.database)
@@ -55,7 +55,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     const username = generateUsername();
     const password = generatePassword();
@@ -74,7 +74,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
     const username = entityId;
@@ -88,6 +88,7 @@ export const MongoDBProvider = (): TDynamicProviderFns => {
   };

   const renew = async (_inputs: unknown, entityId: string) => {
+    // No renewal necessary
     return { entityId };
   };

View File

@@ -84,7 +84,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretRabbitMqSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretRabbitMqSchema>) => {
     const axiosInstance = axios.create({
       baseURL: `${removeTrailingSlash(providerInputs.host)}:${providerInputs.port}/api`,
       auth: {
@@ -105,7 +105,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);

     const infoResponse = await connection.get("/whoami").then(() => true);
@@ -114,7 +114,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);

     const username = generateUsername();
     const password = generatePassword();
@@ -134,7 +134,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);

     await deleteRabbitMqUser({ axiosInstance: connection, usernameToDelete: entityId });
@@ -142,7 +142,7 @@ export const RabbitMqProvider = (): TDynamicProviderFns => {
   };

   const renew = async (inputs: unknown, entityId: string) => {
-    // Do nothing
+    // No renewal necessary
     return { entityId };
   };

View File

@@ -55,7 +55,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretRedisDBSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretRedisDBSchema>) => {
     let connection: Redis | null = null;
     try {
       connection = new Redis({
@@ -92,7 +92,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);

     const pingResponse = await connection
       .ping()
@@ -104,7 +104,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);

     const username = generateUsername();
     const password = generatePassword();
@@ -126,7 +126,7 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    const connection = await $getClient(providerInputs);
     const username = entityId;
@@ -141,7 +141,9 @@ export const RedisDatabaseProvider = (): TDynamicProviderFns => {
   const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const connection = await getClient(providerInputs);
+    if (!providerInputs.renewStatement) return { entityId };
+
+    const connection = await $getClient(providerInputs);
     const username = entityId;

     const expiration = new Date(expireAt).toISOString();

View File

@@ -0,0 +1,145 @@
+import handlebars from "handlebars";
+import { customAlphabet } from "nanoid";
+import odbc from "odbc";
+import { z } from "zod";
+
+import { BadRequestError } from "@app/lib/errors";
+import { alphaNumericNanoId } from "@app/lib/nanoid";
+
+import { verifyHostInputValidity } from "../dynamic-secret-fns";
+import { DynamicSecretSapAseSchema, TDynamicProviderFns } from "./models";
+
+const generatePassword = (size = 48) => {
+  const charset = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
+  return customAlphabet(charset, 48)(size);
+};
+
+const generateUsername = () => {
+  return alphaNumericNanoId(25);
+};
+
+enum SapCommands {
+  CreateLogin = "sp_addlogin",
+  DropLogin = "sp_droplogin"
+}
+
+export const SapAseProvider = (): TDynamicProviderFns => {
+  const validateProviderInputs = async (inputs: unknown) => {
+    const providerInputs = await DynamicSecretSapAseSchema.parseAsync(inputs);
+
+    verifyHostInputValidity(providerInputs.host);
+    return providerInputs;
+  };
+
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSapAseSchema>, useMaster?: boolean) => {
+    const connectionString =
+      `DRIVER={FreeTDS};` +
+      `SERVER=${providerInputs.host};` +
+      `PORT=${providerInputs.port};` +
+      `DATABASE=${useMaster ? "master" : providerInputs.database};` +
+      `UID=${providerInputs.username};` +
+      `PWD=${providerInputs.password};` +
+      `TDS_VERSION=5.0`;
+
+    const client = await odbc.connect(connectionString);
+
+    return client;
+  };
+
+  const validateConnection = async (inputs: unknown) => {
+    const providerInputs = await validateProviderInputs(inputs);
+    const masterClient = await $getClient(providerInputs, true);
+    const client = await $getClient(providerInputs);
+
+    const [resultFromMasterDatabase] = await masterClient.query<{ version: string }>("SELECT @@VERSION AS version");
+    const [resultFromSelectedDatabase] = await client.query<{ version: string }>("SELECT @@VERSION AS version");
+
+    if (!resultFromSelectedDatabase.version) {
+      throw new BadRequestError({
+        message: "Failed to validate SAP ASE connection, version query failed"
+      });
+    }
+
+    if (resultFromMasterDatabase.version !== resultFromSelectedDatabase.version) {
+      throw new BadRequestError({
+        message: "Failed to validate SAP ASE connection (master), version mismatch"
+      });
+    }
+
+    return true;
+  };
+
+  const create = async (inputs: unknown) => {
+    const providerInputs = await validateProviderInputs(inputs);
+
+    const username = `inf_${generateUsername()}`;
+    const password = `${generatePassword()}`;
+
+    const client = await $getClient(providerInputs);
+    const masterClient = await $getClient(providerInputs, true);
+
+    const creationStatement = handlebars.compile(providerInputs.creationStatement, { noEscape: true })({
+      username,
+      password
+    });
+
+    const queries = creationStatement.trim().replace(/\n/g, "").split(";").filter(Boolean);
+
+    for await (const query of queries) {
+      // If it's an adduser query, we need to first call sp_addlogin on the MASTER database.
+      // If not done, then the newly created user won't be able to authenticate.
+      await (query.startsWith(SapCommands.CreateLogin) ? masterClient : client).query(query);
+    }
+
+    await masterClient.close();
+    await client.close();
+
+    return { entityId: username, data: { DB_USERNAME: username, DB_PASSWORD: password } };
+  };
+
+  const revoke = async (inputs: unknown, username: string) => {
+    const providerInputs = await validateProviderInputs(inputs);
+
+    const revokeStatement = handlebars.compile(providerInputs.revocationStatement, { noEscape: true })({
+      username
+    });
+
+    const queries = revokeStatement.trim().replace(/\n/g, "").split(";").filter(Boolean);
+
+    const client = await $getClient(providerInputs);
+    const masterClient = await $getClient(providerInputs, true);
+
+    // Get all processes for this login and kill them. If there are active connections to the database when drop login happens, it will throw an error.
+    const result = await masterClient.query<{ spid?: string }>(`sp_who '${username}'`);
+
+    if (result && result.length > 0) {
+      for await (const row of result) {
+        if (row.spid) {
+          await masterClient.query(`KILL ${row.spid.trim()}`);
+        }
+      }
+    }
+
+    for await (const query of queries) {
+      await (query.startsWith(SapCommands.DropLogin) ? masterClient : client).query(query);
+    }
+
+    await masterClient.close();
+    await client.close();
+
+    return { entityId: username };
+  };
+
+  const renew = async (_: unknown, username: string) => {
+    // No need for renewal
+    return { entityId: username };
+  };
+
+  return {
+    validateProviderInputs,
+    validateConnection,
+    create,
+    revoke,
+    renew
+  };
+};
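For context on the `SapCommands` routing above: a creation template would typically pair `sp_addlogin` (which the provider routes to the master connection) with a database-level `sp_adduser`. The templates below are illustrative assumptions, not taken from the repo; real statements are supplied by the user when configuring the dynamic secret:

```ts
// Hypothetical SAP ASE statement templates for this provider.
const creationStatement = [
  "sp_addlogin {{username}}, {{password}}", // starts with "sp_addlogin" -> routed to master
  "sp_adduser {{username}}" // runs on the selected database
].join(";\n");

const revocationStatement = [
  "sp_dropuser {{username}}", // runs on the selected database
  "sp_droplogin {{username}}" // starts with "sp_droplogin" -> routed to master
].join(";\n");
```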

View File

@@ -32,7 +32,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretSapHanaSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSapHanaSchema>) => {
     const client = hdb.createClient({
       host: providerInputs.host,
       port: providerInputs.port,
@@ -64,9 +64,9 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

-    const testResult: boolean = await new Promise((resolve, reject) => {
+    const testResult = await new Promise<boolean>((resolve, reject) => {
       client.exec("SELECT 1 FROM DUMMY;", (err: any) => {
         if (err) {
           reject();
@@ -86,7 +86,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
     const password = generatePassword();
     const expiration = new Date(expireAt).toISOString();

-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);
     const creationStatement = handlebars.compile(providerInputs.creationStatement, { noEscape: true })({
       username,
       password,
@@ -114,7 +114,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, username: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     const revokeStatement = handlebars.compile(providerInputs.revocationStatement)({ username });
     const queries = revokeStatement.toString().split(";").filter(Boolean);
     for await (const query of queries) {
@@ -135,13 +135,15 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
     return { entityId: username };
   };

-  const renew = async (inputs: unknown, username: string, expireAt: number) => {
+  const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    if (!providerInputs.renewStatement) return { entityId };
+
+    const client = await $getClient(providerInputs);

     try {
       const expiration = new Date(expireAt).toISOString();
-      const renewStatement = handlebars.compile(providerInputs.renewStatement)({ username, expiration });
+      const renewStatement = handlebars.compile(providerInputs.renewStatement)({ username: entityId, expiration });
       const queries = renewStatement.toString().split(";").filter(Boolean);
       for await (const query of queries) {
         await new Promise((resolve, reject) => {
@@ -161,7 +163,7 @@ export const SapHanaProvider = (): TDynamicProviderFns => {
       client.disconnect();
     }

-    return { entityId: username };
+    return { entityId };
   };

   return {

View File

@@ -34,7 +34,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretSnowflakeSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSnowflakeSchema>) => {
     const client = snowflake.createConnection({
       account: `${providerInputs.orgId}-${providerInputs.accountId}`,
       username: providerInputs.username,
@@ -49,7 +49,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     let isValidConnection: boolean;
@@ -72,7 +72,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     const username = generateUsername();
     const password = generatePassword();
@@ -107,7 +107,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, username: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     try {
       const revokeStatement = handlebars.compile(providerInputs.revocationStatement)({ username });
@@ -131,17 +131,16 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
     return { entityId: username };
   };

-  const renew = async (inputs: unknown, username: string, expireAt: number) => {
+  const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
+    if (!providerInputs.renewStatement) return { entityId };

-    if (!providerInputs.renewStatement) return { entityId: username };
-
-    const client = await getClient(providerInputs);
+    const client = await $getClient(providerInputs);

     try {
       const expiration = getDaysToExpiry(new Date(expireAt));
       const renewStatement = handlebars.compile(providerInputs.renewStatement)({
-        username,
+        username: entityId,
         expiration
       });
@@ -161,7 +160,7 @@ export const SnowflakeProvider = (): TDynamicProviderFns => {
       client.destroy(noop);
     }

-    return { entityId: username };
+    return { entityId };
   };

   return {

View File

@@ -32,7 +32,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
     return providerInputs;
   };

-  const getClient = async (providerInputs: z.infer<typeof DynamicSecretSqlDBSchema>) => {
+  const $getClient = async (providerInputs: z.infer<typeof DynamicSecretSqlDBSchema>) => {
     const ssl = providerInputs.ca ? { rejectUnauthorized: false, ca: providerInputs.ca } : undefined;
     const db = knex({
       client: providerInputs.client,
@@ -52,7 +52,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
   const validateConnection = async (inputs: unknown) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const db = await getClient(providerInputs);
+    const db = await $getClient(providerInputs);
     // oracle needs from keyword
     const testStatement = providerInputs.client === SqlProviders.Oracle ? "SELECT 1 FROM DUAL" : "SELECT 1";
@@ -63,7 +63,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
   const create = async (inputs: unknown, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const db = await getClient(providerInputs);
+    const db = await $getClient(providerInputs);

     const username = generateUsername(providerInputs.client);
     const password = generatePassword(providerInputs.client);
@@ -90,7 +90,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
   const revoke = async (inputs: unknown, entityId: string) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const db = await getClient(providerInputs);
+    const db = await $getClient(providerInputs);

     const username = entityId;
     const { database } = providerInputs;
@@ -110,13 +110,19 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
   const renew = async (inputs: unknown, entityId: string, expireAt: number) => {
     const providerInputs = await validateProviderInputs(inputs);
-    const db = await getClient(providerInputs);
+    if (!providerInputs.renewStatement) return { entityId };

-    const username = entityId;
+    const db = await $getClient(providerInputs);

     const expiration = new Date(expireAt).toISOString();
     const { database } = providerInputs;

-    const renewStatement = handlebars.compile(providerInputs.renewStatement)({ username, expiration, database });
+    const renewStatement = handlebars.compile(providerInputs.renewStatement)({
+      username: entityId,
+      expiration,
+      database
+    });

     if (renewStatement) {
       const queries = renewStatement.toString().split(";").filter(Boolean);
       await db.transaction(async (tx) => {
@@ -128,7 +134,7 @@ export const SqlDatabaseProvider = (): TDynamicProviderFns => {
     }

     await db.destroy();
-    return { entityId: username };
+    return { entityId };
   };

   return {
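The same guard-and-rename pattern lands here: `renew` now bails out before opening a connection when no renew statement is configured. As a concrete illustration, a Postgres-flavored renew template would expand like this (the template text is an assumption, not repo content):

```ts
import handlebars from "handlebars";

// Hypothetical renew template; real templates are user-configured.
const template = `ALTER ROLE "{{username}}" VALID UNTIL '{{expiration}}';`;

const renewStatement = handlebars.compile(template)({
  username: "inf_XyZ123",
  expiration: new Date(Date.now() + 86_400_000).toISOString(),
  database: "app_db" // available to templates even when unused
});
// -> ALTER ROLE "inf_XyZ123" VALID UNTIL '2024-...';
```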

View File

@@ -0,0 +1,90 @@
+import { authenticator } from "otplib";
+import { HashAlgorithms } from "otplib/core";
+
+import { alphaNumericNanoId } from "@app/lib/nanoid";
+
+import { DynamicSecretTotpSchema, TDynamicProviderFns, TotpConfigType } from "./models";
+
+export const TotpProvider = (): TDynamicProviderFns => {
+  const validateProviderInputs = async (inputs: unknown) => {
+    const providerInputs = await DynamicSecretTotpSchema.parseAsync(inputs);
+
+    return providerInputs;
+  };
+
+  const validateConnection = async () => {
+    return true;
+  };
+
+  const create = async (inputs: unknown) => {
+    const providerInputs = await validateProviderInputs(inputs);
+
+    const entityId = alphaNumericNanoId(32);
+    const authenticatorInstance = authenticator.clone();
+
+    let secret: string;
+    let period: number | null | undefined;
+    let digits: number | null | undefined;
+    let algorithm: HashAlgorithms | null | undefined;
+
+    if (providerInputs.configType === TotpConfigType.URL) {
+      const urlObj = new URL(providerInputs.url);
+      secret = urlObj.searchParams.get("secret") as string;
+      const periodFromUrl = urlObj.searchParams.get("period");
+      const digitsFromUrl = urlObj.searchParams.get("digits");
+      const algorithmFromUrl = urlObj.searchParams.get("algorithm");
+
+      if (periodFromUrl) {
+        period = +periodFromUrl;
+      }
+
+      if (digitsFromUrl) {
+        digits = +digitsFromUrl;
+      }
+
+      if (algorithmFromUrl) {
+        algorithm = algorithmFromUrl.toLowerCase() as HashAlgorithms;
+      }
+    } else {
+      secret = providerInputs.secret;
+      period = providerInputs.period;
+      digits = providerInputs.digits;
+      algorithm = providerInputs.algorithm as unknown as HashAlgorithms;
+    }
+
+    if (digits) {
+      authenticatorInstance.options = { digits };
+    }
+
+    if (algorithm) {
+      authenticatorInstance.options = { algorithm };
+    }
+
+    if (period) {
+      authenticatorInstance.options = { step: period };
+    }
+
+    return {
+      entityId,
+      data: { TOTP: authenticatorInstance.generate(secret), TIME_REMAINING: authenticatorInstance.timeRemaining() }
+    };
+  };
+
+  const revoke = async (_inputs: unknown, entityId: string) => {
+    return { entityId };
+  };
+
+  // eslint-disable-next-line @typescript-eslint/no-unused-vars
+  const renew = async (_inputs: unknown, entityId: string) => {
+    // No renewal necessary
+    return { entityId };
+  };
+
+  return {
+    validateProviderInputs,
+    validateConnection,
+    create,
+    revoke,
+    renew
+  };
+};
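A quick sketch of what `create` produces for a manual config, using otplib the same way the provider does (the secret is a made-up base32 example):

```ts
import { authenticator } from "otplib";

const instance = authenticator.clone();
instance.options = { step: 30, digits: 6 };

const secret = "JBSWY3DPEHPK3PXP";
console.log({
  TOTP: instance.generate(secret), // current 6-digit code
  TIME_REMAINING: instance.timeRemaining() // seconds left in the current step
});
```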

View File

@@ -27,7 +27,7 @@ export const initializeHsmModule = () => {
     logger.info("PKCS#11 module initialized");
   } catch (err) {
-    logger.error("Failed to initialize PKCS#11 module:", err);
+    logger.error(err, "Failed to initialize PKCS#11 module");
     throw err;
   }
 };
@@ -39,7 +39,7 @@ export const initializeHsmModule = () => {
       isInitialized = false;
       logger.info("PKCS#11 module finalized");
     } catch (err) {
-      logger.error("Failed to finalize PKCS#11 module:", err);
+      logger.error(err, "Failed to finalize PKCS#11 module");
       throw err;
     }
   }
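The argument swap here (and in several files below) matches pino's calling convention: the first argument is a merge object, the second the message string. With the string-first form, the error lands in an interpolation slot and its stack is lost unless a placeholder consumes it:

```ts
import pino from "pino";

const logger = pino();
const err = new Error("boom");

// String-first: err is treated as an interpolation value; no structured error fields.
logger.error("Failed to initialize PKCS#11 module:", err);

// Object-first: pino serializes the Error, emitting message and stack as fields.
logger.error(err, "Failed to initialize PKCS#11 module");
```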

View File

@@ -36,8 +36,7 @@ export const testLDAPConfig = async (ldapConfig: TLDAPConfig): Promise<boolean>
   });

   ldapClient.on("error", (err) => {
-    logger.error("LDAP client error:", err);
-    logger.error(err);
+    logger.error(err, "LDAP client error");
     resolve(false);
   });

View File

@@ -161,8 +161,8 @@ export const licenseServiceFactory = ({
       }
     } catch (error) {
       logger.error(
-        `getPlan: encountered an error when fetching pan [orgId=${orgId}] [projectId=${projectId}] [error]`,
-        error
+        error,
+        `getPlan: encountered an error when fetching pan [orgId=${orgId}] [projectId=${projectId}] [error]`
       );
       await keyStore.setItemWithExpiry(
         FEATURE_CACHE_KEY(orgId),

View File

@@ -17,7 +17,7 @@ import {
   infisicalSymmetricDecrypt,
   infisicalSymmetricEncypt
 } from "@app/lib/crypto/encryption";
-import { BadRequestError, ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
+import { BadRequestError, ForbiddenRequestError, NotFoundError, OidcAuthError } from "@app/lib/errors";
 import { AuthMethod, AuthTokenType } from "@app/services/auth/auth-type";
 import { TAuthTokenServiceFactory } from "@app/services/auth-token/auth-token-service";
 import { TokenType } from "@app/services/auth-token/auth-token-types";
@@ -56,7 +56,7 @@ type TOidcConfigServiceFactoryDep = {
   orgBotDAL: Pick<TOrgBotDALFactory, "findOne" | "create" | "transaction">;
   licenseService: Pick<TLicenseServiceFactory, "getPlan" | "updateSubscriptionOrgMemberCount">;
   tokenService: Pick<TAuthTokenServiceFactory, "createTokenForUser">;
-  smtpService: Pick<TSmtpService, "sendMail">;
+  smtpService: Pick<TSmtpService, "sendMail" | "verify">;
   permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
   oidcConfigDAL: Pick<TOidcConfigDALFactory, "findOne" | "update" | "create">;
 };
@@ -223,6 +223,7 @@ export const oidcConfigServiceFactory = ({
     let newUser: TUsers | undefined;

     if (serverCfg.trustOidcEmails) {
+      // we prioritize getting the most complete user to create the new alias under
       newUser = await userDAL.findOne(
         {
           email,
@@ -230,6 +231,23 @@ export const oidcConfigServiceFactory = ({
         },
         tx
       );
+
+      if (!newUser) {
+        // this fetches user entries created via invites
+        newUser = await userDAL.findOne(
+          {
+            username: email
+          },
+          tx
+        );
+
+        if (newUser && !newUser.isEmailVerified) {
+          // we automatically mark it as email-verified because we've configured trust for OIDC emails
+          newUser = await userDAL.updateById(newUser.id, {
+            isEmailVerified: true
+          });
+        }
+      }
     }

     if (!newUser) {
@@ -332,14 +350,20 @@ export const oidcConfigServiceFactory = ({
       userId: user.id
     });

-    await smtpService.sendMail({
-      template: SmtpTemplates.EmailVerification,
-      subjectLine: "Infisical confirmation code",
-      recipients: [user.email],
-      substitutions: {
-        code: token
-      }
-    });
+    await smtpService
+      .sendMail({
+        template: SmtpTemplates.EmailVerification,
+        subjectLine: "Infisical confirmation code",
+        recipients: [user.email],
+        substitutions: {
+          code: token
+        }
+      })
+      .catch((err: Error) => {
+        throw new OidcAuthError({
+          message: `Error sending email confirmation code for user registration - contact the Infisical instance admin. ${err.message}`
+        });
+      });
   }

   return { isUserCompleted, providerAuthToken };
@@ -395,6 +419,18 @@ export const oidcConfigServiceFactory = ({
       message: `Organization bot for organization with ID '${org.id}' not found`,
       name: "OrgBotNotFound"
     });

+  const serverCfg = await getServerCfg();
+  if (isActive && !serverCfg.trustOidcEmails) {
+    const isSmtpConnected = await smtpService.verify();
+    if (!isSmtpConnected) {
+      throw new BadRequestError({
+        message:
+          "Cannot enable OIDC when there are issues with the instance's SMTP configuration. Bypass this by turning on trust for OIDC emails in the server admin console."
+      });
+    }
+  }
+
   const key = infisicalSymmetricDecrypt({
     ciphertext: orgBot.encryptedSymmetricKey,
     iv: orgBot.symmetricKeyIV,
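One assumption worth noting on `smtpService.verify()`: Infisical's SMTP service appears to wrap nodemailer, whose transporters expose a `verify()` that attempts a handshake with the configured server. A minimal sketch of that check in isolation (env var names are placeholders):

```ts
import nodemailer from "nodemailer";

const transporter = nodemailer.createTransport({
  host: process.env.SMTP_HOST,
  port: Number(process.env.SMTP_PORT ?? 587)
});

// verify() resolves true when the SMTP handshake succeeds; treating a
// rejection as "not connected" lets the caller fail with a clear message
// instead of an unhandled error.
const isSmtpConnected = await transporter.verify().catch(() => false);
```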

View File

@@ -127,14 +127,15 @@ export const permissionDALFactory = (db: TDbClient) => {
   const getProjectPermission = async (userId: string, projectId: string) => {
     try {
+      const subQueryUserGroups = db(TableName.UserGroupMembership).where("userId", userId).select("groupId");
       const docs = await db
         .replicaNode()(TableName.Users)
         .where(`${TableName.Users}.id`, userId)
-        .leftJoin(TableName.UserGroupMembership, `${TableName.UserGroupMembership}.userId`, `${TableName.Users}.id`)
         .leftJoin(TableName.GroupProjectMembership, (queryBuilder) => {
           void queryBuilder
             .on(`${TableName.GroupProjectMembership}.projectId`, db.raw("?", [projectId]))
-            .andOn(`${TableName.GroupProjectMembership}.groupId`, `${TableName.UserGroupMembership}.groupId`);
+            // @ts-expect-error akhilmhdh: this is valid knexjs query. Its just ts type argument is missing it
+            .andOnIn(`${TableName.GroupProjectMembership}.groupId`, subQueryUserGroups);
         })
         .leftJoin(
           TableName.GroupProjectMembershipRole,
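The rewrite above drops the second membership join and instead matches the project's group memberships against an `IN` subquery of the user's own group ids, inside the join clause. A standalone knex sketch of the shape (table and column names are illustrative):

```ts
import knex from "knex";

const db = knex({ client: "pg", connection: process.env.DATABASE_URL });

// Subquery producing the user's group ids.
const subQueryUserGroups = db("user_group_membership").where("userId", "user-1").select("groupId");

// Awaiting `docs` executes the query; built here for illustration.
const docs = db("users")
  .where("users.id", "user-1")
  .leftJoin("group_project_membership", (qb) => {
    void qb
      .on("group_project_membership.projectId", db.raw("?", ["project-1"]))
      // knex's TS types expect an array here, hence the diff's @ts-expect-error;
      // at runtime a query builder is accepted and rendered as a subquery.
      .andOnIn("group_project_membership.groupId", subQueryUserGroups as unknown as string[]);
  })
  .select("*");
```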

View File

@@ -1,14 +1,7 @@
 import picomatch from "picomatch";
 import { z } from "zod";

-export enum PermissionConditionOperators {
-  $IN = "$in",
-  $ALL = "$all",
-  $REGEX = "$regex",
-  $EQ = "$eq",
-  $NEQ = "$ne",
-  $GLOB = "$glob"
-}
+import { PermissionConditionOperators } from "@app/lib/casl";

 export const PermissionConditionSchema = {
   [PermissionConditionOperators.$IN]: z.string().trim().min(1).array(),
View File

@@ -1,10 +1,10 @@
 import { AbilityBuilder, createMongoAbility, ForcedSubject, MongoAbility } from "@casl/ability";
 import { z } from "zod";

-import { conditionsMatcher } from "@app/lib/casl";
+import { conditionsMatcher, PermissionConditionOperators } from "@app/lib/casl";
 import { UnpackedPermissionSchema } from "@app/server/routes/santizedSchemas/permission";

-import { PermissionConditionOperators, PermissionConditionSchema } from "./permission-types";
+import { PermissionConditionSchema } from "./permission-types";

 export enum ProjectPermissionActions {
   Read = "read",

View File

@@ -46,7 +46,7 @@ export const rateLimitServiceFactory = ({ rateLimitDAL, licenseService }: TRateL
     }
     return rateLimit;
   } catch (err) {
-    logger.error("Error fetching rate limits %o", err);
+    logger.error(err, "Error fetching rate limits");
     return undefined;
   }
 };
@@ -69,12 +69,12 @@ export const rateLimitServiceFactory = ({ rateLimitDAL, licenseService }: TRateL
         mfaRateLimit: rateLimit.mfaRateLimit
       };

-      logger.info(`syncRateLimitConfiguration: rate limit configuration: %o`, newRateLimitMaxConfiguration);
+      logger.info(newRateLimitMaxConfiguration, "syncRateLimitConfiguration: rate limit configuration");

       Object.freeze(newRateLimitMaxConfiguration);
       rateLimitMaxConfiguration = newRateLimitMaxConfiguration;
     }
   } catch (error) {
-    logger.error(`Error syncing rate limit configurations: %o`, error);
+    logger.error(error, "Error syncing rate limit configurations");
   }
 };

View File

@@ -238,11 +238,11 @@ export const secretScanningQueueFactory = ({
   });

   queueService.listen(QueueName.SecretPushEventScan, "failed", (job, err) => {
-    logger.error("Failed to secret scan on push", job?.data, err);
+    logger.error(err, "Failed to secret scan on push", job?.data);
   });

   queueService.listen(QueueName.SecretFullRepoScan, "failed", (job, err) => {
-    logger.error("Failed to do full repo secret scan", job?.data, err);
+    logger.error(err, "Failed to do full repo secret scan", job?.data);
   });

   return { startFullRepoScan, startPushEventScan };

View File

@@ -391,6 +391,7 @@ export const PROJECTS = {
   CREATE: {
     organizationSlug: "The slug of the organization to create the project in.",
     projectName: "The name of the project to create.",
+    projectDescription: "An optional description label for the project.",
     slug: "An optional slug for the project.",
     template: "The name of the project template, if specified, to apply to this project."
   },
@@ -403,6 +404,7 @@ export const PROJECTS = {
   UPDATE: {
     workspaceId: "The ID of the project to update.",
     name: "The new name of the project.",
+    projectDescription: "An optional description label for the project.",
     autoCapitalization: "Disable or enable auto-capitalization for the project."
   },
   GET_KEY: {
@@ -1080,7 +1082,8 @@ export const INTEGRATION = {
       shouldDisableDelete: "The flag to disable deletion of secrets in AWS Parameter Store.",
       shouldMaskSecrets: "Specifies if the secrets synced from Infisical to Gitlab should be marked as 'Masked'.",
       shouldProtectSecrets: "Specifies if the secrets synced from Infisical to Gitlab should be marked as 'Protected'.",
-      shouldEnableDelete: "The flag to enable deletion of secrets."
+      shouldEnableDelete: "The flag to enable deletion of secrets.",
+      octopusDeployScopeValues: "Specifies the scope values to set on synced secrets to Octopus Deploy."
     }
   },
   UPDATE: {

View File

@@ -54,3 +54,12 @@ export const isAtLeastAsPrivileged = (permissions1: MongoAbility, permissions2:
   return set1.size >= set2.size;
 };
+
+export enum PermissionConditionOperators {
+  $IN = "$in",
+  $ALL = "$all",
+  $REGEX = "$regex",
+  $EQ = "$eq",
+  $NEQ = "$ne",
+  $GLOB = "$glob"
+}

View File

@@ -1,7 +1,7 @@
-import { Logger } from "pino";
 import { z } from "zod";

 import { removeTrailingSlash } from "../fn";
+import { CustomLogger } from "../logger/logger";
 import { zpStr } from "../zod";

 export const GITLAB_URL = "https://gitlab.com";
@@ -10,7 +10,7 @@ export const GITLAB_URL = "https://gitlab.com";
 export const IS_PACKAGED = (process as any)?.pkg !== undefined;

 const zodStrBool = z
-  .enum(["true", "false"])
+  .string()
   .optional()
   .transform((val) => val === "true");
@@ -212,7 +212,7 @@ let envCfg: Readonly<z.infer<typeof envSchema>>;
 export const getConfig = () => envCfg;

 // cannot import singleton logger directly as it needs config to load various transport
-export const initEnvConfig = (logger?: Logger) => {
+export const initEnvConfig = (logger?: CustomLogger) => {
   const parsedEnv = envSchema.safeParse(process.env);
   if (!parsedEnv.success) {
     (logger ?? console).error("Invalid environment variables. Check the error below");
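Behavioral note on the `zodStrBool` hunk above: the old `z.enum(["true", "false"])` rejected any other string outright, while the relaxed `z.string()` accepts anything and coerces everything except the exact string "true" to `false`:

```ts
import { z } from "zod";

const strict = z.enum(["true", "false"]).optional().transform((v) => v === "true");
const lenient = z.string().optional().transform((v) => v === "true");

strict.safeParse("True").success; // false - values outside the enum fail parsing
lenient.safeParse("True").success; // true  - parses, coerced to boolean false
lenient.parse(undefined); // false - unset env vars resolve to false
```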

View File

@@ -133,3 +133,15 @@ export class ScimRequestError extends Error {
     this.status = status;
   }
 }
+
+export class OidcAuthError extends Error {
+  name: string;
+
+  error: unknown;
+
+  constructor({ name, error, message }: { message?: string; name?: string; error?: unknown }) {
+    super(message || "Something went wrong");
+    this.name = name || "OidcAuthError";
+    this.error = error;
+  }
+}

View File

@@ -1,6 +1,8 @@
+/* eslint-disable @typescript-eslint/no-unsafe-argument */
 /* eslint-disable @typescript-eslint/no-unsafe-assignment */
 // logger follows a singleton pattern
 // easier to use it that's all.
+import { requestContext } from "@fastify/request-context";
 import pino, { Logger } from "pino";
 import { z } from "zod";
@@ -13,14 +15,37 @@ const logLevelToSeverityLookup: Record<string, string> = {
   "60": "CRITICAL"
 };

-// eslint-disable-next-line import/no-mutable-exports
-export let logger: Readonly<Logger>;
 // akhilmhdh:
 // The logger is not placed in the main app config to avoid a circular dependency.
 // The config requires the logger to display errors when an invalid environment is supplied.
 // On the other hand, the logger needs the config to obtain credentials for AWS or other transports.
 // By keeping the logger separate, it becomes an independent package.
+
+// We define our own custom logger interface to enforce structure to the logging methods.
+export interface CustomLogger extends Omit<Logger, "info" | "error" | "warn" | "debug"> {
+  info: {
+    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    (obj: unknown, msg?: string, ...args: any[]): void;
+  };
+  error: {
+    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    (obj: unknown, msg?: string, ...args: any[]): void;
+  };
+  warn: {
+    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    (obj: unknown, msg?: string, ...args: any[]): void;
+  };
+  debug: {
+    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    (obj: unknown, msg?: string, ...args: any[]): void;
+  };
+}
+
+// eslint-disable-next-line import/no-mutable-exports
+export let logger: Readonly<CustomLogger>;

 const loggerConfig = z.object({
   AWS_CLOUDWATCH_LOG_GROUP_NAME: z.string().default("infisical-log-stream"),
   AWS_CLOUDWATCH_LOG_REGION: z.string().default("us-east-1"),
@@ -62,6 +87,17 @@ const redactedKeys = [
   "config"
 ];

+const UNKNOWN_REQUEST_ID = "UNKNOWN_REQUEST_ID";
+
+const extractRequestId = () => {
+  try {
+    return requestContext.get("requestId") || UNKNOWN_REQUEST_ID;
+  } catch (err) {
+    console.log("failed to get request context", err);
+    return UNKNOWN_REQUEST_ID;
+  }
+};
+
 export const initLogger = async () => {
   const cfg = loggerConfig.parse(process.env);
   const targets: pino.TransportMultiOptions["targets"][number][] = [
@@ -94,6 +130,30 @@ export const initLogger = async () => {
     targets
   });

+  const wrapLogger = (originalLogger: Logger): CustomLogger => {
+    // eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
+    originalLogger.info = (obj: unknown, msg?: string, ...args: any[]) => {
+      return originalLogger.child({ requestId: extractRequestId() }).info(obj, msg, ...args);
+    };
+
+    // eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
+    originalLogger.error = (obj: unknown, msg?: string, ...args: any[]) => {
+      return originalLogger.child({ requestId: extractRequestId() }).error(obj, msg, ...args);
+    };
+
+    // eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
+    originalLogger.warn = (obj: unknown, msg?: string, ...args: any[]) => {
+      return originalLogger.child({ requestId: extractRequestId() }).warn(obj, msg, ...args);
+    };
+
+    // eslint-disable-next-line no-param-reassign, @typescript-eslint/no-explicit-any
+    originalLogger.debug = (obj: unknown, msg?: string, ...args: any[]) => {
+      return originalLogger.child({ requestId: extractRequestId() }).debug(obj, msg, ...args);
+    };
+
+    return originalLogger;
+  };
+
   logger = pino(
     {
       mixin(_context, level) {
@@ -113,5 +173,6 @@ export const initLogger = async () => {
     // eslint-disable-next-line @typescript-eslint/no-unsafe-argument
     transport
   );
-  return logger;
+  return wrapLogger(logger);
 };
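A brief usage sketch of the wrapped logger, assuming initLogger() has run and the call happens inside a Fastify request (field values are illustrative):

// Every call on the wrapped logger is stamped with the propagated request ID
// via the child logger created in wrapLogger:
logger.info({ userId: "user_123" }, "fetched user");
// -> emits { "requestId": "req-AbC123XyZ45678", "userId": "user_123", "msg": "fetched user" }
// Outside a request context, extractRequestId() falls back to "UNKNOWN_REQUEST_ID".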

View File

@@ -10,13 +10,15 @@ import fastifyFormBody from "@fastify/formbody";
 import helmet from "@fastify/helmet";
 import type { FastifyRateLimitOptions } from "@fastify/rate-limit";
 import ratelimiter from "@fastify/rate-limit";
+import { fastifyRequestContext } from "@fastify/request-context";
 import fastify from "fastify";
 import { Knex } from "knex";
-import { Logger } from "pino";

 import { HsmModule } from "@app/ee/services/hsm/hsm-types";
 import { TKeyStoreFactory } from "@app/keystore/keystore";
 import { getConfig, IS_PACKAGED } from "@app/lib/config/env";
+import { CustomLogger } from "@app/lib/logger/logger";
+import { alphaNumericNanoId } from "@app/lib/nanoid";
 import { TQueueServiceFactory } from "@app/queue";
 import { TSmtpService } from "@app/services/smtp/smtp-service";
@@ -35,7 +37,7 @@ type TMain = {
   auditLogDb?: Knex;
   db: Knex;
   smtp: TSmtpService;
-  logger?: Logger;
+  logger?: CustomLogger;
   queue: TQueueServiceFactory;
   keyStore: TKeyStoreFactory;
   hsmModule: HsmModule;
@@ -47,7 +49,9 @@ export const main = async ({ db, hsmModule, auditLogDb, smtp, logger, queue, key
   const server = fastify({
     logger: appCfg.NODE_ENV === "test" ? false : logger,
+    genReqId: () => `req-${alphaNumericNanoId(14)}`,
     trustProxy: true,
     connectionTimeout: appCfg.isHsmConfigured ? 90_000 : 30_000,
     ignoreTrailingSlash: true,
     pluginTimeout: 40_000
@@ -104,6 +108,13 @@ export const main = async ({ db, hsmModule, auditLogDb, smtp, logger, queue, key
   await server.register(maintenanceMode);

+  await server.register(fastifyRequestContext, {
+    defaultStoreValues: (request) => ({
+      requestId: request.id,
+      log: request.log.child({ requestId: request.id })
+    })
+  });
+
   await server.register(registerRoutes, { smtp, queue, db, auditLogDb, keyStore, hsmModule });

   if (appCfg.isProductionMode) {
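A hedged sketch of what the request-context registration above enables downstream; the route here is hypothetical:

import { requestContext } from "@fastify/request-context";

// Any handler or service running after the plugin registration can read the
// request-scoped store without threading values through call signatures:
server.get("/whoami-request", async () => {
  const requestId = requestContext.get("requestId"); // e.g. "req-AbC123XyZ45678" from genReqId
  return { requestId };
});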

View File

@@ -46,10 +46,10 @@ export const bootstrapCheck = async ({ db }: BootstrapOpt) => {
   await createTransport(smtpCfg)
     .verify()
     .then(async () => {
-      console.info("SMTP successfully connected");
+      console.info(`SMTP - Verified connection to ${appCfg.SMTP_HOST}:${appCfg.SMTP_PORT}`);
     })
-    .catch((err) => {
-      console.error(`SMTP - Failed to connect to ${appCfg.SMTP_HOST}:${appCfg.SMTP_PORT}`);
+    .catch((err: Error) => {
+      console.error(`SMTP - Failed to connect to ${appCfg.SMTP_HOST}:${appCfg.SMTP_PORT} - ${err.message}`);
       logger.error(err);
     });

View File

@@ -1,4 +1,4 @@
-import { ForbiddenError } from "@casl/ability";
+import { ForbiddenError, PureAbility } from "@casl/ability";
 import fastifyPlugin from "fastify-plugin";
 import jwt from "jsonwebtoken";
 import { ZodError } from "zod";
@@ -10,6 +10,7 @@ import {
   GatewayTimeoutError,
   InternalServerError,
   NotFoundError,
+  OidcAuthError,
   RateLimitError,
   ScimRequestError,
   UnauthorizedError
@@ -38,74 +39,102 @@ export const fastifyErrHandler = fastifyPlugin(async (server: FastifyZodProvider
   if (error instanceof BadRequestError) {
     void res
       .status(HttpStatusCodes.BadRequest)
-      .send({ statusCode: HttpStatusCodes.BadRequest, message: error.message, error: error.name });
+      .send({ requestId: req.id, statusCode: HttpStatusCodes.BadRequest, message: error.message, error: error.name });
   } else if (error instanceof NotFoundError) {
     void res
       .status(HttpStatusCodes.NotFound)
-      .send({ statusCode: HttpStatusCodes.NotFound, message: error.message, error: error.name });
+      .send({ requestId: req.id, statusCode: HttpStatusCodes.NotFound, message: error.message, error: error.name });
   } else if (error instanceof UnauthorizedError) {
-    void res
-      .status(HttpStatusCodes.Unauthorized)
-      .send({ statusCode: HttpStatusCodes.Unauthorized, message: error.message, error: error.name });
+    void res.status(HttpStatusCodes.Unauthorized).send({
+      requestId: req.id,
+      statusCode: HttpStatusCodes.Unauthorized,
+      message: error.message,
+      error: error.name
+    });
   } else if (error instanceof DatabaseError || error instanceof InternalServerError) {
-    void res
-      .status(HttpStatusCodes.InternalServerError)
-      .send({ statusCode: HttpStatusCodes.InternalServerError, message: "Something went wrong", error: error.name });
+    void res.status(HttpStatusCodes.InternalServerError).send({
+      requestId: req.id,
+      statusCode: HttpStatusCodes.InternalServerError,
+      message: "Something went wrong",
+      error: error.name
+    });
   } else if (error instanceof GatewayTimeoutError) {
-    void res
-      .status(HttpStatusCodes.GatewayTimeout)
-      .send({ statusCode: HttpStatusCodes.GatewayTimeout, message: error.message, error: error.name });
+    void res.status(HttpStatusCodes.GatewayTimeout).send({
+      requestId: req.id,
+      statusCode: HttpStatusCodes.GatewayTimeout,
+      message: error.message,
+      error: error.name
+    });
   } else if (error instanceof ZodError) {
-    void res
-      .status(HttpStatusCodes.Unauthorized)
-      .send({ statusCode: HttpStatusCodes.Unauthorized, error: "ValidationFailure", message: error.issues });
+    void res.status(HttpStatusCodes.Unauthorized).send({
+      requestId: req.id,
+      statusCode: HttpStatusCodes.Unauthorized,
+      error: "ValidationFailure",
+      message: error.issues
+    });
   } else if (error instanceof ForbiddenError) {
     void res.status(HttpStatusCodes.Forbidden).send({
+      requestId: req.id,
       statusCode: HttpStatusCodes.Forbidden,
       error: "PermissionDenied",
-      message: `You are not allowed to ${error.action} on ${error.subjectType} - ${JSON.stringify(error.subject)}`
+      message: `You are not allowed to ${error.action} on ${error.subjectType}`,
+      details: (error.ability as PureAbility).rulesFor(error.action as string, error.subjectType).map((el) => ({
+        action: el.action,
+        inverted: el.inverted,
+        subject: el.subject,
+        conditions: el.conditions
+      }))
     });
   } else if (error instanceof ForbiddenRequestError) {
     void res.status(HttpStatusCodes.Forbidden).send({
+      requestId: req.id,
       statusCode: HttpStatusCodes.Forbidden,
       message: error.message,
       error: error.name
     });
   } else if (error instanceof RateLimitError) {
     void res.status(HttpStatusCodes.TooManyRequests).send({
+      requestId: req.id,
      statusCode: HttpStatusCodes.TooManyRequests,
      message: error.message,
      error: error.name
    });
   } else if (error instanceof ScimRequestError) {
     void res.status(error.status).send({
+      requestId: req.id,
       schemas: error.schemas,
       status: error.status,
       detail: error.detail
     });
-    // Handle JWT errors and make them more human-readable for the end-user.
+  } else if (error instanceof OidcAuthError) {
+    void res.status(HttpStatusCodes.InternalServerError).send({
+      requestId: req.id,
+      statusCode: HttpStatusCodes.InternalServerError,
+      message: error.message,
+      error: error.name
+    });
   } else if (error instanceof jwt.JsonWebTokenError) {
-    const message = (() => {
-      if (error.message === JWTErrors.JwtExpired) {
-        return "Your token has expired. Please re-authenticate.";
-      }
-      if (error.message === JWTErrors.JwtMalformed) {
-        return "The provided access token is malformed. Please use a valid token or generate a new one and try again.";
-      }
-      if (error.message === JWTErrors.InvalidAlgorithm) {
-        return "The access token is signed with an invalid algorithm. Please provide a valid token and try again.";
-      }
-      return error.message;
-    })();
+    let errorMessage = error.message;
+
+    if (error.message === JWTErrors.JwtExpired) {
+      errorMessage = "Your token has expired. Please re-authenticate.";
+    } else if (error.message === JWTErrors.JwtMalformed) {
+      errorMessage =
+        "The provided access token is malformed. Please use a valid token or generate a new one and try again.";
+    } else if (error.message === JWTErrors.InvalidAlgorithm) {
+      errorMessage =
+        "The access token is signed with an invalid algorithm. Please provide a valid token and try again.";
+    }
+
     void res.status(HttpStatusCodes.Forbidden).send({
+      requestId: req.id,
       statusCode: HttpStatusCodes.Forbidden,
       error: "TokenError",
-      message
+      message: errorMessage
     });
   } else {
     void res.status(HttpStatusCodes.InternalServerError).send({
+      requestId: req.id,
       statusCode: HttpStatusCodes.InternalServerError,
       error: "InternalServerError",
       message: "Something went wrong"
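Taken together, every error path now returns a body of the same shape. An illustrative 404 payload, with invented values:

// {
//   "requestId": "req-AbC123XyZ45678",
//   "statusCode": 404,
//   "message": "Integration auth with ID 'abc' not found",
//   "error": "NotFoundError"
// }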

View File

@@ -19,7 +19,7 @@ export const registerSecretScannerGhApp = async (server: FastifyZodProvider) =>
   app.on("installation", async (context) => {
     const { payload } = context;
-    logger.info("Installed secret scanner to:", { repositories: payload.repositories });
+    logger.info({ repositories: payload.repositories }, "Installed secret scanner to");
   });

   app.on("push", async (context) => {
app.on("push", async (context) => { app.on("push", async (context) => {

View File

@@ -30,26 +30,32 @@ export const integrationAuthPubSchema = IntegrationAuthsSchema.pick({
 export const DefaultResponseErrorsSchema = {
   400: z.object({
+    requestId: z.string(),
     statusCode: z.literal(400),
     message: z.string(),
     error: z.string()
   }),
   404: z.object({
+    requestId: z.string(),
     statusCode: z.literal(404),
     message: z.string(),
     error: z.string()
   }),
   401: z.object({
+    requestId: z.string(),
     statusCode: z.literal(401),
     message: z.any(),
     error: z.string()
   }),
   403: z.object({
+    requestId: z.string(),
     statusCode: z.literal(403),
     message: z.string(),
+    details: z.any().optional(),
     error: z.string()
   }),
   500: z.object({
+    requestId: z.string(),
     statusCode: z.literal(500),
     message: z.string(),
     error: z.string()
@@ -206,6 +212,7 @@ export const SanitizedAuditLogStreamSchema = z.object({
 export const SanitizedProjectSchema = ProjectsSchema.pick({
   id: true,
   name: true,
+  description: true,
   slug: true,
   autoCapitalization: true,
   orgId: true,

View File

@@ -5,6 +5,7 @@ import { INTEGRATION_AUTH } from "@app/lib/api-docs";
 import { readLimit, writeLimit } from "@app/server/config/rateLimiter";
 import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
 import { AuthMode } from "@app/services/auth/auth-type";
+import { OctopusDeployScope } from "@app/services/integration-auth/integration-auth-types";

 import { integrationAuthPubSchema } from "../sanitizedSchemas";
@@ -1008,4 +1009,118 @@ export const registerIntegrationAuthRouter = async (server: FastifyZodProvider)
       return { buildConfigs };
     }
   });
+
+  server.route({
+    method: "GET",
+    url: "/:integrationAuthId/octopus-deploy/scope-values",
+    config: {
+      rateLimit: readLimit
+    },
+    onRequest: verifyAuth([AuthMode.JWT]),
+    schema: {
+      params: z.object({
+        integrationAuthId: z.string().trim()
+      }),
+      querystring: z.object({
+        scope: z.nativeEnum(OctopusDeployScope),
+        spaceId: z.string().trim(),
+        resourceId: z.string().trim()
+      }),
+      response: {
+        200: z.object({
+          Environments: z
+            .object({
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array(),
+          Machines: z
+            .object({
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array(),
+          Actions: z
+            .object({
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array(),
+          Roles: z
+            .object({
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array(),
+          Channels: z
+            .object({
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array(),
+          TenantTags: z
+            .object({
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array(),
+          Processes: z
+            .object({
+              ProcessType: z.string(),
+              Name: z.string(),
+              Id: z.string()
+            })
+            .array()
+        })
+      }
+    },
+    handler: async (req) => {
+      const scopeValues = await server.services.integrationAuth.getOctopusDeployScopeValues({
+        actorId: req.permission.id,
+        actor: req.permission.type,
+        actorAuthMethod: req.permission.authMethod,
+        actorOrgId: req.permission.orgId,
+        id: req.params.integrationAuthId,
+        scope: req.query.scope,
+        spaceId: req.query.spaceId,
+        resourceId: req.query.resourceId
+      });
+
+      return scopeValues;
+    }
+  });
+
+  server.route({
+    method: "GET",
+    url: "/:integrationAuthId/octopus-deploy/spaces",
+    config: {
+      rateLimit: readLimit
+    },
+    onRequest: verifyAuth([AuthMode.JWT]),
+    schema: {
+      params: z.object({
+        integrationAuthId: z.string().trim()
+      }),
+      response: {
+        200: z.object({
+          spaces: z
+            .object({
+              Name: z.string(),
+              Id: z.string(),
+              IsDefault: z.boolean()
+            })
+            .array()
+        })
+      }
+    },
+    handler: async (req) => {
+      const spaces = await server.services.integrationAuth.getOctopusDeploySpaces({
+        actorId: req.permission.id,
+        actor: req.permission.type,
+        actorAuthMethod: req.permission.authMethod,
+        actorOrgId: req.permission.orgId,
+        id: req.params.integrationAuthId
+      });
+
+      return { spaces };
+    }
+  });
 };
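A hedged client-side sketch of calling the new spaces endpoint; the /api/v1/integration-auth mount prefix is an assumption, since the router's mount path is not shown in this diff:

// Assumes baseUrl, integrationAuthId, and a JWT are in scope.
const res = await fetch(`${baseUrl}/api/v1/integration-auth/${integrationAuthId}/octopus-deploy/spaces`, {
  headers: { Authorization: `Bearer ${jwt}` }
});
const { spaces } = (await res.json()) as { spaces: { Name: string; Id: string; IsDefault: boolean }[] };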

View File

@@ -9,6 +9,7 @@ import { getTelemetryDistinctId } from "@app/server/lib/telemetry";
 import { verifyAuth } from "@app/server/plugins/auth/verify-auth";
 import { AuthMode } from "@app/services/auth/auth-type";
 import { IntegrationMetadataSchema } from "@app/services/integration/integration-schema";
+import { Integrations } from "@app/services/integration-auth/integration-list";
 import { PostHogEventTypes, TIntegrationCreatedEvent } from "@app/services/telemetry/telemetry-types";

 import {} from "../sanitizedSchemas";
@@ -206,6 +207,33 @@ export const registerIntegrationRouter = async (server: FastifyZodProvider) => {
         id: req.params.integrationId
       });

+      if (integration.region) {
+        integration.metadata = {
+          ...(integration.metadata || {}),
+          region: integration.region
+        };
+      }
+
+      if (
+        integration.integration === Integrations.AWS_SECRET_MANAGER ||
+        integration.integration === Integrations.AWS_PARAMETER_STORE
+      ) {
+        const awsRoleDetails = await server.services.integration.getIntegrationAWSIamRole({
+          actorId: req.permission.id,
+          actor: req.permission.type,
+          actorAuthMethod: req.permission.authMethod,
+          actorOrgId: req.permission.orgId,
+          id: req.params.integrationId
+        });
+
+        if (awsRoleDetails) {
+          integration.metadata = {
+            ...(integration.metadata || {}),
+            awsIamRole: awsRoleDetails.role
+          };
+        }
+      }
+
       return { integration };
     }
   });

View File

@@ -296,6 +296,12 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
           .max(64, { message: "Name must be 64 or fewer characters" })
           .optional()
           .describe(PROJECTS.UPDATE.name),
+        description: z
+          .string()
+          .trim()
+          .max(256, { message: "Description must be 256 or fewer characters" })
+          .optional()
+          .describe(PROJECTS.UPDATE.projectDescription),
         autoCapitalization: z.boolean().optional().describe(PROJECTS.UPDATE.autoCapitalization)
       }),
       response: {
@@ -313,6 +319,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
         },
         update: {
           name: req.body.name,
+          description: req.body.description,
           autoCapitalization: req.body.autoCapitalization
         },
         actorAuthMethod: req.permission.authMethod,

View File

@@ -27,7 +27,7 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
       body: z.object({
         emails: z.string().email().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.emails),
         usernames: z.string().array().default([]).describe(PROJECT_USERS.INVITE_MEMBER.usernames),
-        roleSlugs: z.string().array().optional().describe(PROJECT_USERS.INVITE_MEMBER.roleSlugs)
+        roleSlugs: z.string().array().min(1).optional().describe(PROJECT_USERS.INVITE_MEMBER.roleSlugs)
       }),
       response: {
         200: z.object({
@@ -49,7 +49,7 @@ export const registerProjectMembershipRouter = async (server: FastifyZodProvider
         projects: [
           {
             id: req.params.projectId,
-            projectRoleSlug: [ProjectMembershipRole.Member]
+            projectRoleSlug: req.body.roleSlugs || [ProjectMembershipRole.Member]
           }
         ]
       });
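An illustrative invite body under the updated schema (slugs invented for the example); when roleSlugs is present it must be non-empty and replaces the default Member role:

// POST body:
// { "emails": ["dev@example.com"], "usernames": [], "roleSlugs": ["auditor"] }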

View File

@@ -161,6 +161,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
       ],
       body: z.object({
         projectName: z.string().trim().describe(PROJECTS.CREATE.projectName),
+        projectDescription: z.string().trim().optional().describe(PROJECTS.CREATE.projectDescription),
         slug: z
           .string()
           .min(5)
@@ -194,6 +195,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
         actorOrgId: req.permission.orgId,
         actorAuthMethod: req.permission.authMethod,
         workspaceName: req.body.projectName,
+        workspaceDescription: req.body.projectDescription,
         slug: req.body.slug,
         kmsKeyId: req.body.kmsKeyId,
         template: req.body.template
@@ -312,8 +314,9 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
         slug: slugSchema.describe("The slug of the project to update.")
       }),
       body: z.object({
-        name: z.string().trim().optional().describe("The new name of the project."),
-        autoCapitalization: z.boolean().optional().describe("The new auto-capitalization setting.")
+        name: z.string().trim().optional().describe(PROJECTS.UPDATE.name),
+        description: z.string().trim().optional().describe(PROJECTS.UPDATE.projectDescription),
+        autoCapitalization: z.boolean().optional().describe(PROJECTS.UPDATE.autoCapitalization)
       }),
       response: {
         200: SanitizedProjectSchema
@@ -330,6 +333,7 @@ export const registerProjectRouter = async (server: FastifyZodProvider) => {
         },
         update: {
           name: req.body.name,
+          description: req.body.description,
           autoCapitalization: req.body.autoCapitalization
         },
         actorId: req.permission.id,

View File

@@ -119,13 +119,6 @@ export const registerSignupRouter = async (server: FastifyZodProvider) => {
       if (!userAgent) throw new Error("user agent header is required");
       const appCfg = getConfig();
-      const serverCfg = await getServerCfg();
-      if (!serverCfg.allowSignUp) {
-        throw new ForbiddenRequestError({
-          message: "Signup's are disabled"
-        });
-      }
-
       const { user, accessToken, refreshToken, organizationId } =
         await server.services.signup.completeEmailAccountSignup({
           ...req.body,

View File

@@ -9,7 +9,7 @@ import { isAuthMethodSaml } from "@app/ee/services/permission/permission-fns";
 import { getConfig } from "@app/lib/config/env";
 import { infisicalSymmetricDecrypt, infisicalSymmetricEncypt } from "@app/lib/crypto/encryption";
 import { generateUserSrpKeys, getUserPrivateKey } from "@app/lib/crypto/srp";
-import { NotFoundError } from "@app/lib/errors";
+import { ForbiddenRequestError, NotFoundError } from "@app/lib/errors";
 import { isDisposableEmail } from "@app/lib/validator";
 import { TGroupProjectDALFactory } from "@app/services/group-project/group-project-dal";
 import { TProjectDALFactory } from "@app/services/project/project-dal";
@@ -23,6 +23,7 @@ import { TOrgServiceFactory } from "../org/org-service";
 import { TProjectMembershipDALFactory } from "../project-membership/project-membership-dal";
 import { TProjectUserMembershipRoleDALFactory } from "../project-membership/project-user-membership-role-dal";
 import { SmtpTemplates, TSmtpService } from "../smtp/smtp-service";
+import { getServerCfg } from "../super-admin/super-admin-service";
 import { TUserDALFactory } from "../user/user-dal";
 import { UserEncryption } from "../user/user-types";
 import { TAuthDALFactory } from "./auth-dal";
@@ -151,6 +152,8 @@ export const authSignupServiceFactory = ({
     authorization
   }: TCompleteAccountSignupDTO) => {
     const appCfg = getConfig();
+    const serverCfg = await getServerCfg();
+
     const user = await userDAL.findOne({ username: email });
     if (!user || (user && user.isAccepted)) {
       throw new Error("Failed to complete account for complete user");
@@ -163,6 +166,12 @@ export const authSignupServiceFactory = ({
       authMethod = userAuthMethod;
       organizationId = orgId;
     } else {
+      // disallow signup if disabled. we are not doing this for providerAuthToken because we allow signups via saml or sso
+      if (!serverCfg.allowSignUp) {
+        throw new ForbiddenRequestError({
+          message: "Signup's are disabled"
+        });
+      }
       validateSignUpAuthorization(authorization, user.id);
     }

View File

@@ -29,7 +29,7 @@ import {
 } from "./identity-aws-auth-types";

 type TIdentityAwsAuthServiceFactoryDep = {
-  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
+  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create" | "delete">;
   identityAwsAuthDAL: Pick<TIdentityAwsAuthDALFactory, "findOne" | "transaction" | "create" | "updateById" | "delete">;
   identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
   licenseService: Pick<TLicenseServiceFactory, "getPlan">;
@@ -346,6 +346,8 @@ export const identityAwsAuthServiceFactory = ({
     const revokedIdentityAwsAuth = await identityAwsAuthDAL.transaction(async (tx) => {
       const deletedAwsAuth = await identityAwsAuthDAL.delete({ identityId }, tx);
+      await identityAccessTokenDAL.delete({ identityId, authMethod: IdentityAuthMethod.AWS_AUTH }, tx);
+
       return { ...deletedAwsAuth?.[0], orgId: identityMembershipOrg.orgId };
     });
     return revokedIdentityAwsAuth;

View File

@@ -30,7 +30,7 @@ type TIdentityAzureAuthServiceFactoryDep = {
     "findOne" | "transaction" | "create" | "updateById" | "delete"
   >;
   identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
-  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
+  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create" | "delete">;
   permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
   licenseService: Pick<TLicenseServiceFactory, "getPlan">;
 };
@@ -70,7 +70,9 @@ export const identityAzureAuthServiceFactory = ({
       .map((servicePrincipalId) => servicePrincipalId.trim())
      .some((servicePrincipalId) => servicePrincipalId === azureIdentity.oid);

-    if (!isServicePrincipalAllowed) throw new UnauthorizedError({ message: "Service principal not allowed" });
+    if (!isServicePrincipalAllowed) {
+      throw new UnauthorizedError({ message: `Service principal '${azureIdentity.oid}' not allowed` });
+    }
   }

   const identityAccessToken = await identityAzureAuthDAL.transaction(async (tx) => {
@@ -317,6 +319,8 @@ export const identityAzureAuthServiceFactory = ({
     const revokedIdentityAzureAuth = await identityAzureAuthDAL.transaction(async (tx) => {
       const deletedAzureAuth = await identityAzureAuthDAL.delete({ identityId }, tx);
+      await identityAccessTokenDAL.delete({ identityId, authMethod: IdentityAuthMethod.AZURE_AUTH }, tx);
+
       return { ...deletedAzureAuth?.[0], orgId: identityMembershipOrg.orgId };
     });
     return revokedIdentityAzureAuth;

View File

@@ -28,7 +28,7 @@ import {
 type TIdentityGcpAuthServiceFactoryDep = {
   identityGcpAuthDAL: Pick<TIdentityGcpAuthDALFactory, "findOne" | "transaction" | "create" | "updateById" | "delete">;
   identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
-  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
+  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create" | "delete">;
   permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
   licenseService: Pick<TLicenseServiceFactory, "getPlan">;
 };
@@ -365,6 +365,8 @@ export const identityGcpAuthServiceFactory = ({
     const revokedIdentityGcpAuth = await identityGcpAuthDAL.transaction(async (tx) => {
       const deletedGcpAuth = await identityGcpAuthDAL.delete({ identityId }, tx);
+      await identityAccessTokenDAL.delete({ identityId, authMethod: IdentityAuthMethod.GCP_AUTH }, tx);
+
       return { ...deletedGcpAuth?.[0], orgId: identityMembershipOrg.orgId };
     });
     return revokedIdentityGcpAuth;

View File

@@ -41,7 +41,7 @@ type TIdentityKubernetesAuthServiceFactoryDep = {
     TIdentityKubernetesAuthDALFactory,
     "create" | "findOne" | "transaction" | "updateById" | "delete"
   >;
-  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
+  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create" | "delete">;
   identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne" | "findById">;
   orgBotDAL: Pick<TOrgBotDALFactory, "findOne" | "transaction" | "create">;
   permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
@@ -622,6 +622,7 @@ export const identityKubernetesAuthServiceFactory = ({
     const revokedIdentityKubernetesAuth = await identityKubernetesAuthDAL.transaction(async (tx) => {
       const deletedKubernetesAuth = await identityKubernetesAuthDAL.delete({ identityId }, tx);
+      await identityAccessTokenDAL.delete({ identityId, authMethod: IdentityAuthMethod.KUBERNETES_AUTH }, tx);
       return { ...deletedKubernetesAuth?.[0], orgId: identityMembershipOrg.orgId };
     });
     return revokedIdentityKubernetesAuth;

View File

@@ -39,7 +39,7 @@ import {
 type TIdentityOidcAuthServiceFactoryDep = {
   identityOidcAuthDAL: TIdentityOidcAuthDALFactory;
   identityOrgMembershipDAL: Pick<TIdentityOrgDALFactory, "findOne">;
-  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create">;
+  identityAccessTokenDAL: Pick<TIdentityAccessTokenDALFactory, "create" | "delete">;
   permissionService: Pick<TPermissionServiceFactory, "getOrgPermission">;
   licenseService: Pick<TLicenseServiceFactory, "getPlan">;
   orgBotDAL: Pick<TOrgBotDALFactory, "findOne" | "transaction" | "create">;
@@ -539,6 +539,8 @@ export const identityOidcAuthServiceFactory = ({
     const revokedIdentityOidcAuth = await identityOidcAuthDAL.transaction(async (tx) => {
       const deletedOidcAuth = await identityOidcAuthDAL.delete({ identityId }, tx);
+      await identityAccessTokenDAL.delete({ identityId, authMethod: IdentityAuthMethod.OIDC_AUTH }, tx);
+
       return { ...deletedOidcAuth?.[0], orgId: identityMembershipOrg.orgId };
     });

View File

@@ -182,7 +182,12 @@ export const identityProjectServiceFactory = ({
     // validate custom roles input
     const customInputRoles = roles.filter(
-      ({ role }) => !Object.values(ProjectMembershipRole).includes(role as ProjectMembershipRole)
+      ({ role }) =>
+        !Object.values(ProjectMembershipRole)
+          // we don't want to include custom in this check;
+          // this unintentionally enables setting slug to custom which is reserved
+          .filter((r) => r !== ProjectMembershipRole.Custom)
+          .includes(role as ProjectMembershipRole)
     );

     const hasCustomRole = Boolean(customInputRoles.length);
     const customRoles = hasCustomRole
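A one-line illustration of the bug the filter closes, assuming ProjectMembershipRole.Custom is the string "custom" (which the new filter implies):

// Old check: "custom" is itself a member of the enum, so a request using the
// reserved slug passed as a built-in role and skipped custom-role validation.
Object.values(ProjectMembershipRole).includes("custom" as ProjectMembershipRole); // true
// New check: "custom" is filtered out first, so it is routed through custom-role validation.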

View File

@@ -385,8 +385,8 @@ export const identityTokenAuthServiceFactory = ({
     actorOrgId
   }: TUpdateTokenAuthTokenDTO) => {
     const foundToken = await identityAccessTokenDAL.findOne({
-      id: tokenId,
-      authMethod: IdentityAuthMethod.TOKEN_AUTH
+      [`${TableName.IdentityAccessToken}.id` as "id"]: tokenId,
+      [`${TableName.IdentityAccessToken}.authMethod` as "authMethod"]: IdentityAuthMethod.TOKEN_AUTH
     });
     if (!foundToken) throw new NotFoundError({ message: `Token with ID ${tokenId} not found` });
@@ -444,8 +444,8 @@ export const identityTokenAuthServiceFactory = ({
     }: TRevokeTokenAuthTokenDTO) => {
     const identityAccessToken = await identityAccessTokenDAL.findOne({
       [`${TableName.IdentityAccessToken}.id` as "id"]: tokenId,
-      isAccessTokenRevoked: false,
-      authMethod: IdentityAuthMethod.TOKEN_AUTH
+      [`${TableName.IdentityAccessToken}.isAccessTokenRevoked` as "isAccessTokenRevoked"]: false,
+      [`${TableName.IdentityAccessToken}.authMethod` as "authMethod"]: IdentityAuthMethod.TOKEN_AUTH
     });
     if (!identityAccessToken)
       throw new NotFoundError({
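The bracketed keys look odd but appear to serve a purpose; a hedged sketch of the likely motivation, with the generated SQL approximated:

// When the DAL's findOne builds a query that joins identity_access_token with
// other tables that also carry "id" or "authMethod" columns, an unqualified
// key is ambiguous at the SQL layer. The template-literal key emits a fully
// qualified column while the `as "id"` cast keeps the filter type-compatible:
//   WHERE "identity_access_token"."id" = :tokenId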

View File

@@ -1,6 +1,7 @@
 /* eslint-disable no-await-in-loop */
 import { createAppAuth } from "@octokit/auth-app";
 import { Octokit } from "@octokit/rest";
+import { Client as OctopusDeployClient, ProjectRepository as OctopusDeployRepository } from "@octopusdeploy/api-client";

 import { TIntegrationAuths } from "@app/db/schemas";
 import { getConfig } from "@app/lib/config/env";
@@ -1087,6 +1088,33 @@ const getAppsAzureDevOps = async ({ accessToken, orgName }: { accessToken: strin
   return apps;
 };

+const getAppsOctopusDeploy = async ({
+  apiKey,
+  instanceURL,
+  spaceName = "Default"
+}: {
+  apiKey: string;
+  instanceURL: string;
+  spaceName?: string;
+}) => {
+  const client = await OctopusDeployClient.create({
+    instanceURL,
+    apiKey,
+    userAgentApp: "Infisical Integration"
+  });
+
+  const repository = new OctopusDeployRepository(client, spaceName);
+
+  const projects = await repository.list({
+    take: 1000
+  });
+
+  return projects.Items.map((project) => ({
+    name: project.Name,
+    appId: project.Id
+  }));
+};
+
 export const getApps = async ({
   integration,
   integrationAuth,
@@ -1260,6 +1288,13 @@ export const getApps = async ({
         orgName: azureDevOpsOrgName as string
       });

+    case Integrations.OCTOPUS_DEPLOY:
+      return getAppsOctopusDeploy({
+        apiKey: accessToken,
+        instanceURL: url!,
+        spaceName: workspaceSlug
+      });
+
     default:
       throw new NotFoundError({ message: `Integration '${integration}' not found` });
   }
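A usage sketch of the new module-private helper, with the instance URL and key invented for illustration:

// Lists the Octopus Deploy projects that back the integration's app picker.
const apps = await getAppsOctopusDeploy({
  apiKey: "API-XXXXXXXX", // illustrative key
  instanceURL: "https://example.octopus.app"
});
// -> [{ name: "Web App", appId: "Projects-1" }, ...] (capped at the first 1000 projects)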

View File

@@ -1,6 +1,7 @@
 import { ForbiddenError } from "@casl/ability";
 import { createAppAuth } from "@octokit/auth-app";
 import { Octokit } from "@octokit/rest";
+import { Client as OctopusClient, SpaceRepository as OctopusSpaceRepository } from "@octopusdeploy/api-client";
 import AWS from "aws-sdk";

 import { SecretEncryptionAlgo, SecretKeyEncoding, TIntegrationAuths, TIntegrationAuthsInsert } from "@app/db/schemas";
@@ -9,7 +10,7 @@ import { ProjectPermissionActions, ProjectPermissionSub } from "@app/ee/services
 import { getConfig } from "@app/lib/config/env";
 import { request } from "@app/lib/config/request";
 import { decryptSymmetric128BitHexKeyUTF8, encryptSymmetric128BitHexKeyUTF8 } from "@app/lib/crypto";
-import { BadRequestError, NotFoundError } from "@app/lib/errors";
+import { BadRequestError, InternalServerError, NotFoundError } from "@app/lib/errors";
 import { TGenericPermission, TProjectPermission } from "@app/lib/types";

 import { TIntegrationDALFactory } from "../integration/integration-dal";
@@ -20,6 +21,7 @@ import { getApps } from "./integration-app-list";
 import { TIntegrationAuthDALFactory } from "./integration-auth-dal";
 import { IntegrationAuthMetadataSchema, TIntegrationAuthMetadata } from "./integration-auth-schema";
 import {
+  OctopusDeployScope,
   TBitbucketEnvironment,
   TBitbucketWorkspace,
   TChecklyGroups,
@@ -38,6 +40,8 @@ import {
   TIntegrationAuthGithubOrgsDTO,
   TIntegrationAuthHerokuPipelinesDTO,
   TIntegrationAuthNorthflankSecretGroupDTO,
+  TIntegrationAuthOctopusDeployProjectScopeValuesDTO,
+  TIntegrationAuthOctopusDeploySpacesDTO,
   TIntegrationAuthQoveryEnvironmentsDTO,
   TIntegrationAuthQoveryOrgsDTO,
   TIntegrationAuthQoveryProjectDTO,
@@ -48,6 +52,7 @@ import {
   TIntegrationAuthVercelBranchesDTO,
   TNorthflankSecretGroup,
   TOauthExchangeDTO,
+  TOctopusDeployVariableSet,
   TSaveIntegrationAccessTokenDTO,
   TTeamCityBuildConfig,
   TVercelBranches
@@ -1521,6 +1526,88 @@ export const integrationAuthServiceFactory = ({
     return integrationAuthDAL.create(newIntegrationAuth);
   };

+  const getOctopusDeploySpaces = async ({
+    actorId,
+    actor,
+    actorOrgId,
+    actorAuthMethod,
+    id
+  }: TIntegrationAuthOctopusDeploySpacesDTO) => {
+    const integrationAuth = await integrationAuthDAL.findById(id);
+    if (!integrationAuth) throw new NotFoundError({ message: `Integration auth with ID '${id}' not found` });
+
+    const { permission } = await permissionService.getProjectPermission(
+      actor,
+      actorId,
+      integrationAuth.projectId,
+      actorAuthMethod,
+      actorOrgId
+    );
+    ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
+
+    const { shouldUseSecretV2Bridge, botKey } = await projectBotService.getBotKey(integrationAuth.projectId);
+    const { accessToken } = await getIntegrationAccessToken(integrationAuth, shouldUseSecretV2Bridge, botKey);
+
+    const client = await OctopusClient.create({
+      apiKey: accessToken,
+      instanceURL: integrationAuth.url!,
+      userAgentApp: "Infisical Integration"
+    });
+
+    const spaceRepository = new OctopusSpaceRepository(client);
+
+    const spaces = await spaceRepository.list({
+      partialName: "", // throws error if no string is present...
+      take: 1000
+    });
+
+    return spaces.Items;
+  };
+
+  const getOctopusDeployScopeValues = async ({
+    actorId,
+    actor,
+    actorOrgId,
+    actorAuthMethod,
+    id,
+    scope,
+    spaceId,
+    resourceId
+  }: TIntegrationAuthOctopusDeployProjectScopeValuesDTO) => {
+    const integrationAuth = await integrationAuthDAL.findById(id);
+    if (!integrationAuth) throw new NotFoundError({ message: `Integration auth with ID '${id}' not found` });
+
+    const { permission } = await permissionService.getProjectPermission(
+      actor,
+      actorId,
+      integrationAuth.projectId,
+      actorAuthMethod,
+      actorOrgId
+    );
+    ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
+
+    const { shouldUseSecretV2Bridge, botKey } = await projectBotService.getBotKey(integrationAuth.projectId);
+    const { accessToken } = await getIntegrationAccessToken(integrationAuth, shouldUseSecretV2Bridge, botKey);
+
+    let url: string;
+    switch (scope) {
+      case OctopusDeployScope.Project:
+        url = `${integrationAuth.url}/api/${spaceId}/projects/${resourceId}/variables`;
+        break;
+      // future support tenant, variable set etc.
+      default:
+        throw new InternalServerError({ message: `Unhandled Octopus Deploy scope` });
+    }
+
+    // SDK doesn't support variable set...
+    const { data: variableSet } = await request.get<TOctopusDeployVariableSet>(url, {
+      headers: {
+        "X-NuGet-ApiKey": accessToken,
+        Accept: "application/json"
+      }
+    });
+
+    return variableSet.ScopeValues;
+  };
+
   return {
     listIntegrationAuthByProjectId,
     listOrgIntegrationAuth,
@@ -1552,6 +1639,8 @@ export const integrationAuthServiceFactory = ({
     getBitbucketWorkspaces,
     getBitbucketEnvironments,
     getIntegrationAccessToken,
-    duplicateIntegrationAuth
+    duplicateIntegrationAuth,
+    getOctopusDeploySpaces,
+    getOctopusDeployScopeValues
   };
 };

View File

@@ -193,3 +193,72 @@ export type TIntegrationsWithEnvironment = TIntegrations & {
     | null
     | undefined;
 };
+
+export type TIntegrationAuthOctopusDeploySpacesDTO = {
+  id: string;
+} & Omit<TProjectPermission, "projectId">;
+
+export type TIntegrationAuthOctopusDeployProjectScopeValuesDTO = {
+  id: string;
+  spaceId: string;
+  resourceId: string;
+  scope: OctopusDeployScope;
+} & Omit<TProjectPermission, "projectId">;
+
+export enum OctopusDeployScope {
+  Project = "project"
+  // add tenant, variable set, etc.
+}
+
+export type TOctopusDeployVariableSet = {
+  Id: string;
+  OwnerId: string;
+  Version: number;
+  Variables: {
+    Id: string;
+    Name: string;
+    Value: string;
+    Description: string;
+    Scope: {
+      Environment?: string[];
+      Machine?: string[];
+      Role?: string[];
+      TargetRole?: string[];
+      Action?: string[];
+      User?: string[];
+      Trigger?: string[];
+      ParentDeployment?: string[];
+      Private?: string[];
+      Channel?: string[];
+      TenantTag?: string[];
+      Tenant?: string[];
+      ProcessOwner?: string[];
+    };
+    IsEditable: boolean;
+    Prompt: {
+      Description: string;
+      DisplaySettings: Record<string, string>;
+      Label: string;
+      Required: boolean;
+    } | null;
+    Type: "String";
+    IsSensitive: boolean;
+  }[];
+  ScopeValues: {
+    Environments: { Id: string; Name: string }[];
+    Machines: { Id: string; Name: string }[];
+    Actions: { Id: string; Name: string }[];
+    Roles: { Id: string; Name: string }[];
+    Channels: { Id: string; Name: string }[];
+    TenantTags: { Id: string; Name: string }[];
+    Processes: {
+      ProcessType: string;
+      Id: string;
+      Name: string;
+    }[];
+  };
+  SpaceId: string;
+  Links: {
+    Self: string;
+  };
+};

View File

@@ -34,7 +34,8 @@ export enum Integrations {
   HASURA_CLOUD = "hasura-cloud",
   RUNDECK = "rundeck",
   AZURE_DEVOPS = "azure-devops",
-  AZURE_APP_CONFIGURATION = "azure-app-configuration"
+  AZURE_APP_CONFIGURATION = "azure-app-configuration",
+  OCTOPUS_DEPLOY = "octopus-deploy"
 }

 export enum IntegrationType {
@@ -413,6 +414,15 @@ export const getIntegrationOptions = async () => {
       type: "pat",
       clientId: "",
       docsLink: ""
+    },
+    {
+      name: "Octopus Deploy",
+      slug: "octopus-deploy",
+      image: "Octopus Deploy.png",
+      isAvailable: true,
+      type: "sat",
+      clientId: "",
+      docsLink: ""
     }
   ];

View File

@@ -32,14 +32,14 @@ import { z } from "zod";
 import { SecretType, TIntegrationAuths, TIntegrations } from "@app/db/schemas";
 import { getConfig } from "@app/lib/config/env";
 import { request } from "@app/lib/config/request";
-import { BadRequestError } from "@app/lib/errors";
+import { BadRequestError, InternalServerError } from "@app/lib/errors";
 import { logger } from "@app/lib/logger";
 import { TCreateManySecretsRawFn, TUpdateManySecretsRawFn } from "@app/services/secret/secret-types";

 import { TIntegrationDALFactory } from "../integration/integration-dal";
 import { IntegrationMetadataSchema } from "../integration/integration-schema";
 import { IntegrationAuthMetadataSchema } from "./integration-auth-schema";
-import { TIntegrationsWithEnvironment } from "./integration-auth-types";
+import { OctopusDeployScope, TIntegrationsWithEnvironment, TOctopusDeployVariableSet } from "./integration-auth-types";
 import {
   IntegrationInitialSyncBehavior,
   IntegrationMappingBehavior,
@@ -473,7 +473,7 @@ const syncSecretsAzureKeyVault = async ({
     id: string; // secret URI
     value: string;
     attributes: {
-      enabled: true;
+      enabled: boolean;
       created: number;
       updated: number;
       recoveryLevel: string;
@@ -509,10 +509,19 @@ const syncSecretsAzureKeyVault = async ({
   const getAzureKeyVaultSecrets = await paginateAzureKeyVaultSecrets(`${integration.app}/secrets?api-version=7.3`);

+  const enabledAzureKeyVaultSecrets = getAzureKeyVaultSecrets.filter((secret) => secret.attributes.enabled);
+
+  // disabled keys to skip sending updates to
+  const disabledAzureKeyVaultSecretKeys = getAzureKeyVaultSecrets
+    .filter(({ attributes }) => !attributes.enabled)
+    .map((getAzureKeyVaultSecret) => {
+      return getAzureKeyVaultSecret.id.substring(getAzureKeyVaultSecret.id.lastIndexOf("/") + 1);
+    });
+
   let lastSlashIndex: number;
   const res = (
     await Promise.all(
-      getAzureKeyVaultSecrets.map(async (getAzureKeyVaultSecret) => {
+      enabledAzureKeyVaultSecrets.map(async (getAzureKeyVaultSecret) => {
         if (!lastSlashIndex) {
           lastSlashIndex = getAzureKeyVaultSecret.id.lastIndexOf("/");
         }
@@ -658,6 +667,7 @@ const syncSecretsAzureKeyVault = async ({
   }) => {
     let isSecretSet = false;
     let maxTries = 6;
+    if (disabledAzureKeyVaultSecretKeys.includes(key)) return;

     while (!isSecretSet && maxTries > 0) {
       // try to set secret
@@ -3075,7 +3085,7 @@ const syncSecretsTerraformCloud = async ({
 }) => {
   // get secrets from Terraform Cloud
   const terraformSecrets = (
-    await request.get<{ data: { attributes: { key: string; value: string }; id: string }[] }>(
+    await request.get<{ data: { attributes: { key: string; value: string; sensitive: boolean }; id: string }[] }>(
       `${IntegrationUrls.TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars`,
       {
         headers: {
@@ -3089,7 +3099,7 @@ const syncSecretsTerraformCloud = async ({
       ...obj,
       [secret.attributes.key]: secret
     }),
-    {} as Record<string, { attributes: { key: string; value: string }; id: string }>
+    {} as Record<string, { attributes: { key: string; value: string; sensitive: boolean }; id: string }>
   );

   const secretsToAdd: { [key: string]: string } = {};
@@ -3170,7 +3180,8 @@ const syncSecretsTerraformCloud = async ({
           attributes: {
             key,
             value: secrets[key]?.value,
-            category: integration.targetService
+            category: integration.targetService,
+            sensitive: true
           }
         }
       },
@@ -3183,7 +3194,11 @@ const syncSecretsTerraformCloud = async ({
       }
     );
     // case: secret exists in Terraform Cloud
-  } else if (secrets[key]?.value !== terraformSecrets[key].attributes.value) {
+  } else if (
+    // we now set secrets to sensitive in Terraform Cloud, this checks if existing secrets are not sensitive and updates them accordingly
+    !terraformSecrets[key].attributes.sensitive ||
+    secrets[key]?.value !== terraformSecrets[key].attributes.value
+  ) {
     // -> update secret
     await request.patch(
       `${IntegrationUrls.TERRAFORM_CLOUD_API_URL}/api/v2/workspaces/${integration.appId}/vars/${terraformSecrets[key].id}`,
@@ -3193,7 +3208,8 @@ const syncSecretsTerraformCloud = async ({
           id: terraformSecrets[key].id,
           attributes: {
             ...terraformSecrets[key],
-            value: secrets[key]?.value
+            value: secrets[key]?.value,
+            sensitive: true
           }
         }
       },
@@ -4195,6 +4211,61 @@ const syncSecretsRundeck = async ({
   }
 };

+const syncSecretsOctopusDeploy = async ({
+  integration,
+  integrationAuth,
+  secrets,
+  accessToken
+}: {
+  integration: TIntegrations;
+  integrationAuth: TIntegrationAuths;
+  secrets: Record<string, { value: string; comment?: string }>;
+  accessToken: string;
+}) => {
+  let url: string;
+  switch (integration.scope) {
+    case OctopusDeployScope.Project:
+      url = `${integrationAuth.url}/api/${integration.targetEnvironmentId}/projects/${integration.appId}/variables`;
+      break;
+    // future support tenant, variable set, etc.
+    default:
+      throw new InternalServerError({ message: `Unhandled Octopus Deploy scope: ${integration.scope}` });
+  }
+
+  // SDK doesn't support variable set...
+  const { data: variableSet } = await request.get<TOctopusDeployVariableSet>(url, {
+    headers: {
+      "X-NuGet-ApiKey": accessToken,
+      Accept: "application/json"
+    }
+  });
+
+  await request.put(
+    url,
+    {
+      ...variableSet,
+      Variables: Object.entries(secrets).map(([key, value]) => ({
+        Name: key,
+        Value: value.value,
+        Description: value.comment ?? "",
+        Scope:
+          (integration.metadata as { octopusDeployScopeValues: TOctopusDeployVariableSet["ScopeValues"] })
+            ?.octopusDeployScopeValues ?? {},
+        IsEditable: false,
+        Prompt: null,
+        Type: "String",
+        IsSensitive: true
+      }))
+    } as unknown as TOctopusDeployVariableSet,
+    {
+      headers: {
+        "X-NuGet-ApiKey": accessToken,
+        Accept: "application/json"
+      }
+    }
+  );
+};
+
 /**
  * Sync/push [secrets] to [app] in integration named [integration]
  *
@@ -4507,6 +4578,14 @@ export const syncIntegrationSecrets = async ({
         accessToken
       });
       break;
+    case Integrations.OCTOPUS_DEPLOY:
+      await syncSecretsOctopusDeploy({
+        integration,
+        integrationAuth,
+        secrets,
+        accessToken
+      });
+      break;
     default:
       throw new BadRequestError({ message: "Invalid integration" });
   }
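For orientation, the shape of each variable the Octopus sync writes, with illustrative values; the Scope block comes from integration.metadata.octopusDeployScopeValues when configured, and defaults to an empty object:

// {
//   Name: "DB_PASSWORD",
//   Value: "<secret value>",
//   Description: "",
//   Scope: { Environment: ["Environments-1"] },
//   IsEditable: false,
//   Prompt: null,
//   Type: "String",
//   IsSensitive: true
// }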

View File

@@ -46,5 +46,18 @@ export const IntegrationMetadataSchema = z.object({
  shouldDisableDelete: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldDisableDelete),
  shouldEnableDelete: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldEnableDelete),
  shouldMaskSecrets: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldMaskSecrets),
-  shouldProtectSecrets: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldProtectSecrets)
+  shouldProtectSecrets: z.boolean().optional().describe(INTEGRATION.CREATE.metadata.shouldProtectSecrets),
+  octopusDeployScopeValues: z
+    .object({
+      // in Octopus Deploy Scope Value Format
+      Environment: z.string().array().optional(),
+      Action: z.string().array().optional(),
+      Channel: z.string().array().optional(),
+      Machine: z.string().array().optional(),
+      ProcessOwner: z.string().array().optional(),
+      Role: z.string().array().optional()
+    })
+    .optional()
+    .describe(INTEGRATION.CREATE.metadata.octopusDeployScopeValues)
});
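For reference, the nested optional object validates like this (a trimmed-down version of the schema, without the `.describe()` metadata):

import { z } from "zod";

const ScopeValues = z
  .object({
    Environment: z.string().array().optional(),
    Role: z.string().array().optional()
  })
  .optional();

ScopeValues.parse(undefined); // ok: the whole object is optional
ScopeValues.parse({ Environment: ["Environments-1"] }); // ok: each key is independently optional
// ScopeValues.parse({ Environment: "Environments-1" }) would throw: string, not string[]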

View File

@@ -9,6 +9,7 @@ import { TIntegrationAuthDALFactory } from "../integration-auth/integration-auth
import { TIntegrationAuthServiceFactory } from "../integration-auth/integration-auth-service";
import { deleteIntegrationSecrets } from "../integration-auth/integration-delete-secret";
import { TKmsServiceFactory } from "../kms/kms-service";
+import { KmsDataKey } from "../kms/kms-types";
import { TProjectBotServiceFactory } from "../project-bot/project-bot-service";
import { TSecretDALFactory } from "../secret/secret-dal";
import { TSecretQueueFactory } from "../secret/secret-queue";
@@ -237,6 +238,46 @@ export const integrationServiceFactory = ({
    return { ...integration, envId: integration.environment.id };
  };

+  const getIntegrationAWSIamRole = async ({ id, actor, actorAuthMethod, actorId, actorOrgId }: TGetIntegrationDTO) => {
+    const integration = await integrationDAL.findById(id);
+    if (!integration) {
+      throw new NotFoundError({
+        message: `Integration with ID '${id}' not found`
+      });
+    }
+
+    const { permission } = await permissionService.getProjectPermission(
+      actor,
+      actorId,
+      integration?.projectId || "",
+      actorAuthMethod,
+      actorOrgId
+    );
+    ForbiddenError.from(permission).throwUnlessCan(ProjectPermissionActions.Read, ProjectPermissionSub.Integrations);
+
+    const integrationAuth = await integrationAuthDAL.findById(integration.integrationAuthId);
+
+    const { decryptor: secretManagerDecryptor } = await kmsService.createCipherPairWithDataKey({
+      type: KmsDataKey.SecretManager,
+      projectId: integration.projectId
+    });
+
+    let awsIamRole: string | null = null;
+    if (integrationAuth.encryptedAwsAssumeIamRoleArn) {
+      const awsAssumeRoleArn = secretManagerDecryptor({
+        cipherTextBlob: Buffer.from(integrationAuth.encryptedAwsAssumeIamRoleArn)
+      }).toString();
+      if (awsAssumeRoleArn) {
+        const [, role] = awsAssumeRoleArn.split(":role/");
+        awsIamRole = role;
+      }
+    }
+
+    return {
+      role: awsIamRole
+    };
+  };
+
  const deleteIntegration = async ({
    actorId,
    id,
@@ -329,6 +370,7 @@ export const integrationServiceFactory = ({
    deleteIntegration,
    listIntegrationByProject,
    getIntegration,
+    getIntegrationAWSIamRole,
    syncIntegration
  };
};
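`getIntegrationAWSIamRole` recovers the role name from the decrypted assume-role ARN by splitting on `:role/`. A quick illustration of that parse:

const arn = "arn:aws:iam::123456789012:role/my-app-role"; // sample ARN
const [, role] = arn.split(":role/");
console.log(role); // "my-app-role" (undefined if the ARN has no :role/ segment, hence the null fallback above)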

View File

@@ -280,7 +280,12 @@ export const projectMembershipServiceFactory = ({
    // validate custom roles input
    const customInputRoles = roles.filter(
-      ({ role }) => !Object.values(ProjectMembershipRole).includes(role as ProjectMembershipRole)
+      ({ role }) =>
+        !Object.values(ProjectMembershipRole)
+          // we don't want to include custom in this check;
+          // this unintentionally enables setting slug to custom which is reserved
+          .filter((r) => r !== ProjectMembershipRole.Custom)
+          .includes(role as ProjectMembershipRole)
    );
    const hasCustomRole = Boolean(customInputRoles.length);
    if (hasCustomRole) {
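The extra `.filter()` exists because the enum itself contains the reserved `custom` slug; without it, a role literally named "custom" passes the built-in check and skips custom-role validation entirely. A minimal illustration with a trimmed-down enum:

enum ProjectMembershipRole {
  Admin = "admin",
  Member = "member",
  Custom = "custom"
}

const builtIns = Object.values(ProjectMembershipRole);
console.log(!builtIns.includes("custom" as ProjectMembershipRole)); // false -> "custom" slips through as built-in

const builtInsFixed = builtIns.filter((r) => r !== ProjectMembershipRole.Custom);
console.log(!builtInsFixed.includes("custom" as ProjectMembershipRole)); // true -> routed to custom-role validation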

View File

@@ -191,6 +191,10 @@ export const projectDALFactory = (db: TDbClient) => {
      return project;
    } catch (error) {
+      if (error instanceof NotFoundError) {
+        throw error;
+      }
      throw new DatabaseError({ error, name: "Find all projects" });
    }
  };
@@ -240,6 +244,10 @@ export const projectDALFactory = (db: TDbClient) => {
      return project;
    } catch (error) {
+      if (error instanceof NotFoundError || error instanceof UnauthorizedError) {
+        throw error;
+      }
      throw new DatabaseError({ error, name: "Find project by slug" });
    }
  };
@@ -260,7 +268,7 @@ export const projectDALFactory = (db: TDbClient) => {
      }
      throw new BadRequestError({ message: "Invalid filter type" });
    } catch (error) {
-      if (error instanceof BadRequestError) {
+      if (error instanceof BadRequestError || error instanceof NotFoundError || error instanceof UnauthorizedError) {
        throw error;
      }
      throw new DatabaseError({ error, name: `Failed to find project by ${filter.type}` });
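These hunks all apply the same pattern: known domain errors are rethrown untouched so they keep their HTTP semantics, and only unexpected failures get wrapped as `DatabaseError`. Schematically (simplified error classes, not the project's actual ones):

class NotFoundError extends Error {}
class DatabaseError extends Error {
  constructor(readonly inner: unknown, message: string) {
    super(message);
  }
}

const findProject = async (id: string) => {
  try {
    const project = null; // stand-in for the actual query
    if (!project) throw new NotFoundError(`Project ${id} not found`);
    return project;
  } catch (error) {
    if (error instanceof NotFoundError) throw error; // keep the 404, don't mask it as a 500
    throw new DatabaseError(error, "Find project by id");
  }
};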

View File

@@ -285,11 +285,14 @@ export const projectQueueFactory = ({
      if (!orgMembership) {
        // This can happen. Since we don't remove project memberships and project keys when a user is removed from an org, this is a valid case.
-        logger.info("User is not in organization", {
-          userId: key.receiverId,
-          orgId: project.orgId,
-          projectId: project.id
-        });
+        logger.info(
+          {
+            userId: key.receiverId,
+            orgId: project.orgId,
+            projectId: project.id
+          },
+          "User is not in organization"
+        );
        // eslint-disable-next-line no-continue
        continue;
      }
@@ -551,10 +554,10 @@ export const projectQueueFactory = ({
        .catch(() => [null]);
      if (!project) {
-        logger.error("Failed to upgrade project, because no project was found", data);
+        logger.error(data, "Failed to upgrade project, because no project was found");
      } else {
        await projectDAL.setProjectUpgradeStatus(data.projectId, ProjectUpgradeStatus.Failed);
-        logger.error("Failed to upgrade project", err, {
+        logger.error(err, "Failed to upgrade project", {
          extra: {
            project,
            jobData: data
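The logger hunks here (and in the webhook file below) swap the argument order to object-first, which matches pino's `logger.info(mergingObject, message)` signature; message-first calls silently drop the context object, since trailing arguments are only consumed for printf-style interpolation. Assuming pino, which the new call order suggests:

import pino from "pino";

const logger = pino();

logger.info({ projectId: "proj_123" }, "Project upgraded"); // projectId appears as a field on the log line
logger.info("Project upgraded", { projectId: "proj_123" }); // object ignored: no %o/%j placeholder in the message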

View File

@@ -149,6 +149,7 @@ export const projectServiceFactory = ({
    actorOrgId,
    actorAuthMethod,
    workspaceName,
+    workspaceDescription,
    slug: projectSlug,
    kmsKeyId,
    tx: trx,
@@ -206,6 +207,7 @@ export const projectServiceFactory = ({
    const project = await projectDAL.create(
      {
        name: workspaceName,
+        description: workspaceDescription,
        orgId: organization.id,
        slug: projectSlug || slugify(`${workspaceName}-${alphaNumericNanoId(4)}`),
        kmsSecretManagerKeyId: kmsKeyId,
@@ -496,6 +498,7 @@ export const projectServiceFactory = ({
    const updatedProject = await projectDAL.updateById(project.id, {
      name: update.name,
+      description: update.description,
      autoCapitalization: update.autoCapitalization
    });
    return updatedProject;

View File

@@ -29,6 +29,7 @@ export type TCreateProjectDTO = {
  actorId: string;
  actorOrgId?: string;
  workspaceName: string;
+  workspaceDescription?: string;
  slug?: string;
  kmsKeyId?: string;
  createDefaultEnvs?: boolean;
@@ -69,6 +70,7 @@ export type TUpdateProjectDTO = {
  filter: Filter;
  update: {
    name?: string;
+    description?: string;
    autoCapitalization?: boolean;
  };
} & Omit<TProjectPermission, "projectId">;

View File

@@ -414,12 +414,13 @@ export const secretV2BridgeServiceFactory = ({
      type: KmsDataKey.SecretManager,
      projectId
    });
-    const encryptedValue = secretValue
-      ? {
-          encryptedValue: secretManagerEncryptor({ plainText: Buffer.from(secretValue) }).cipherTextBlob,
-          references: getAllSecretReferences(secretValue).nestedReferences
-        }
-      : {};
+    const encryptedValue =
+      typeof secretValue === "string"
+        ? {
+            encryptedValue: secretManagerEncryptor({ plainText: Buffer.from(secretValue) }).cipherTextBlob,
+            references: getAllSecretReferences(secretValue).nestedReferences
+          }
+        : {};

    if (secretValue) {
      const { nestedReferences, localReferences } = getAllSecretReferences(secretValue);
@@ -1165,7 +1166,7 @@ export const secretV2BridgeServiceFactory = ({
    const newSecrets = await secretDAL.transaction(async (tx) =>
      fnSecretBulkInsert({
        inputSecrets: inputSecrets.map((el) => {
-          const references = secretReferencesGroupByInputSecretKey[el.secretKey].nestedReferences;
+          const references = secretReferencesGroupByInputSecretKey[el.secretKey]?.nestedReferences;

          return {
            version: 1,
@@ -1372,7 +1373,7 @@ export const secretV2BridgeServiceFactory = ({
          typeof el.secretValue !== "undefined"
            ? {
                encryptedValue: secretManagerEncryptor({ plainText: Buffer.from(el.secretValue) }).cipherTextBlob,
-                references: secretReferencesGroupByInputSecretKey[el.secretKey].nestedReferences
+                references: secretReferencesGroupByInputSecretKey[el.secretKey]?.nestedReferences
              }
            : {};
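The switch from a truthy check to `typeof secretValue === "string"` matters for empty strings: `""` is falsy, so the old guard skipped encryption for a legitimately empty secret value. The added optional chaining similarly guards lookups for keys that produced no parsed references. In miniature:

const encryptIfPresent = (secretValue?: string) =>
  secretValue ? "encrypted" : "skipped"; // old guard
const encryptIfString = (secretValue?: string) =>
  typeof secretValue === "string" ? "encrypted" : "skipped"; // new guard

console.log(encryptIfPresent("")); // "skipped" -> empty value silently not stored
console.log(encryptIfString("")); // "encrypted" -> empty string is still a value
console.log(encryptIfString(undefined)); // "skipped"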

View File

@@ -77,5 +77,21 @@ export const smtpServiceFactory = (cfg: TSmtpConfig) => {
    }
  };

-  return { sendMail };
+  const verify = async () => {
+    const isConnected = smtp
+      .verify()
+      .then(async () => {
+        logger.info("SMTP connected");
+        return true;
+      })
+      .catch((err: Error) => {
+        logger.error("SMTP error");
+        logger.error(err);
+        return false;
+      });
+
+    return isConnected;
+  };
+
+  return { sendMail, verify };
};
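The new `verify` method exposes nodemailer's connection check, which performs the SMTP handshake without sending a message; useful as a boot-time health probe. A minimal standalone sketch, assuming a standard nodemailer transport (host and credentials are placeholders):

import nodemailer from "nodemailer";

const transporter = nodemailer.createTransport({
  host: "smtp.example.com", // placeholder
  port: 587,
  auth: { user: "user", pass: process.env.SMTP_PASSWORD }
});

const verify = async (): Promise<boolean> => {
  try {
    await transporter.verify(); // resolves after a successful handshake
    console.log("SMTP connected");
    return true;
  } catch (err) {
    console.error("SMTP error", err);
    return false;
  }
};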

View File

@@ -142,7 +142,7 @@ export const fnTriggerWebhook = async ({
      !isDisabled && picomatch.isMatch(secretPath, hookSecretPath, { strictSlashes: false })
  );
  if (!toBeTriggeredHooks.length) return;
-  logger.info("Secret webhook job started", { environment, secretPath, projectId });
+  logger.info({ environment, secretPath, projectId }, "Secret webhook job started");
  const project = await projectDAL.findById(projectId);
  const webhooksTriggered = await Promise.allSettled(
    toBeTriggeredHooks.map((hook) =>
@@ -195,5 +195,5 @@ export const fnTriggerWebhook = async ({
    );
  }
  });
-  logger.info("Secret webhook job ended", { environment, secretPath, projectId });
+  logger.info({ environment, secretPath, projectId }, "Secret webhook job ended");
};

View File

@@ -111,7 +111,7 @@ var exportCmd = &cobra.Command{
      accessToken = token.Token
    } else {
      log.Debug().Msg("GetAllEnvironmentVariables: Trying to fetch secrets using logged in details")
-      loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
+      loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
      if err != nil {
        util.HandleError(err)
      }

View File

@@ -41,7 +41,7 @@ var initCmd = &cobra.Command{
      }
    }

-    userCreds, err := util.GetCurrentLoggedInUserDetails()
+    userCreds, err := util.GetCurrentLoggedInUserDetails(true)
    if err != nil {
      util.HandleError(err, "Unable to get your login details")
    }

View File

@@ -154,6 +154,8 @@ var loginCmd = &cobra.Command{
  DisableFlagsInUseLine: true,
  Run: func(cmd *cobra.Command, args []string) {
+    presetDomain := config.INFISICAL_URL
+
    clearSelfHostedDomains, err := cmd.Flags().GetBool("clear-domains")
    if err != nil {
      util.HandleError(err)
@@ -198,7 +200,7 @@ var loginCmd = &cobra.Command{
    // standalone user auth
    if loginMethod == "user" {
-      currentLoggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
+      currentLoggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
      // if the key can't be found or there is an error getting current credentials from key ring, allow them to override
      if err != nil && (strings.Contains(err.Error(), "we couldn't find your logged in details")) {
        log.Debug().Err(err)
@@ -216,11 +218,19 @@ var loginCmd = &cobra.Command{
          return
        }
      }

+      usePresetDomain, err := usePresetDomain(presetDomain)
+      if err != nil {
+        util.HandleError(err)
+      }
+
      //override domain
      domainQuery := true
      if config.INFISICAL_URL_MANUAL_OVERRIDE != "" &&
        config.INFISICAL_URL_MANUAL_OVERRIDE != fmt.Sprintf("%s/api", util.INFISICAL_DEFAULT_EU_URL) &&
-        config.INFISICAL_URL_MANUAL_OVERRIDE != fmt.Sprintf("%s/api", util.INFISICAL_DEFAULT_US_URL) {
+        config.INFISICAL_URL_MANUAL_OVERRIDE != fmt.Sprintf("%s/api", util.INFISICAL_DEFAULT_US_URL) &&
+        !usePresetDomain {
        overrideDomain, err := DomainOverridePrompt()
        if err != nil {
          util.HandleError(err)
@@ -228,7 +238,7 @@ var loginCmd = &cobra.Command{
        //if not override set INFISICAL_URL to exported var
        //set domainQuery to false
-        if !overrideDomain {
+        if !overrideDomain && !usePresetDomain {
          domainQuery = false
          config.INFISICAL_URL = util.AppendAPIEndpoint(config.INFISICAL_URL_MANUAL_OVERRIDE)
          config.INFISICAL_LOGIN_URL = fmt.Sprintf("%s/login", strings.TrimSuffix(config.INFISICAL_URL, "/api"))
@@ -237,7 +247,7 @@ var loginCmd = &cobra.Command{
      }

      //prompt user to select domain between Infisical cloud and self-hosting
-      if domainQuery {
+      if domainQuery && !usePresetDomain {
        err = askForDomain()
        if err != nil {
          util.HandleError(err, "Unable to parse domain url")
@@ -526,6 +536,45 @@ func DomainOverridePrompt() (bool, error) {
  return selectedOption == OVERRIDE, err
}

+func usePresetDomain(presetDomain string) (bool, error) {
+  infisicalConfig, err := util.GetConfigFile()
+  if err != nil {
+    return false, fmt.Errorf("usePresetDomain: unable to get config file because [err=%s]", err)
+  }
+
+  preconfiguredUrl := strings.TrimSuffix(presetDomain, "/api")
+
+  if preconfiguredUrl != "" && preconfiguredUrl != util.INFISICAL_DEFAULT_US_URL && preconfiguredUrl != util.INFISICAL_DEFAULT_EU_URL {
+    parsedDomain := strings.TrimSuffix(strings.Trim(preconfiguredUrl, "/"), "/api")
+
+    _, err := url.ParseRequestURI(parsedDomain)
+    if err != nil {
+      return false, fmt.Errorf("invalid domain URL: '%s'", parsedDomain)
+    }
+
+    config.INFISICAL_URL = fmt.Sprintf("%s/api", parsedDomain)
+    config.INFISICAL_LOGIN_URL = fmt.Sprintf("%s/login", parsedDomain)
+
+    if !slices.Contains(infisicalConfig.Domains, parsedDomain) {
+      infisicalConfig.Domains = append(infisicalConfig.Domains, parsedDomain)
+      err = util.WriteConfigFile(&infisicalConfig)
+      if err != nil {
+        return false, fmt.Errorf("usePresetDomain: unable to write domains to config file because [err=%s]", err)
+      }
+    }
+
+    green := color.New(color.FgGreen)
+    boldGreen := green.Add(color.Bold)
+    time.Sleep(time.Second * 1)
+    boldGreen.Printf("[INFO] Using domain '%s' from domain flag or INFISICAL_API_URL environment variable\n", parsedDomain)
+
+    return true, nil
+  }
+
+  return false, nil
+}
+
func askForDomain() error {
  // query user to choose between Infisical cloud or self-hosting
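Taken together, the login changes let a preset domain (from a flag or the INFISICAL_API_URL environment variable) short-circuit both the manual-override prompt and the cloud/self-hosted picker. A sketch of that precedence in TypeScript for illustration (the CLI itself is Go; these names are illustrative, not the CLI's):

const resolveLoginDomain = (preset: string, manualOverride: string, defaults: string[]): string => {
  const trimmed = preset.replace(/\/+$/, "").replace(/\/api$/, "");
  // 1. a non-default preset wins outright and suppresses all prompts
  if (trimmed && !defaults.includes(trimmed)) return `${trimmed}/api`;
  // 2. otherwise an exported manual override is offered for confirmation
  if (manualOverride && !defaults.some((d) => manualOverride === `${d}/api`)) return manualOverride;
  // 3. otherwise fall back to the interactive cloud/self-hosted prompt
  return "prompt";
};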

View File

@@ -54,7 +54,7 @@ func init() {
      util.CheckForUpdate()
    }

-    loggedInDetails, err := util.GetCurrentLoggedInUserDetails()
+    loggedInDetails, err := util.GetCurrentLoggedInUserDetails(false)
    if !silent && err == nil && loggedInDetails.IsUserLoggedIn && !loggedInDetails.LoginExpired {
      token, err := util.GetInfisicalToken(cmd)

View File

@@ -194,7 +194,7 @@ var secretsSetCmd = &cobra.Command{
      projectId = workspaceFile.WorkspaceId
    }

-    loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
    if err != nil {
      util.HandleError(err, "unable to authenticate [err=%v]")
    }
@@ -278,7 +278,7 @@ var secretsDeleteCmd = &cobra.Command{
    util.RequireLogin()
    util.RequireLocalWorkspaceFile()

-    loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
    if err != nil {
      util.HandleError(err, "Unable to authenticate")
    }

View File

@@ -41,7 +41,7 @@ var tokensCreateCmd = &cobra.Command{
  },
  Run: func(cmd *cobra.Command, args []string) {
    // get plain text workspace key
-    loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := util.GetCurrentLoggedInUserDetails(true)
    if err != nil {
      util.HandleError(err, "Unable to retrieve your logged in your details. Please login in then try again")

View File

@@ -55,7 +55,7 @@ func GetUserCredsFromKeyRing(userEmail string) (credentials models.UserCredentia
  return userCredentials, err
}

-func GetCurrentLoggedInUserDetails() (LoggedInUserDetails, error) {
+func GetCurrentLoggedInUserDetails(setConfigVariables bool) (LoggedInUserDetails, error) {
  if ConfigFileExists() {
    configFile, err := GetConfigFile()
    if err != nil {
@@ -75,18 +75,20 @@ func GetCurrentLoggedInUserDetails() (LoggedInUserDetails, error) {
      }
    }

+    if setConfigVariables {
+      config.INFISICAL_URL_MANUAL_OVERRIDE = config.INFISICAL_URL
+      //configFile.LoggedInUserDomain
+      //if not empty set as infisical url
+      if configFile.LoggedInUserDomain != "" {
+        config.INFISICAL_URL = AppendAPIEndpoint(configFile.LoggedInUserDomain)
+      }
+    }
+
    // check to see if the JWT is still valid
    httpClient := resty.New().
      SetAuthToken(userCreds.JTWToken).
      SetHeader("Accept", "application/json")

-    config.INFISICAL_URL_MANUAL_OVERRIDE = config.INFISICAL_URL
-    //configFile.LoggedInUserDomain
-    //if not empty set as infisical url
-    if configFile.LoggedInUserDomain != "" {
-      config.INFISICAL_URL = AppendAPIEndpoint(configFile.LoggedInUserDomain)
-    }
-
    isAuthenticated := api.CallIsAuthenticated(httpClient)

    // TODO: add refresh token
    // if !isAuthenticated {

View File

@@ -20,7 +20,7 @@ func GetAllFolders(params models.GetAllFoldersParameters) ([]models.SingleFolder
    log.Debug().Msg("GetAllFolders: Trying to fetch folders using logged in details")

-    loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
    if err != nil {
      return nil, err
    }
@@ -177,7 +177,7 @@ func CreateFolder(params models.CreateFolderParameters) (models.SingleFolder, er
  if params.InfisicalToken == "" {
    RequireLogin()
    RequireLocalWorkspaceFile()

-    loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
    if err != nil {
      return models.SingleFolder{}, err
@@ -224,7 +224,7 @@ func DeleteFolder(params models.DeleteFolderParameters) ([]models.SingleFolder,
    RequireLogin()
    RequireLocalWorkspaceFile()

-    loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
    if err != nil {
      return nil, err

View File

@@ -246,7 +246,7 @@ func GetAllEnvironmentVariables(params models.GetAllSecretsParameters, projectCo
    log.Debug().Msg("GetAllEnvironmentVariables: Trying to fetch secrets using logged in details")

-    loggedInUserDetails, err := GetCurrentLoggedInUserDetails()
+    loggedInUserDetails, err := GetCurrentLoggedInUserDetails(true)
    isConnected := ValidateInfisicalAPIConnection()

    if isConnected {

View File

@@ -3,6 +3,3 @@ title: "Bulk Create"
openapi: "POST /api/v3/secrets/batch/raw" openapi: "POST /api/v3/secrets/batch/raw"
--- ---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -3,6 +3,3 @@ title: "Create"
openapi: "POST /api/v3/secrets/raw/{secretName}" openapi: "POST /api/v3/secrets/raw/{secretName}"
--- ---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -3,6 +3,3 @@ title: "Bulk Delete"
openapi: "DELETE /api/v3/secrets/batch/raw" openapi: "DELETE /api/v3/secrets/batch/raw"
--- ---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -3,6 +3,3 @@ title: "Delete"
openapi: "DELETE /api/v3/secrets/raw/{secretName}" openapi: "DELETE /api/v3/secrets/raw/{secretName}"
--- ---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -2,7 +2,3 @@
title: "List"
openapi: "GET /api/v3/secrets/raw"
---

-<Tip>
-  This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
-</Tip>

View File

@@ -3,6 +3,3 @@ title: "Retrieve"
openapi: "GET /api/v3/secrets/raw/{secretName}" openapi: "GET /api/v3/secrets/raw/{secretName}"
--- ---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -3,6 +3,3 @@ title: "Bulk Update"
openapi: "PATCH /api/v3/secrets/batch/raw" openapi: "PATCH /api/v3/secrets/batch/raw"
--- ---
<Tip>
This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
</Tip>

View File

@@ -2,7 +2,3 @@
title: "Update"
openapi: "PATCH /api/v3/secrets/raw/{secretName}"
---

-<Tip>
-  This endpoint requires you to disable end-to-end encryption. For more information, you should consult this [note](https://infisical.com/docs/api-reference/overview/examples/note).
-</Tip>

Some files were not shown because too many files have changed in this diff.