Introduction | Integrates | Fluid Attacks Help


Integrates is the product responsible for Fluid Attacks' platform and its API.

Public Oath

  1. The platform is accessible at
  2. Significant changes to the user interface of the platform will be announced via the appropriate communication mechanism.
  3. The API is accessible at
  4. A six-month notice period will be given for backward incompatible changes in the API. This includes but is not limited to: deprecating attributes and entities, making optional arguments mandatory, changes in the authentication or authorization system, and so on.
  5. The Forces container is accessible at DockerHub.
  6. The Retrieves extension is accessible at Visual Studio Marketplace.


Integrates architecture

  1. Integrates is a standard client-server application divided into:
    • A back-end
    • A front-end
    • Retrieves, a Visual Studio Code extension
    • Forces, a Docker container
  2. It declares its own infrastructure using Terraform.
  3. Sensitive secrets like Cloudflare authentication tokens are stored in encrypted YAML files using Mozilla SOPS.

Back end

  1. The back-end is written in typed, functional Python.
  2. It uses Starlette as its main framework.
  3. It uses Hypercorn as its web server.
  4. It serves a GraphQL API.
  5. It has three environments:
    • Production: The production environment used by end users.
    • Ephemerals: A testing environment for each developer accessible via the Internet.
    • Local: A testing environment developers can run on their machine. Instructions for this can be found here.
  6. There is a Tasks application that performs out-of-band processing for cloning client repositories.
  7. DNS records, cache, custom headers, redirections and firewall for both production and ephemeral environments are managed by Cloudflare.
    • Production environment
    • Ephemeral environments
    • Tasks application
  8. CloudWatch is used for storing production logs.
  9. CloudWatch alerts are used to check the queue size of Tasks. If the queue size goes beyond a given limit, email alerts are sent to developers.
  10. There is one Application Load Balancer (ALB) for Production and one for each Ephemeral environment.
  11. DynamoDB by Amazon Web Services (AWS) is the main database. It has two tables:
    • Main for storing all current information.
    • Historic for storing historical states of entities.
  12. OpenSearch is a secondary search database that mirrors DynamoDB. When changes occur in DynamoDB, a DynamoDB stream triggers a Lambda function, which transforms the DynamoDB data into an OpenSearch-compatible format and stores it there.
  13. For storage, several S3 buckets are used:
    • client-repositories stores source code repositories from clients.
    • storage stores blobs uploaded by users (evidence, example files, etc.).
    • machine-executions stores results of Skims executions and provided configuration files.
  14. The DynamoDB database is backed up using Backup Vaults by Amazon Web Services (AWS) as promised in 1 and 2.
  15. Out-of-band processing jobs, such as ZTNA repository cloning and machine executions, are performed by AWS Batch.
  16. It uses Twilio to send SMS OTPs.
  17. It uses Mailchimp to send email notifications to end users.
  18. Webhooks are supported so that end users can receive machine-readable notifications at their own endpoints.
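The DynamoDB-to-OpenSearch mirroring described in item 12 hinges on deserializing DynamoDB's typed attribute values into plain JSON documents. A minimal sketch of that transformation, assuming a standard stream record shape; the function names are illustrative, not taken from the actual Lambda:

```python
from typing import Any


def deserialize(attr: dict[str, Any]) -> Any:
    """Convert a DynamoDB typed attribute value into a plain Python value."""
    (dtype, value), = attr.items()
    if dtype == "S":
        return value
    if dtype == "N":
        return float(value) if "." in value else int(value)
    if dtype == "BOOL":
        return value
    if dtype == "NULL":
        return None
    if dtype == "L":
        return [deserialize(item) for item in value]
    if dtype == "M":
        return {key: deserialize(val) for key, val in value.items()}
    raise ValueError(f"Unsupported DynamoDB type: {dtype}")


def to_opensearch_document(record: dict[str, Any]) -> dict[str, Any]:
    """Flatten a stream record's NewImage into an OpenSearch-ready document."""
    image = record["dynamodb"]["NewImage"]
    return {key: deserialize(val) for key, val in image.items()}
```

For example, a stream record whose NewImage is {"pk": {"S": "GROUP#test"}, "open": {"N": "3"}} becomes the plain document {"pk": "GROUP#test", "open": 3}, ready for indexing.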

Front end

  1. The front-end is written in functional TypeScript.
  2. It does not have a framework.
  3. It uses React for building most of its web interfaces.
  4. It is deployed into AWS S3 buckets, using the corresponding bucket for the environment (ephemeral or production).
  5. The back-end serves the front-end when either (Production) or (Ephemeral) are accessed.


Retrieves

  1. The Retrieves Visual Studio Code extension is written in functional TypeScript.
  2. It is deployed to the Visual Studio Code Marketplace.
  3. It authenticates with the back-end API using a user-generated token.


Forces

  1. The Forces Docker container is written in typed, functional Python.
  2. It is deployed to DockerHub.
  3. It authenticates with the back-end API using a user-generated token.
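Both Retrieves and Forces authenticate against the back-end GraphQL API with a user-generated token, which in practice means attaching it as a bearer token on every request. A minimal Python sketch of building such a request; the API URL here is a hypothetical placeholder, since the real endpoint is elided above:

```python
import json
import urllib.request

# Hypothetical placeholder: substitute the real Integrates API URL.
API_URL = "https://app.example.com/api"


def build_graphql_request(token: str, query: str) -> urllib.request.Request:
    """Build an authenticated GraphQL POST request for the Integrates API."""
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
```

Sending the request (e.g. with urllib.request.urlopen) would then return the GraphQL response as JSON; the same bearer-token scheme applies whether the caller is the VS Code extension or the Forces container.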


Please read the Contributing page first.

Development Environment

Follow the steps in the Development Environment section of our documentation.

When prompted for an AWS role, choose dev, and when prompted for a Development Environment, pick integratesBack.

Local Environment

Two approaches for deploying a local environment of Integrates are described below. Either of them will launch a local replica on localhost:8001.

All in one

You can use mprocs for handling all components in a single terminal:

  • Run m . /integrates.
  • Jobs can be restarted using r.
  • Jobs can be stopped using x.

Individual components

Run each of the following commands within the universe repository in different terminals:

m . /integrates/back dev
m . /integrates/db
m . /integrates/front
m . /integrates/storage/dev

Each terminal will serve a key component of Integrates.

Accessing local environment

  1. Go to https://localhost:3000 and accept the self-signed certificates offered by the server.

    This will allow the back-end to fetch the files to render the UI.

  2. Go to https://localhost:8001 and, again, accept the self-signed certificates offered by the server.

    Now you should see the login portal of the application.

Ephemeral Environment

Once you upload your local changes to your remote branch in GitLab, a pipeline will begin and run some verifications on your branch.

Some of those verifications require a complete working environment to test against. This environment can be found at https://<branch_name>, and it will be available once the pipeline stage deploy-app finishes.

In order to login to your ephemeral environment, SSO needs to be set up for it. You can write to with the URL of your environment so it can be configured.

In case you want to deploy your ephemeral environment's back-end manually, you must first create the /not-set file and make it readable and writable (root permissions are required):

touch /not-set
chmod a+rw /not-set

Once this file has the required permissions, you can run deployment from your machine:

m . /integrates/back/deploy/dev

Enable SSO on Ephemeral Environments


This requires you to have access to the Fluid Attacks organization on Google Cloud.

  1. Access the Google Cloud Console.

  2. Choose the project Integrates.

  3. On the left sidebar, choose APIs & Services > Credentials.

  4. On the Credentials dashboard, under OAuth 2.0 Client IDs, choose the client ID not created by Google Services.

  5. Finally, under Authorized redirect URIs, add the URI of the ephemeral environment you want to enable SSO on, https://<branch_name>