# Breaking Changes
This list shows all versions that include breaking changes and how to upgrade.
## 0.35.1
### What has changed?
* The `name` attribute has been renamed to `externalId` in the `AppConnection` entity.
* The `displayName` attribute has been added to the `AppConnection` entity.
### When is action necessary?
* If you are using the connections API, you should update the `name` attribute to `externalId` and add the `displayName` attribute.
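The rename can be pictured as a simple field mapping. This is a hedged sketch: the `migrateConnection` helper and the field values are illustrative, not part of the Activepieces API; only the attribute names come from the release notes.

```typescript
// Hypothetical migration sketch for the 0.35.1 attribute rename.
type OldConnection = { name: string };
type NewConnection = { externalId: string; displayName: string };

function migrateConnection(old: OldConnection, displayName: string): NewConnection {
  // The old `name` becomes `externalId`; `displayName` is a new, human-readable label.
  return { externalId: old.name, displayName };
}

console.log(migrateConnection({ name: 'my-slack' }, 'My Slack'));
// { externalId: 'my-slack', displayName: 'My Slack' }
```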
## 0.35.0
### What has changed?
* All branches are now converted to routers, and downgrade is not supported.
## 0.33.0
### What has changed?
* Files from actions or triggers are now stored in the database / S3 to support retries from certain steps, and the size of files from actions is now subject to the limit of `AP_MAX_FILE_SIZE_MB`.
* Files in triggers were previously passed as base64 encoded strings; now they are passed as file paths in the database / S3. Paused flows that have triggers from version 0.29.0 or earlier will no longer work.
### When is action necessary?
* If you are dealing with large files in actions, consider increasing `AP_MAX_FILE_SIZE_MB`, and make sure the storage system (database/S3) has enough capacity for the files.
## 0.30.0
### What has changed?
* `AP_SANDBOX_RUN_TIME_SECONDS` is now deprecated and replaced with `AP_FLOW_TIMEOUT_SECONDS`
* `AP_CODE_SANDBOX_TYPE` is now deprecated and replaced with a new mode in `AP_EXECUTION_MODE`
### When is action necessary?
* If you have `AP_CODE_SANDBOX_TYPE` set to `V8_ISOLATE`, you should switch to setting `AP_EXECUTION_MODE` to `SANDBOX_CODE_ONLY`
* If you are using `AP_SANDBOX_RUN_TIME_SECONDS` to set the sandbox run time limit, you should switch to `AP_FLOW_TIMEOUT_SECONDS`
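In environment-file terms, the upgrade is a straight rename; the values shown here are illustrative, only the variable names come from the release notes.

```bash
# Before (deprecated)
AP_SANDBOX_RUN_TIME_SECONDS=600
AP_CODE_SANDBOX_TYPE=V8_ISOLATE

# After
AP_FLOW_TIMEOUT_SECONDS=600
AP_EXECUTION_MODE=SANDBOX_CODE_ONLY
```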
## 0.28.0
### What has changed?
* **Project Members:**
* The `EXTERNAL_CUSTOMER` role has been deprecated and replaced with the `OPERATOR` role. Please check the permissions page for more details.
* All pending invitations will be removed.
* The User Invitation entity has been introduced for sending invitations. You can still use the Project Member API to assign roles, but the user must already exist. If you want to send an email invitation, use the User Invitation; a project member record will be created after the user accepts and registers an account.
* **Authentication:**
* The `SIGN_UP_ENABLED` environment variable, which allowed multiple users to sign up for different platforms/projects, has been removed. It has been replaced with inviting users to the same platform/project. All old users should continue to work normally.
### When is action necessary?
* **Project Members:**
If you use the embedding SDK or the create project member API with the `EXTERNAL_CUSTOMER` role, you should start using the `OPERATOR` role instead.
* **Authentication:**
Multiple platforms/projects are no longer supported in the community edition. Technically, everything is still in place, but because the authentication system has changed, you would have to work around it using the API. If you have already created users/platforms, they will continue to work, and no action is required.
# Changelog
A log of all notable changes to Activepieces
# Editions
Activepieces operates on an open-core model, providing a core software platform as open source licensed under the permissive **MIT** license while offering additional features as proprietary add-ons in the cloud.
### Community / Open Source Edition
The Community edition is free and open source. It has all the pieces and features to build and run flows without any limitations.
### Commercial Editions
Learn more at: [https://www.activepieces.com/pricing](https://www.activepieces.com/pricing)
## Feature Comparison
| Feature | Community | Enterprise | Embed |
| ------------------------ | --------- | ---------- | -------- |
| Flow History | ✅ | ✅ | ✅ |
| All Pieces | ✅ | ✅ | ✅ |
| Flow Runs | ✅ | ✅ | ✅ |
| Unlimited Flows | ✅ | ✅ | ✅ |
| Unlimited Connections | ✅ | ✅ | ✅ |
| Unlimited Flow steps | ✅ | ✅ | ✅ |
| Custom Pieces | ✅ | ✅ | ✅ |
| On Premise | ✅ | ✅ | ✅ |
| Cloud | ❌ | ✅ | ✅ |
| Project Team Members | ❌ | ✅ | ✅ |
| Manage Multiple Projects | ❌ | ✅ | ✅ |
| Limits Per Project | ❌ | ✅ | ✅ |
| Pieces Management | ❌ | ✅ | ✅ |
| Templates Management | ❌ | ✅ | ✅ |
| Custom Domain | ❌ | ✅ | ✅ |
| All Languages | ✅ | ✅ | ✅ |
| JWT Single Sign On | ❌ | ❌ | ✅ |
| Embed SDK | ❌ | ❌ | ✅ |
| Audit Logs | ❌ | ✅ | ❌ |
| Git Sync | ❌ | ✅ | ❌ |
| Private Pieces | ❌ | 5 | 2 |
| Custom Email Branding | ❌ | ✅ | ✅ |
| Custom Branding | ❌ | ✅ | ✅ |
# i18n Translations
This guide helps you understand how to change or add new translations.
Activepieces uses Crowdin because it helps translators who don't know how to code, and it makes the approval process easier. Activepieces automatically syncs new text from the code and translations back into the code.
## Contribute to existing translations
1. Create a Crowdin account
2. Join the project [https://crowdin.com/project/activepieces](https://crowdin.com/project/activepieces)
![Join Project](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/crowdin.png)
3. Click on the language you want to translate
4. Click on "Translate All"
![Translate All](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/crowdin-translate-all.png)
5. Select the strings you want to translate and click the "Save" button
## Adding a new language
* Please contact us ([support@activepieces.com](mailto:support@activepieces.com)) if you want to add a new language. We will add it to the project and you can start translating.
# License
Activepieces' **core** is released as open source under the [MIT license](https://github.com/activepieces/activepieces/blob/main/LICENSE) and enterprise / cloud editions features are released under [Commercial License](https://github.com/activepieces/activepieces/blob/main/packages/ee/LICENSE)
The MIT license is a permissive license that grants users the freedom to use, modify, or distribute the software without any significant restrictions. The only requirement is that you include the license notice along with the software when distributing it.
Using the enterprise features (under the packages/ee and packages/server/api/src/app/ee folder) with a self-hosted instance requires an Activepieces license. If you are looking for these features, contact us at [sales@activepieces.com](mailto:sales@activepieces.com).
**Benefits of Dual Licensing Repo**
* **Transparency** - Everyone can see what we are doing and contribute to the project.
* **Clarity** - Everyone can see what the difference is between the open source and commercial versions of our software.
* **Audit** - Everyone can audit our code and see what we are doing.
* **Faster Development** - We can develop faster and more efficiently.
If you are still confused or have feedback, please open an issue on GitHub or send a message in the #contribution channel on Discord.
# Telemetry
## Why Does Activepieces Need Data?
As a self-hosted product, gathering usage metrics and insights can be difficult for us. However, these analytics are essential in helping us understand key behaviors and delivering a higher quality experience that meets your needs.
To ensure we can continue to improve our product, we have decided to track certain basic behaviors and metrics that are vital for understanding the usage of Activepieces.
We have implemented a minimal tracking plan and provide a detailed list of the metrics collected in a separate section.
## What Does Activepieces Collect?
We value transparency in data collection and assure you that we do not collect any personal information. The following events are currently being collected:
[Exact Code](https://github.com/activepieces/activepieces/blob/main/packages/shared/src/lib/common/telemetry.ts)
1. `flow.published`: Event fired when a flow is published
2. `signed.up`: Event fired when a user signs up
3. `flow.test`: Event fired when a flow is tested
4. `flow.created`: Event fired when a flow is created
5. `start.building`: Event fired when a user starts building
6. `demo.imported`: Event fired when a demo is imported
7. `flow.imported`: Event fired when a flow template is imported
## Opting Out
To opt out, set the environment variable `AP_TELEMETRY_ENABLED=false`
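For example, when starting the official Docker image, the variable is passed like any other environment setting (image name assumed to be the standard `activepieces/activepieces` image; other flags omitted for brevity):

```bash
# Disable telemetry when starting the container.
docker run -e AP_TELEMETRY_ENABLED=false activepieces/activepieces
```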
# Appearance
Customize the brand by going to the **Appearance** section under **Settings**. Here, you can customize:
* Logo / FavIcon
* Primary color
* Default Language
# Custom Domains
You can set up a unique domain for your platform, like app.example.com.
This is also used to determine the theme and branding on the authentication pages when a user is not logged in.
![Manage Projects](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/custom-domain.png)
# Customize Emails
You can add your own mail server to Activepieces, or override it if it's in the cloud. From the platform, all email templates are automatically whitelabeled according to the [appearance settings](https://www.activepieces.com/docs/platform/appearance).
![Manage SMTP](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/manage-smtp.png)
# Manage AI Providers
Set your AI providers so your users enjoy a seamless building experience with our universal AI pieces like [Text AI](https://www.activepieces.com/pieces/text-ai).
## Manage AI Providers
You can manage the AI providers that you want to use in your flows. To do this, go to the **AI** page in the **Admin Console**.
You can define the provider's base URL and the API key.
These settings will be used for all the projects for every request to the AI provider.
![Manage AI Providers](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/configure-ai-provider.png)
## Configure AI Credits Limits Per Project
You can configure the token limits per project. To do this, go to the project general settings and change the **AI Credits** field to the desired value.
This limit is per project and is an accumulation of all the reported usage by the AI piece in the project.
Since only the AI piece goes through the Activepieces API, using any other piece, such as the standalone OpenAI, Anthropic, or Perplexity pieces, will not count towards or respect this limit.
![Manage AI Providers](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/ai-credits-limit.png)
### AI Credits Explained
AI credits are the number of tasks that can be run by any of our universal AI pieces.
So if a flow run contains 5 universal AI piece steps, it consumes 5 AI credits.
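The accounting described above can be sketched in a few lines. The `aiCreditsForRun` helper and the step shape are illustrative, not the actual Activepieces internals; only the counting rule comes from the text.

```typescript
// Hypothetical sketch: AI credit accounting for a single flow run.
type Step = { piece: string; isUniversalAi: boolean };

function aiCreditsForRun(steps: Step[]): number {
  // Each universal AI step consumes one AI credit; other pieces are not counted.
  return steps.filter((s) => s.isUniversalAi).length;
}

const run: Step[] = [
  { piece: 'text-ai', isUniversalAi: true },
  { piece: 'openai', isUniversalAi: false }, // standalone piece, not counted
  { piece: 'text-ai', isUniversalAi: true },
];
console.log(aiCreditsForRun(run)); // 2
```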
# Replace OAuth2 Apps
The project automatically uses Activepieces OAuth2 Apps as the default setting. If you prefer to use your own OAuth2 Apps, you can click on the 'Gear Icon' on the piece from the 'Manage Pieces' page and enter your own OAuth2 Apps details.
![Manage Oauth2 apps](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/manage-oauth2.png)
# Manage Pieces
## Customize Pieces for Each Project
In each **project settings** you can customize the pieces for the project.
![Manage Projects](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/manage-pieces.png)
## Install Piece
You can install custom pieces for all your projects by clicking on "Install Piece" and then filling in the piece package information.
You can choose to install it from npm or upload a tar file directly for private pieces.
![Manage Projects](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/install-piece.png)
# Managed Projects
This feature helps you unlock these use cases:
1. Set up projects for different teams inside the company.
2. Set up projects automatically using the embedding feature for your SaaS customers.
You can **create** new projects and set **limits** on the number of tasks for each project.
# Manage Templates
You can create custom templates for your users within the Platform dashboard.
# Overview
The platform is the admin panel for managing your instance. It's suitable for SaaS, Embed, or agencies that want to white-label Activepieces and offer it to their customers. With this platform, you can:
1. **Custom Branding:** Tailor the appearance of the software to align with your brand's identity by selecting your own branding colors and fonts.
2. **Projects Management:** Manage your projects, including creating, editing, and deleting projects.
3. **Piece Management:** Take full control over Activepieces pieces. You can show or hide existing pieces and create your own unique pieces to customize the platform according to your specific needs.
4. **User Authentication Management:** Add and remove users, and assign roles to them.
5. **Template Management:** Control prebuilt templates and add your own unique templates to meet the requirements of your users.
6. **AI Provider Management:** Manage the AI providers that you want to use in your flows.
# Create Action
## Action Definition
Now let's create the first action, which fetches a random ice cream flavor.
```bash
npm run cli actions create
```
You will be asked three questions to define your new action:
1. `Piece Folder Name`: This is the name associated with the folder where the action resides. It helps organize and categorize actions within the piece.
2. `Action Display Name`: The name users see in the interface, conveying the action's purpose clearly.
3. `Action Description`: A brief, informative text in the UI, guiding users about the action's function and purpose.
Next, let's create the action file:
**Example:**
```bash
npm run cli actions create
? Enter the piece folder name : gelato
? Enter the action display name : get icecream flavor
? Enter the action description : fetches random icecream flavor.
```
This will create a new TypeScript file named `get-icecream-flavor.ts` in the `packages/pieces/community/gelato/src/lib/actions` directory.
Inside this file, paste the following code:
```typescript
import {
  createAction,
  Property,
  PieceAuth,
} from '@activepieces/pieces-framework';
import { httpClient, HttpMethod } from '@activepieces/pieces-common';
import { gelatoAuth } from '../..';

export const getIcecreamFlavor = createAction({
  name: 'get_icecream_flavor', // Must be unique across the piece; this shouldn't be changed.
  auth: gelatoAuth,
  displayName: 'Get Icecream Flavor',
  description: 'Fetches random icecream flavor',
  props: {},
  async run(context) {
    const res = await httpClient.sendRequest({
      method: HttpMethod.GET,
      url: 'https://cloud.activepieces.com/api/v1/webhooks/RGjv57ex3RAHOgs0YK6Ja/sync',
      headers: {
        Authorization: context.auth, // Pass API key in headers
      },
    });
    return res.body;
  },
});
```
The `createAction` function takes an object with several properties, including the `name`, `displayName`, `description`, `props`, and `run` function of the action.
The `name` property is a unique identifier for the action. The `displayName` and `description` properties provide a human-readable name and description for the action.
The `props` property is an object that defines the properties the action requires from the user. In this case, the action doesn't require any properties.
The `run` function is called when the action is executed. It takes a single argument, `context`, which contains the values of the action's properties.
Here, the `run` function uses `httpClient.sendRequest` to make a GET request that fetches a random ice cream flavor, passing the API key in the request headers for authentication. Finally, it returns the response body.
## Expose The Definition
To make the action readable by Activepieces, add it to the array of actions in the piece definition.
```typescript
import { createPiece } from '@activepieces/pieces-framework';
// Don't forget to add the following import.
import { getIcecreamFlavor } from './lib/actions/get-icecream-flavor';

export const gelato = createPiece({
  displayName: 'Gelato',
  logoUrl: 'https://cdn.activepieces.com/pieces/gelato.png',
  authors: [],
  auth: gelatoAuth, // gelatoAuth is defined in this file (see the authentication step).
  // Add the action here.
  actions: [getIcecreamFlavor], // <--------
  triggers: [],
});
```
# Testing
By default, the development setup only builds specific pieces. Open the file `packages/server/api/.env` and add "gelato" to `AP_DEV_PIECES`.
For more details, check out the [Piece Development](../development-setup/getting-started) section.
Once you edit the environment variable, restart the backend. The piece will be rebuilt. After this process, you'll need to **refresh** the frontend to see the changes.
If the build fails, try debugging by running `npx nx run-many -t build --projects=gelato`.
It will display any errors in your code.
To test the action, use the flow builder in Activepieces. It should function as shown in the screenshot.
![Gelato Action](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/gelato-action.png)
# Create Trigger
This tutorial will guide you through the process of creating a trigger for the Gelato piece that fetches newly created ice cream flavors.
## Trigger Definition
To create a trigger, run the following command:
```bash
npm run cli triggers create
```
You will be asked four questions to define your new trigger:
1. `Piece Folder Name`: This is the name associated with the folder where the trigger resides. It helps organize and categorize triggers within the piece.
2. `Trigger Display Name`: The name users see in the interface, conveying the trigger's purpose clearly.
3. `Trigger Description`: A brief, informative text in the UI, guiding users about the trigger's function and purpose.
4. `Trigger Technique`: Specifies the trigger type - either [polling](../piece-reference/triggers/polling-trigger) or [webhook](../piece-reference/triggers/webhook-trigger).
**Example:**
```bash
npm run cli triggers create
? Enter the piece folder name : gelato
? Enter the trigger display name : new flavor created
? Enter the trigger description : triggers when a new icecream flavor is created.
? Select the trigger technique: polling
```
This will create a new TypeScript file at `packages/pieces/community/gelato/src/lib/triggers` named `new-flavor-created.ts`.
Inside this file, paste the following code:
```ts
import { gelatoAuth } from '../../';
import {
  DedupeStrategy,
  HttpMethod,
  HttpRequest,
  Polling,
  httpClient,
  pollingHelper,
} from '@activepieces/pieces-common';
import {
  PiecePropValueSchema,
  TriggerStrategy,
  createTrigger,
} from '@activepieces/pieces-framework';
import dayjs from 'dayjs';

const polling: Polling<
  PiecePropValueSchema<typeof gelatoAuth>,
  Record<string, never>
> = {
  strategy: DedupeStrategy.TIMEBASED,
  items: async ({ auth, propsValue, lastFetchEpochMS }) => {
    const request: HttpRequest = {
      method: HttpMethod.GET,
      url: 'https://cloud.activepieces.com/api/v1/webhooks/aHlEaNLc6vcF1nY2XJ2ed/sync',
      headers: {
        authorization: auth,
      },
    };
    const res = await httpClient.sendRequest(request);
    return res.body['flavors'].map((flavor: string) => ({
      epochMilliSeconds: dayjs().valueOf(),
      data: flavor,
    }));
  },
};

export const newFlavorCreated = createTrigger({
  auth: gelatoAuth,
  name: 'newFlavorCreated',
  displayName: 'new flavor created',
  description: 'triggers when a new icecream flavor is created.',
  props: {},
  sampleData: {},
  type: TriggerStrategy.POLLING,
  async test(context) {
    const { store, auth, propsValue } = context;
    return await pollingHelper.test(polling, { store, auth, propsValue });
  },
  async onEnable(context) {
    const { store, auth, propsValue } = context;
    await pollingHelper.onEnable(polling, { store, auth, propsValue });
  },
  async onDisable(context) {
    const { store, auth, propsValue } = context;
    await pollingHelper.onDisable(polling, { store, auth, propsValue });
  },
  async run(context) {
    const { store, auth, propsValue } = context;
    return await pollingHelper.poll(polling, { store, auth, propsValue });
  },
});
```
The way polling triggers usually work is as follows:
`run`: The run method executes every 5 minutes, fetching data from the endpoint within a specified timestamp range (or continuing until it identifies the last item ID), and then returns the new items as an array. In this example, the `httpClient.sendRequest` method is used to retrieve new flavors, which are then stored in the store along with a timestamp.
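The timestamp-based deduplication that `pollingHelper` applies under `DedupeStrategy.TIMEBASED` can be sketched as follows. This is a self-contained illustration; `newItemsSince` is not the real helper's API, but the filtering idea is the same.

```typescript
// Illustrative sketch of TIMEBASED dedupe: keep only items newer than the last fetch.
type PolledItem<T> = { epochMilliSeconds: number; data: T };

function newItemsSince<T>(items: PolledItem<T>[], lastFetchEpochMS: number): T[] {
  return items
    .filter((i) => i.epochMilliSeconds > lastFetchEpochMS)
    .map((i) => i.data);
}

const items: PolledItem<string>[] = [
  { epochMilliSeconds: 1000, data: 'vanilla' },
  { epochMilliSeconds: 2000, data: 'pistachio' },
];
console.log(newItemsSince(items, 1500)); // [ 'pistachio' ]
```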
## Expose The Definition
To make the trigger readable by Activepieces, add it to the array of triggers in the piece definition.
```typescript
import { createPiece } from '@activepieces/pieces-framework';
import { getIcecreamFlavor } from './lib/actions/get-icecream-flavor';
// Don't forget to add the following import.
import { newFlavorCreated } from './lib/triggers/new-flavor-created';

export const gelato = createPiece({
  displayName: 'Gelato Tutorial',
  logoUrl: 'https://cdn.activepieces.com/pieces/gelato.png',
  authors: [],
  auth: gelatoAuth,
  actions: [getIcecreamFlavor],
  // Add the trigger here.
  triggers: [newFlavorCreated], // <--------
});
```
# Testing
By default, the development setup only builds specific pieces. Open the file `packages/server/api/.env` and add "gelato" to `AP_DEV_PIECES`.
For more details, check out the [Piece Development](../development-setup/getting-started) section.
Once you edit the environment variable, restart the backend. The piece will be rebuilt. After this process, you'll need to **refresh** the frontend to see the changes.
To test the trigger, use **Load Sample Data** in the flow builder in Activepieces. It should function as shown in the screenshot.
![Gelato Action](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/gelato-trigger.png)
# Overview
This section helps developers build and contribute pieces.
Building pieces is fun and important; it allows you to customize Activepieces for your own needs.
We love contributions! In fact, most of the pieces are contributed by the community. Feel free to open a pull request.
**Friendly Tip:**
For the fastest support, we recommend joining our Discord community. We are dedicated to addressing every question and concern raised there.
* Build pieces using TypeScript for a more powerful and flexible development process.
* See your changes in the browser within 7 seconds.
* Work within the open-source environment, explore, and contribute to other pieces.
* Join our large community, where you can ask questions, share ideas, and develop alongside others.
* Use the Unified SDK to quickly build AI-powered pieces that support multiple AI providers.
# Add Piece Authentication
### Piece Authentication
Activepieces supports multiple forms of authentication; you can check those forms [here](../piece-reference/authentication).
Now, let's set up authentication for this piece, which requires including an API key in the request headers.
Modify the `src/index.ts` file to add authentication:
```ts
import { PieceAuth, createPiece } from '@activepieces/pieces-framework';

export const gelatoAuth = PieceAuth.SecretText({
  displayName: 'API Key',
  required: true,
  description: 'Please use **test-key** as value for API Key',
});

export const gelato = createPiece({
  displayName: 'Gelato',
  logoUrl: 'https://cdn.activepieces.com/pieces/gelato.png',
  auth: gelatoAuth,
  authors: [],
  actions: [],
  triggers: [],
});
```
Use the value **test-key** as the API key when testing actions or triggers for Gelato.
# Create Piece Definition
This tutorial will guide you through the process of creating a Gelato piece with an action that fetches a random ice cream flavor and a trigger that fires when a new ice cream flavor is created. It assumes that you are familiar with the following:
* [Activepieces Local development](../development-setup/local) Or [GitHub Codespaces](../development-setup/codespaces).
* TypeScript syntax.
## Piece Definition
To get started, let's generate a new piece for Gelato
```bash
npm run cli pieces create
```
You will be asked three questions to define your new piece:
1. `Piece Name`: Specify a name for your piece. This name uniquely identifies your piece within the Activepieces ecosystem.
2. `Package Name`: Optionally, you can enter a name for the npm package associated with your piece. If left blank, the default name will be used.
3. `Piece Type`: Choose the piece type based on your intention. It can be either "custom" if it's a tailored solution for your needs, or "community" if it's designed to be shared and used by the broader community.
**Example:**
```bash
npm run cli pieces create
? Enter the piece name: gelato
? Enter the package name: @activepieces/piece-gelato
? Select the piece type: community
```
The piece will be generated at `packages/pieces/community/gelato/`, and the `src/index.ts` file should contain the following code:
```ts
import { PieceAuth, createPiece } from '@activepieces/pieces-framework';

export const gelato = createPiece({
  displayName: 'Gelato',
  logoUrl: 'https://cdn.activepieces.com/pieces/gelato.png',
  auth: PieceAuth.None(),
  authors: [],
  actions: [],
  triggers: [],
});
```
# Fork Repository
To start building pieces, we need to fork the repository that contains the framework library and the development environment. Later, we will publish these pieces as `npm` artifacts.
Follow these steps to fork the repository:
1. Go to the repository page at [https://github.com/activepieces/activepieces](https://github.com/activepieces/activepieces).
2. Click the `Fork` button located in the top right corner of the page.
![Fork Repository](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/fork-repository.jpg)
If you are an enterprise customer and want to use the private pieces feature, you can refer to the tutorial on how to set up a [private fork](../misc/private-fork).
# Start Building
This section guides you in creating a Gelato piece, from setting up your development environment to contributing the piece. By the end of this tutorial, you will have a piece with an action that fetches a random ice cream flavor and a trigger that fetches newly created ice cream flavors.
The tutorial is broken into small steps, each covered in its own section, and should take around 30 minutes in total.
## Steps Overview
Fork the repository to create your own copy of the codebase.
Set up your development environment with the necessary tools and dependencies.
Define the structure and behavior of your Gelato piece.
Implement authentication mechanisms for your Gelato piece.
Create an action that fetches a random ice cream flavor.
Create a trigger that fetches newly created ice cream flavors.
Share your Gelato piece with others.
Contribute a piece to our repo and receive +1,400 tasks/month on [Activepieces Cloud](https://cloud.activepieces.com).
# GitHub Codespaces
GitHub Codespaces is a cloud development platform that enables developers to write, run, and debug code directly in their browsers, seamlessly integrated with GitHub.
### Steps to setup Codespaces
1. Go to [Activepieces repo](https://github.com/activepieces/activepieces).
2. Click `Code`, then under the Codespaces tab, click `Create codespace on main`.
![Create Codespace](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/development-setup_codespaces.png)
By default, the development setup only builds specific pieces. Open the file
`packages/server/api/.env` and add a comma-separated list of pieces to make
available.
For more details, check out the [Piece Development](/developers/development-setup/getting-started) section.
3. Open the terminal and run `npm start`
4. Access the frontend URL by opening port 4200 and signing in with these details:
Email: `dev@ap.com`
Password: `12345678`
# Dev Containers
## Using Dev Containers in Visual Studio Code
The project includes a dev container configuration that allows you to use Visual Studio Code's [Remote Development](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack) extension to develop the project in a consistent environment. This can be especially helpful if you are new to the project or if you have a different environment setup on your local machine.
## Prerequisites
Before you can use the dev container, you will need to install the following:
* [Visual Studio Code](https://code.visualstudio.com/).
* The [Remote Development](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack) extension for Visual Studio Code.
* [Docker](https://www.docker.com/).
## Using the Dev Container
To use the dev container for the Activepieces project, follow these steps:
1. Clone the Activepieces repository to your local machine.
2. Open the project in Visual Studio Code.
3. Press `Ctrl+Shift+P` and type `> Dev Containers: Reopen in Container`.
4. Run `npm start`.
5. The backend will run at `localhost:3000` and the frontend will run at `localhost:4200`.
By default, the development setup only builds specific pieces. Open the file
`packages/server/api/.env` and add a comma-separated list of pieces to make
available.
For more details, check out the [Piece Development](/developers/development-setup/getting-started) section.
The login credentials are:\
Email: `dev@ap.com`
Password: `12345678`
## Exiting the Dev Container
To exit the dev container and return to your local environment, follow these steps:
1. In the bottom left corner of Visual Studio Code, click the `Remote-Containers: Reopen folder locally` button.
2. Visual Studio Code will close the connection to the dev container and reopen the project in your local environment.
## Troubleshoot
One of the best troubleshooting steps after an error occurs is to reset the dev container:
1. Exit the dev container.
2. Run the following:
```sh
sh tools/reset-dev.sh
```
3. Rebuild the dev container by following the steps above.
# Getting Started
## Development Setup
To set up the development environment, you can choose one of the following methods:
* **Codespaces**: This is the quickest way to set up the development environment. Follow the [Codespaces](./codespaces) guide.
* **Local Environment**: It is recommended for local development. Follow the [Local Environment](./local) guide.
* **Dev Container**: This method is suitable for remote development on another machine. Follow the [Dev Container](./dev-container) guide.
## Pieces Development
To avoid slowing down the dev environment, not all pieces are functional during development. By default, only the pieces specified in `AP_DEV_PIECES` are functional:
[https://github.com/activepieces/activepieces/blob/main/packages/server/api/.env#L4](https://github.com/activepieces/activepieces/blob/main/packages/server/api/.env#L4)
To override the default list available at first, define an `AP_DEV_PIECES` environment variable with a comma-separated list of pieces to make available. For example, to make `google-sheets` and `cal-com` available, you can use:
```sh
AP_DEV_PIECES=google-sheets,cal-com npm start
```
# Local Dev Environment
## Prerequisites
* Node.js v18+
* npm v9+
## Instructions
1. Setup the environment
```bash
node tools/setup-dev.js
```
2. Start the environment
This command starts Activepieces with SQLite and an in-memory queue.
```bash
npm start
```
By default, the development setup only builds specific pieces. Open the file
`packages/server/api/.env` and add a comma-separated list of pieces to make
available.
For more details, check out the [Piece Development](/developers/development-setup/getting-started) section.
3. Go to ***localhost:4200*** on your web browser and sign in with these details:
Email: `dev@ap.com`
Password: `12345678`
# Create New AI Provider
Activepieces currently supports the following AI providers:
* OpenAI
* Anthropic
To create a new AI provider, you need to follow these steps:
## Implement the AI Interface
Create a new factory that returns an instance of the `AI` interface in the `packages/pieces/community/common/src/lib/ai/providers/your-ai-provider.ts` file.
```typescript
export const yourAiProvider = ({
  serverUrl,
  engineToken,
}: { serverUrl: string; engineToken: string }): AI => {
  const impl = new YourAiProviderSDK(serverUrl, engineToken);
  return {
    provider: "YOUR_AI_PROVIDER" as const,
    chat: {
      text: async (params) => {
        try {
          const response = await impl.chat.text(params);
          return response;
        } catch (e: any) {
          if (e?.error?.error) {
            throw e.error.error;
          }
          throw e;
        }
      },
    },
  };
};
```
## Register the AI Provider
Add the new AI provider to the `AiProviders` enum in `packages/pieces/community/common/src/lib/ai/providers/index.ts` file.
```diff
export const AiProviders = [
+ {
+ logoUrl: 'https://cdn.activepieces.com/pieces/openai.png',
+ defaultBaseUrl: 'https://api.your-ai-provider.com',
+ label: 'Your AI Provider' as const,
+ value: 'your-ai-provider' as const,
+ models: [
+ { label: 'model-1', value: 'model-1' },
+ { label: 'model-2', value: 'model-2' },
+ { label: 'model-3', value: 'model-3' },
+ ],
+ factory: yourAiProvider,
+ },
...
]
```
## Define Authentication Header
Now we need to tell ActivePieces how to authenticate to your AI provider. You can do this by adding an `auth` property to the `AiProvider` object.
The `auth` property is an object that defines the authentication mechanism for your AI provider. It consists of two properties: `name` and `mapper`. The `name` property specifies the name of the header that will be used to authenticate with your AI provider, and the `mapper` property defines a function that maps the value of the header to the format that your AI provider expects.
Here's an example of how to define the authentication header for a bearer token:
```diff
export const AiProviders = [
{
logoUrl: 'https://cdn.activepieces.com/pieces/openai.png',
defaultBaseUrl: 'https://api.your-ai-provider.com',
label: 'Your AI Provider' as const,
value: 'your-ai-provider' as const,
models: [
{ label: 'model-1', value: 'model-1' },
{ label: 'model-2', value: 'model-2' },
{ label: 'model-3', value: 'model-3' },
],
+ auth: authHeader({ bearer: true }), // or authHeader({ name: 'x-api-key', bearer: false })
factory: yourAiProvider,
},
...
]
```
## Test the AI Provider
To test the AI provider, you can use a **universal AI** piece in a flow. Follow these steps:
* Add the required headers from the admin console for the newly created AI provider. These headers will be used to authenticate the requests to the AI provider.
![Configure AI Provider](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/configure-ai-provider.png)
* Create a flow that uses our **universal AI** pieces, and select **"Your AI Provider"** as the AI provider in the **Ask AI** action settings.
![Configure AI Provider](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/use-ai-provider.png)
# Custom Pieces CI/CD
You can use the CLI to sync custom pieces. There is no need to rebuild the Docker image as they are loaded directly from npm.
### How It Works
Use the CLI to sync items from `packages/pieces/custom/` to instances. In production, Activepieces acts as an npm registry, storing all piece versions.
The CLI scans the directory for `package.json` files, checking the **name** and **version** of each piece. If a piece isn't uploaded, it packages and uploads it via the API.
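The decision the CLI makes can be sketched as follows. This is only an illustration of the described behavior, not the actual CLI source; `PieceManifest` and `piecesToUpload` are hypothetical names:

```typescript
// Hypothetical sketch of the sync decision: compare each piece's
// package.json name/version against the versions already stored on the
// instance, and upload only the missing ones.
interface PieceManifest {
  name: string;
  version: string;
}

// `uploaded` holds "name@version" keys already present on the instance.
function piecesToUpload(
  manifests: PieceManifest[],
  uploaded: Set<string>,
): PieceManifest[] {
  return manifests.filter((m) => !uploaded.has(`${m.name}@${m.version}`));
}
```

With this logic, re-running the sync is idempotent: pieces whose exact name and version already exist on the instance are skipped.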
### Usage
To use the CLI, follow these steps:
1. Generate an API Key from the Admin Interface. Go to Settings and generate the API Key.
2. Install the CLI by cloning the repository.
3. Run the following command, replacing `API_KEY` with your generated API Key and `INSTANCE_URL` with your instance URL:
```bash
AP_API_KEY=your_api_key_here npm run sync-pieces -- --apiUrl https://INSTANCE_URL/api
```
### Developer Workflow
1. Developers create and modify the pieces offline.
2. Increment the piece version in their corresponding `package.json`. For more information, refer to the [piece versioning](../../developers/piece-reference/piece-versioning) documentation.
3. Open a pull request towards the main branch.
4. Once the pull request is merged to the main branch, manually run the CLI or use a GitHub/GitLab Action to trigger the synchronization process.
# Setup Private Fork
**Friendly Tip #1:** If you want to experiment, you can fork or clone the public repository.
For private piece installation, you will need the paid edition. However, you can still develop pieces, contribute them back, **OR** publish them to the public npm registry and use them in your own instance or project.
## Create a Private Fork (Private Pieces)
By following these steps, you can create a private fork on GitHub, GitLab or another platform and configure the "activepieces" repository as the upstream source, allowing you to incorporate changes from the "activepieces" repository.
1. **Clone the Repository:**
Begin by creating a bare clone of the repository. Remember that this is a temporary step and will be deleted later.
```bash
git clone --bare git@github.com:activepieces/activepieces.git
```
2. **Create a Private Git Repository**
Generate a new private repository on GitHub or your chosen platform. When initializing the new repository, do not include a README, license, or gitignore files. This precaution is essential to avoid merge conflicts when synchronizing your fork with the original repository.
3. **Mirror-Push to the Private Repository:**
Mirror-push the bare clone you created earlier to your newly created "activepieces" repository. Make sure to replace `<your_username>` in the URL below with your actual GitHub username.
```bash
cd activepieces.git
git push --mirror git@github.com:<your_username>/activepieces.git
```
4. **Remove the Temporary Local Repository:**
```bash
cd ..
rm -rf activepieces.git
```
5. **Clone Your Private Repository:**
Now, you can clone your "activepieces" repository onto your local machine into your desired directory.
```bash
cd ~/path/to/directory
git clone git@github.com:<your_username>/activepieces.git
```
6. **Add the Original Repository as a Remote:**
If desired, you can add the original repository as a remote to fetch potential future changes. However, remember to disable push operations for this remote, as you are not permitted to push changes to it.
```bash
git remote add upstream git@github.com:activepieces/activepieces.git
git remote set-url --push upstream DISABLE
```
You can view a list of all your remotes using `git remote -v`. It should resemble the following:
```
origin git@github.com:<your_username>/activepieces.git (fetch)
origin git@github.com:<your_username>/activepieces.git (push)
upstream git@github.com:activepieces/activepieces.git (fetch)
upstream DISABLE (push)
```
> When pushing changes, always use `git push origin`.
### Sync Your Fork
To retrieve changes from the "upstream" repository, fetch the remote and merge its changes into your branch.
```bash
git fetch upstream
git merge upstream/main
```
Conflict resolution should not be necessary since you've only added pieces to your repository.
# Piece Auth
Learn about piece authentication
Piece authentication is used to gather user credentials and securely store them for future use in different flows. The authentication must be defined as the `auth` parameter in the `createPiece`, `createTrigger`, and `createAction` functions.
This requirement ensures that the type of authentication can be inferred correctly in triggers and actions.
Friendly Tip: At most one authentication is allowed per piece.
### Secret Text
This authentication collects sensitive information, such as passwords or API keys. It is displayed as a masked input field.
**Example:**
```typescript
PieceAuth.SecretText({
displayName: 'API Key',
description: 'Enter your API key',
required: true,
// Optional Validation
validate: async ({auth}) => {
if(auth.startsWith('sk_')){
return {
valid: true,
}
}
return {
valid: false,
error: 'Invalid Api Key'
}
}
})
```
### Username and Password
This authentication collects a username and password as separate fields.
**Example:**
```typescript
PieceAuth.BasicAuth({
  displayName: 'Credentials',
  description: 'Enter your username and password',
  required: true,
  username: {
    displayName: 'Username',
    description: 'Enter your username',
  },
  password: {
    displayName: 'Password',
    description: 'Enter your password',
  },
  // Optional Validation
  validate: async ({ auth }) => {
    if (auth) {
      return {
        valid: true,
      };
    }
    return {
      valid: false,
      error: 'Invalid credentials',
    };
  },
})
```
### Custom
This authentication allows for custom authentication by collecting specific properties, such as a base URL and access token.
**Example:**
```typescript
PieceAuth.CustomAuth({
  displayName: 'Custom Authentication',
  description: 'Enter custom authentication details',
  props: {
    base_url: Property.ShortText({
      displayName: 'Base URL',
      description: 'Enter the base URL',
      required: true,
    }),
    access_token: PieceAuth.SecretText({
      displayName: 'Access Token',
      description: 'Enter the access token',
      required: true,
    }),
  },
  // Optional Validation
  validate: async ({ auth }) => {
    if (auth) {
      return {
        valid: true,
      };
    }
    return {
      valid: false,
      error: 'Invalid authentication details',
    };
  },
  required: true,
})
```
### OAuth2
This authentication collects OAuth2 authentication details, including the authentication URL, token URL, and scope.
**Example:**
```typescript
PieceAuth.OAuth2({
displayName: 'OAuth2 Authentication',
grantType: OAuth2GrantType.AUTHORIZATION_CODE,
required: true,
authUrl: 'https://example.com/auth',
tokenUrl: 'https://example.com/token',
scope: ['read', 'write']
})
```
Please note `OAuth2GrantType.CLIENT_CREDENTIALS` is also supported for service-based authentication.
# Enable Custom API Calls
Learn how to enable custom API calls for your pieces
Custom API Calls allow the user to send a request to a specific endpoint if no action has been implemented for it.
This appears in the piece's actions list as `Custom API Call`. To enable this action for a piece, call `createCustomApiCallAction` in your actions array.
## Basic Example
The example below implements the action for the OpenAI piece. The OpenAI piece uses a `Bearer token` authorization header to identify the user sending the request.
```typescript
actions: [
  ...yourActions,
  createCustomApiCallAction({
    // The auth object defined in the piece
    auth: openaiAuth,
    // The base URL for the API
    baseUrl: () => 'https://api.openai.com/v1',
    // Map the auth object to the required authorization headers
    authMapping: async (auth) => {
      return {
        'Authorization': `Bearer ${auth}`,
      };
    },
  }),
]
```
## Dynamic Base URL and Basic Auth Example
The example below implements the action for the Jira Cloud piece. The Jira Cloud piece uses a dynamic base URL for its actions, where the base URL changes based on the values the user authenticated with. We will also implement a Basic authentication header.
```typescript
actions: [
  ...yourActions,
  createCustomApiCallAction({
    baseUrl: (auth) => {
      return `${(auth as JiraAuth).instanceUrl}/rest/api/3`;
    },
    auth: jiraCloudAuth,
    authMapping: async (auth) => {
      const typedAuth = auth as JiraAuth;
      return {
        // Basic credentials must be base64 encoded
        'Authorization': `Basic ${Buffer.from(`${typedAuth.email}:${typedAuth.apiToken}`).toString('base64')}`,
      };
    },
  }),
]
```
# Piece Examples
Explore a collection of example triggers and actions
To get the full benefit, it is recommended to read the tutorial first.
## Triggers:
**Webhooks:**
* [New Form Submission on Typeform](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/typeform/src/lib/trigger/new-submission.ts)
**Polling:**
* [New Completed Task On Todoist](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/todoist/src/lib/triggers/task-completed-trigger.ts)
## Actions:
* [Send a message On Discord](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/discord/src/lib/actions/send-message-webhook.ts)
* [Send an Email on Gmail](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/gmail/src/lib/actions/send-email-action.ts)
## Authentication
**OAuth2:**
* [Slack](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/slack/src/index.ts)
* [Gmail](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/gmail/src/index.ts)
**API Key:**
* [Sendgrid](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/sendgrid/src/index.ts)
**Basic Authentication:**
* [Twilio](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/twilio/src/index.ts)
# External Libraries
Learn how to install and use external libraries.
The Activepieces repository is structured as a monorepo, employing Nx as its build tool.
To use an external library in your project, you can simply add it to the main `package.json` file and then use it in any part of your project.
Nx will automatically detect where you're using the library and include it in the build.
Here's how to install and use an external library:
* Install the library using:
```bash
npm install --save <library-name>
```
* Import the library into your piece.
Guidelines:
* Make sure you are using well-maintained libraries.
* Ensure that the library size is not too large to avoid bloating the bundle size; this will make the piece load faster in the sandbox.
# Files
Learn how to use files object to create file references.
The `ctx.files` object allows you to store files in local storage or in remote storage, depending on the run environment.
## Write
You can use the `write` method to write a file to storage. It returns a string that can be used in other action or trigger properties to reference the file.
**Example:**
```ts
const fileReference = await files.write({
fileName: 'file.txt',
data: Buffer.from('text')
});
```
If the run environment is testing mode, the file is stored in the database, since it will be needed when testing other steps; otherwise, it is stored in the local temporary directory.
To read a file, use the file property in a trigger or action; it will be parsed automatically so you can use it directly. Refer to `Property.File` in the [properties](./properties#file) section.
# Flow Control
Learn How to Control Flow from Inside the Piece
Flow Controls provide the ability to control the flow of execution from inside a piece. By using the `ctx` parameter in the `run` method of actions, you can perform various operations to control the flow.
## Stop Flow
You can stop the flow and provide a response to the webhook trigger. This can be useful when you want to terminate the execution of the piece and send a specific response back.
**Example with Response:**
```typescript
context.run.stop({
  response: {
    status: context.propsValue.status ?? StatusCodes.OK,
    body: context.propsValue.body,
    headers: (context.propsValue.headers as Record<string, string>) ?? {},
  },
});
```
**Example without Response:**
```typescript
context.run.stop();
```
## Pause Flow and Wait for Webhook
You can pause the flow and return an HTTP response, along with a callback URL that can later be called with a payload to resume the flow.
**Example:**
```typescript
ctx.run.pause({
pauseMetadata: {
type: PauseType.WEBHOOK,
response: {
callbackUrl: context.generateResumeUrl({
queryParams: {},
}),
},
},
});
```
## Pause Flow and Delay
You can also pause the flow until a specific future timestamp is reached, using the delay pause type.
**Example:**
```typescript
ctx.run.pause({
pauseMetadata: {
type: PauseType.DELAY,
resumeDateTime: futureTime.toUTCString()
}
});
```
These flow hooks give you control over the execution of the piece by allowing you to stop the flow or pause it until a certain condition is met. You can use these hooks to customize the behavior and flow of your actions.
# Persistent Storage
Learn how to store and retrieve data from a key-value store
The `ctx` parameter inside triggers and actions provides a simple key/value storage mechanism. The storage is persistent, meaning that the stored values are retained even after the execution of the piece.
By default, the storage operates at the flow level, but it can also be configured to store values at the project level.
The storage scopes are completely isolated: a key stored in one scope will not be found when requested in a different scope.
## Put
You can store a value with a specified key in the storage.
**Example:**
```typescript
await ctx.store.put('KEY', 'VALUE', StoreScope.PROJECT);
```
## Get
You can retrieve the value associated with a specific key from the storage.
**Example:**
```typescript
const value = await ctx.store.get('KEY', StoreScope.PROJECT);
```
## Delete
You can delete a key-value pair from the storage.
**Example:**
```typescript
await ctx.store.delete('KEY', StoreScope.PROJECT);
```
These storage operations allow you to store, retrieve, and delete key-value pairs in the persistent storage. You can use this storage mechanism to store and retrieve data as needed within your triggers and actions.
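As an illustration, an action could combine `get` and `put` to keep a simple run counter. This is a minimal sketch: the `Store` type below is a stand-in for `ctx.store` so the snippet is self-contained, and `incrementRunCount` is a hypothetical helper.

```typescript
// Stand-in for the ctx.store API described above.
type Store = {
  get: (key: string) => Promise<number | undefined>;
  put: (key: string, value: number) => Promise<void>;
};

// Read the current count (undefined on the first run), increment, store it back.
async function incrementRunCount(store: Store): Promise<number> {
  const current = (await store.get('run_count')) ?? 0;
  const next = current + 1;
  await store.put('run_count', next);
  return next;
}
```

Inside a real action you would call `ctx.store.get` and `ctx.store.put` directly, passing the desired `StoreScope`.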
# Piece Versioning
Learn how to version your pieces
Pieces are npm packages and follow **semantic versioning**.
## Semantic Versioning
The version number consists of three numbers: `MAJOR.MINOR.PATCH`, where:
* **MAJOR**: incremented for breaking changes to the piece.
* **MINOR**: incremented for new features or functionality that are backward compatible with the previous version; while the major version is below 1.0, a minor bump may also carry breaking changes.
* **PATCH**: incremented for bug fixes and small changes that neither introduce new features nor break backward compatibility.
## Engine
The engine will use the most up-to-date compatible version for a given piece version during the **DRAFT** flow versions. Once the flow is published, all pieces will be locked to a specific version.
**Case 1: Piece Version is Less Than 1.0**:
The engine will select the latest **patch** version that shares the same **minor** version number.
**Case 2: Piece Version Reaches Version 1.0**:
The engine will select the latest **minor** version that shares the same **major** version number.
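The two cases can be sketched as follows. This is an illustration of the rule described above, not the engine's actual code; `selectVersion` is a hypothetical helper:

```typescript
// Pick the version the engine would use for a DRAFT flow, per the rule above:
// below 1.0.0, latest patch with the same minor; at or above 1.0.0, latest
// minor/patch with the same major.
function selectVersion(pinned: string, available: string[]): string {
  const [maj, min] = pinned.split('.').map(Number);
  const compatible = available.filter((v) => {
    const [M, m] = v.split('.').map(Number);
    // Below 1.0: same minor required; 1.0 and above: same major required.
    return maj === 0 ? M === 0 && m === min : M === maj;
  });
  const byNumber = (a: string, b: string) => {
    const pa = a.split('.').map(Number);
    const pb = b.split('.').map(Number);
    return pa[0] - pb[0] || pa[1] - pb[1] || pa[2] - pb[2];
  };
  const sorted = compatible.sort(byNumber);
  // Fall back to the pinned version if nothing compatible is available.
  return sorted.length > 0 ? sorted[sorted.length - 1] : pinned;
}
```

For example, a piece pinned at `0.3.1` would resolve to the latest `0.3.x`, while one pinned at `1.2.0` would resolve to the latest `1.x.x`.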
## Examples
When you make a change, remember to increment the version accordingly.
### Breaking changes
* Remove an existing action.
* Add a required prop to an action.
* Remove an existing action prop, whether required or optional.
* Remove an attribute from an action output.
* Change the existing behavior of an action/trigger.
### Non-breaking changes
* Add a new action.
* Add an optional prop to an action.
* Add an attribute to an action output.
In short: any removal is breaking, any required addition is breaking, and everything else is non-breaking.
# Props
Learn about different types of properties used in triggers / actions
Properties are used in actions and triggers to collect information from the user. They are also displayed to the user for input. Here are some commonly used properties:
## Basic Properties
These properties collect basic information from the user.
### Short Text
This property collects a short text input from the user.
**Example:**
```typescript
Property.ShortText({
displayName: 'Name',
description: 'Enter your name',
required: true,
defaultValue: 'John Doe',
});
```
### Long Text
This property collects a long text input from the user.
**Example:**
```typescript
Property.LongText({
displayName: 'Description',
description: 'Enter a description',
required: false,
});
```
### Checkbox
This property presents a checkbox for the user to select or deselect.
**Example:**
```typescript
Property.Checkbox({
displayName: 'Agree to Terms',
description: 'Check this box to agree to the terms',
required: true,
defaultValue: false,
});
```
### Markdown
This property displays a markdown snippet to the user, useful for documentation or instructions. It includes a `variant` option to style the markdown, using the `MarkdownVariant` enum:
* **BORDERLESS**: For a minimalistic, no-border layout.
* **INFO**: Displays informational messages.
* **WARNING**: Alerts the user to cautionary information.
* **TIP**: Highlights helpful tips or suggestions.
The default value for `variant` is **INFO**.
**Example:**
```typescript
Property.MarkDown({
value: '## This is a markdown snippet',
variant: MarkdownVariant.WARNING,
}),
```
If you want to show a webhook URL to the user, use `{{ webhookUrl }}` in the markdown snippet.
### DateTime
This property collects a date and time from the user.
**Example:**
```typescript
Property.DateTime({
displayName: 'Date and Time',
description: 'Select a date and time',
required: true,
defaultValue: '2023-06-09T12:00:00Z',
});
```
### Number
This property collects a numeric input from the user.
**Example:**
```typescript
Property.Number({
displayName: 'Quantity',
description: 'Enter a number',
required: true,
});
```
### Static Dropdown
This property presents a dropdown menu with predefined options.
**Example:**
```typescript
Property.StaticDropdown({
displayName: 'Country',
description: 'Select your country',
required: true,
options: {
options: [
{
label: 'Option One',
value: '1',
},
{
label: 'Option Two',
value: '2',
},
],
},
});
```
### Static Multiple Dropdown
This property presents a dropdown menu with multiple selection options.
**Example:**
```typescript
Property.StaticMultiSelectDropdown({
displayName: 'Colors',
description: 'Select one or more colors',
required: true,
options: {
options: [
{
label: 'Red',
value: 'red',
},
{
label: 'Green',
value: 'green',
},
{
label: 'Blue',
value: 'blue',
},
],
},
});
```
### JSON
This property collects JSON data from the user.
**Example:**
```typescript
Property.Json({
displayName: 'Data',
description: 'Enter JSON data',
required: true,
defaultValue: { key: 'value' },
});
```
### Dictionary
This property collects key-value pairs from the user.
**Example:**
```typescript
Property.Object({
displayName: 'Options',
description: 'Enter key-value pairs',
required: true,
defaultValue: {
key1: 'value1',
key2: 'value2',
},
});
```
### File
This property collects a file from the user, either by providing a URL or uploading a file.
**Example:**
```typescript
Property.File({
displayName: 'File',
description: 'Upload a file',
required: true,
});
```
### Array of Strings
This property collects an array of strings from the user.
**Example:**
```typescript
Property.Array({
displayName: 'Tags',
description: 'Enter tags',
required: false,
defaultValue: ['tag1', 'tag2'],
});
```
### Array of Fields
This property collects an array of objects from the user.
**Example:**
```typescript
Property.Array({
displayName: 'Fields',
description: 'Enter fields',
properties: {
fieldName: Property.ShortText({
displayName: 'Field Name',
required: true,
}),
fieldType: Property.StaticDropdown({
displayName: 'Field Type',
required: true,
options: {
options: [
{ label: 'TEXT', value: 'TEXT' },
{ label: 'NUMBER', value: 'NUMBER' },
],
},
}),
},
required: false,
defaultValue: [],
});
```
## Dynamic Data Properties
These properties provide more advanced options for collecting user input.
### Dropdown
This property allows for dynamically loaded options based on the user's input.
**Example:**
```typescript
Property.Dropdown({
displayName: 'Options',
description: 'Select an option',
required: true,
refreshers: ['auth'],
refreshOnSearch: false,
options: async ({ auth }, { searchValue }) => {
// Search value only works when refreshOnSearch is true
if (!auth) {
return {
disabled: true,
};
}
return {
options: [
{
label: 'Option One',
value: '1',
},
{
label: 'Option Two',
value: '2',
},
],
};
},
});
```
When accessing the Piece auth, be sure to use exactly `auth` as it is
hardcoded. However, for other properties, use their respective names.
### Multi-Select Dropdown
This property allows for multiple selections from dynamically loaded options.
**Example:**
```typescript
Property.MultiSelectDropdown({
displayName: 'Options',
description: 'Select one or more options',
required: true,
refreshers: ['auth'],
options: async ({ auth }) => {
if (!auth) {
return {
disabled: true,
};
}
return {
options: [
{
label: 'Option One',
value: '1',
},
{
label: 'Option Two',
value: '2',
},
],
};
},
});
```
When accessing the Piece auth, be sure to use exactly `auth` as it is
hardcoded. However, for other properties, use their respective names.
### Dynamic Properties
This property is used to construct forms dynamically based on API responses or user input.
**Example:**
```typescript
Property.DynamicProperties({
  description: 'Dynamic Form',
  displayName: 'Dynamic Form',
  required: true,
  refreshers: ['authentication'],
  props: async (propsValue) => {
    const authentication = propsValue['authentication'];
    const apiEndpoint = 'https://someapi.com';
    const response = await fetch(apiEndpoint);
    const data = await response.json();
    // Build the properties from the fetched data (static here for brevity).
    const properties = {
      prop1: Property.ShortText({
        displayName: 'Property 1',
        description: 'Enter property 1',
        required: true,
      }),
      prop2: Property.Number({
        displayName: 'Property 2',
        description: 'Enter property 2',
        required: false,
      }),
    };
    return properties;
  },
});
```
# Props Validation
Learn about different types of properties validation
Validators help ensure that user input meets certain criteria or constraints. Below are some examples of validator functions:
## Validators
### Pattern
Checks if the processed value matches a specific regular expression.
```typescript
text: Property.LongText({
  displayName: 'Text',
  required: true,
  processors: [],
  validators: [Validators.pattern(/^[a-zA-Z0-9]+$/)],
})
```
### Max Length
Verifies that the processed string does not exceed a specified maximum length.
```typescript
text: Property.LongText({
displayName: 'Text',
required: true,
processors: [],
validators: [Validators.maxLength(100)],
})
```
### Min Length
Verifies that the processed string meets a specified minimum length.
```typescript
text: Property.LongText({
displayName: 'Text',
required: true,
validators: [Validators.minLength(41)],
})
```
### Min Value
Ensures that the processed numeric value is greater than or equal to a specified minimum.
```typescript
age: Property.Number({
displayName: 'Age',
required: true,
validators: [Validators.minValue(100)],
})
```
### Max Value
Ensures that the processed numeric value is less than or equal to a specified maximum.
```typescript
count: Property.Number({
displayName: 'Count',
required: true,
validators: [Validators.maxValue(7)],
})
```
### In Range
Checks if the processed numeric value falls within a specified range.
```typescript
score: Property.Number({
displayName: 'Score',
required: true,
validators: [Validators.inRange(0, 100)], // The range is inclusive (0 to 100).
})
```
### One Of
Validates whether the processed value is one of the specified values.
```typescript
fruit: Property.Text({
displayName: 'Fruit',
required: true,
validators: [Validators.oneOf(["apple", "banana", "orange"])],
})
```
### Number
Validates whether the processed value is a valid number.
```typescript
quantity: Property.Number({
displayName: 'Quantity',
required: true,
validators: [Validators.number],
})
```
### Non Zero
Validates whether the processed value is strictly NOT zero. Designed to be used with `Validators.number`
```typescript
numberToDivideBy: Property.Number({
displayName: 'Divide by',
required: true,
validators: [Validators.number, Validators.nonZero]
})
```
### Image
Verifies whether the processed value is a valid image file based on its extension.
```typescript
imageFile: Property.File({
displayName: 'Image File',
required: true,
validators: [Validators.image],
})
```
### File
Ensures that the processed value is a valid file.
```typescript
documentFile: Property.File({
displayName: 'Document File',
required: true,
validators: [Validators.file],
})
```
### Email
Validates whether the processed value is a valid email address.
```typescript
email: Property.Text({
displayName: 'Email',
required: true,
validators: [Validators.email],
})
```
### URL
Ensures that the processed value is a valid URL.
```typescript
websiteUrl: Property.Text({
displayName: 'Website URL',
required: true,
validators: [Validators.url],
})
```
### Datetime ISO
Validates whether the processed value is a valid ISO-formatted date and time.
```typescript
eventDate: Property.DateTime({
displayName: 'Event Date',
required: true,
validators: [Validators.datetimeIso],
})
```
### Custom
The custom validator allows you to define your own validation logic and error message. It can be used to perform complex validations beyond the provided built-in validators.
**Example:**
```typescript
const customValidator = {
type: ValidationInputType.STRING,
fn: (property, processedValue, userInput) => {
// Your custom validation logic here
if (validationFails) {
return "Validation Error: Your custom error message.";
}
return null;
}
};
```
# Overview
This tutorial explains three techniques for creating triggers:
* `Polling`: Periodically call endpoints to check for changes.
* `Webhooks`: Listen to user events through a single URL.
* `App Webhooks (Subscriptions)`: Use a developer app (using OAuth2) to receive all authorized user events at a single URL (Not Supported).
To create a new trigger, run the following command:
```bash
npm run cli triggers create
```
1. `Piece Folder Name`: This is the name associated with the folder where the trigger resides. It helps organize and categorize triggers within the piece.
2. `Trigger Display Name`: The name users see in the interface, conveying the trigger's purpose clearly.
3. `Trigger Description`: A brief, informative text in the UI, guiding users about the trigger's function and purpose.
4. `Trigger Technique`: Specifies the trigger type - either polling or webhook.
# Trigger Structure
```typescript
export const createNewIssue = createTrigger({
  auth: PieceAuth | undefined,
  name: string, // Unique name across the piece.
  displayName: string, // Display name in the interface.
  description: string, // Description for the trigger.
  type: POLLING | WEBHOOK,
  props: {}, // Required properties from the user.
  // Runs when the user enables or publishes the flow.
  onEnable: (ctx) => {},
  // Runs when the user disables the flow, or when the
  // old flow is deleted after a new one is published.
  onDisable: (ctx) => {},
  // Trigger implementation; takes the context as a parameter.
  // Should return an array of payloads; each payload is treated
  // as a separate flow run.
  run: async (ctx) => unknown[],
})
```
It's important to note that the `run` method returns an array: a single poll can pick up multiple events, and each item in the array triggers a separate flow run.
## Context Object
The Context object contains multiple helpful pieces of information and tools that can be useful while developing.
```typescript
// Store: A simple, lightweight key-value store that persists between runs,
// useful for information like the last polling date.
await context.store.put('_lastFetchedDate', new Date());
const lastFetchedDate = await context.store.get('_lastFetchedDate');
// Webhook URL: A unique, auto-generated URL that will trigger the flow.
// Useful when you need to develop a trigger based on webhooks.
context.webhookUrl;
// Payload: Contains information about the HTTP request sent by the third
// party. It has three properties: status, headers, and body.
context.payload;
// PropsValue: Contains the values the user filled in the defined properties.
context.propsValue;
```
**App Webhooks (Not Supported)**
Certain services, such as `Slack` and `Square`, only support webhooks at the developer app level.
This means that all authorized users for the app will be sent to the same endpoint. While this technique will be supported soon, for now, a workaround is to perform polling on the endpoint.
# Polling Trigger
Periodically call endpoints to check for changes
The way polling triggers usually work is as follows:
**On Enable:**
Store the last timestamp or most recent item id using the context store property.
**Run:**
This method runs every **5 minutes**, fetches items from the endpoint after a certain timestamp (or traverses results until it reaches the last seen item id), and returns the new items as an array.
**Testing:**
You can implement a test function which should return some of the most recent items. It's recommended to limit this to five.
**Examples:**
* [New Record Airtable](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/airtable/src/lib/trigger/new-record.trigger.ts)
* [New Updated Item Salesforce](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/salesforce/src/lib/trigger/new-updated-record.ts)
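The run step described above boils down to filtering fetched items by a stored watermark. Here is a minimal sketch of that dedup logic, assuming a hypothetical `Item` shape with an ISO `created_date` field (the helper is illustrative, not part of the pieces framework):

```typescript
// Hypothetical item shape returned by the third-party endpoint.
interface Item {
  id: number;
  created_date: string; // ISO 8601 timestamp
}

// Keep only the items created after the last poll, and compute the new
// watermark to persist in the context store for the next run.
function dedupeByTimestamp(items: Item[], lastFetchEpochMs: number) {
  const fresh = items.filter(
    (item) => Date.parse(item.created_date) > lastFetchEpochMs,
  );
  const nextEpochMs = fresh.reduce(
    (max, item) => Math.max(max, Date.parse(item.created_date)),
    lastFetchEpochMs,
  );
  return { fresh, nextEpochMs };
}
```

On each run, a trigger following this pattern reads the watermark from `context.store`, calls the endpoint, returns `fresh`, and persists `nextEpochMs` back.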
# Polling library
There are multiple strategies for implementing polling triggers, and we have created a library to help you with that.
## Strategies
**Timebased:**
This strategy fetches new items using a timestamp. You need to implement the items method, which should return the most recent items.
The library will detect new items based on the timestamp.
The polling object's generic type consists of the props value and the object type.
```typescript
const polling: Polling<{ authentication: OAuth2PropertyValue, object: string }> = {
  strategy: DedupeStrategy.TIMEBASED,
  items: async ({ propsValue, lastFetchEpochMS }) => {
    // TODO: implement the logic to fetch the items
    const items = [
      { id: 1, created_date: '2021-01-01T00:00:00Z' },
      { id: 2, created_date: '2021-01-01T00:00:00Z' },
    ];
    return items.map((item) => ({
      epochMilliSeconds: dayjs(item.created_date).valueOf(),
      data: item,
    }));
  },
};
```
**Last ID Strategy:**
This strategy fetches new items based on the last item ID. To use this strategy, you need to implement the items method, which should return the most recent items.
The library will detect new items after the last item ID.
The polling object's generic type consists of the props value and the object type.
```typescript
const polling: Polling<{ authentication: AuthProps }> = {
  strategy: DedupeStrategy.LAST_ITEM,
  items: async ({ propsValue }) => {
    // Implement the logic to fetch the items
    const items = [{ id: 1 }, { id: 2 }];
    return items.map((item) => ({
      id: item.id,
      data: item,
    }));
  },
};
```
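Conceptually, the last-item strategy cuts the fetched list at the previously seen id. Here is a sketch of that dedup step, assuming items arrive newest first (the helper is illustrative; the real logic lives inside the polling helper):

```typescript
// Illustrative last-item deduplication; the pieces framework's polling
// helper performs this for you.
function dedupeByLastItem<T extends { id: number }>(
  items: T[], // newest first, as returned by the items method
  lastItemId: number | undefined,
): T[] {
  // First poll: everything is new.
  if (lastItemId === undefined) {
    return items;
  }
  const index = items.findIndex((item) => item.id === lastItemId);
  // If the last seen id fell off the page, treat the whole page as new.
  return index === -1 ? items : items.slice(0, index);
}
```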
## Trigger Implementation
After implementing the polling object, you can use the polling helper to implement the trigger.
```typescript
export const newTicketInView = createTrigger({
  name: 'new_ticket_in_view',
  displayName: 'New ticket in view',
  description: 'Triggers when a new ticket is created in a view',
  type: TriggerStrategy.POLLING,
  props: {
    authentication: Property.SecretText({
      displayName: 'Authentication',
      description: markdownProperty,
      required: true,
    }),
  },
  sampleData: {},
  onEnable: async (context) => {
    await pollingHelper.onEnable(polling, {
      store: context.store,
      propsValue: context.propsValue,
    });
  },
  onDisable: async (context) => {
    await pollingHelper.onDisable(polling, {
      store: context.store,
      propsValue: context.propsValue,
    });
  },
  run: async (context) => {
    return await pollingHelper.poll(polling, {
      store: context.store,
      propsValue: context.propsValue,
    });
  },
  test: async (context) => {
    return await pollingHelper.test(polling, {
      store: context.store,
      propsValue: context.propsValue,
    });
  },
});
```
# Webhook Trigger
Listen to user events through a single URL
The way webhook triggers usually work is as follows:
**On Enable:**
Use `context.webhookUrl` to perform an HTTP request to register the webhook in the third-party app, and store the webhook ID in the `store`.
**On Handshake:**
Some services require a successful handshake request, usually consisting of a challenge. It works like a normal run, except that you return the correct challenge response. This step is optional; to enable the handshake, configure one of the available handshake strategies in the `handshakeConfiguration` option.
**Run:**
You can find the HTTP body inside `context.payload.body`. If needed, alter the body; otherwise, return an array with a single item `context.payload.body`.
**Disable:**
Using the `context.store`, fetch the webhook ID from the enable step and delete the webhook on the third-party app.
**Testing:**
You cannot test it with Test Flow, as that uses the static sample data provided in the piece.
To test the trigger, publish the flow and perform the event, then check the flow runs from the main dashboard.
**Examples:**
* [New Form Submission on Typeform](https://github.com/activepieces/activepieces/blob/main/packages/pieces/community/typeform/src/lib/trigger/new-submission.ts)
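The enable/disable lifecycle above can be sketched with an injected client. The `WebhookClient` interface, `KeyValueStore` shape, and `webhook_id` key below are hypothetical; the actual registration call depends on the third-party service:

```typescript
// Hypothetical third-party API client; real registration calls vary by service.
interface WebhookClient {
  register(url: string): Promise<{ id: string }>;
  unregister(id: string): Promise<void>;
}

// Minimal shape of a key-value store like context.store.
interface KeyValueStore {
  put(key: string, value: string): Promise<void>;
  get(key: string): Promise<string | null>;
}

// On enable: register the webhook URL and remember the returned id.
async function onEnable(client: WebhookClient, store: KeyValueStore, webhookUrl: string) {
  const { id } = await client.register(webhookUrl);
  await store.put('webhook_id', id);
}

// On disable: look the id up and delete the webhook in the third-party app.
async function onDisable(client: WebhookClient, store: KeyValueStore) {
  const id = await store.get('webhook_id');
  if (id) {
    await client.unregister(id);
  }
}
```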
To make your webhook accessible from the internet, you need to configure the backend URL. Follow these steps:
1. Install ngrok.
2. Run the command `ngrok http 4200`.
3. Replace the `AP_FRONTEND_URL` environment variable in `packages/server/api/.env` with the ngrok URL.
Once you have completed these configurations, you are good to go!
# Community (Public NPM)
Learn how to publish your piece to the community.
You can publish your pieces to the npm registry and share them with the community. Users can install your piece from Settings -> My Pieces -> Install Piece -> type in the name of your piece package.
Make sure you are logged in to npm. If not, please run:
```bash
npm login
```
Rename the piece name in `package.json` to something unique or related to your organization's scope (e.g., `@my-org/piece-PIECE_NAME`). You can find it at `packages/pieces/PIECE_NAME/package.json`.
Don't forget to increase the version number in `package.json` for each new release.
Replace `PIECE_FOLDER_NAME` with the name of the folder.
Run the following command:
```bash
npm run publish-piece PIECE_FOLDER_NAME
```
**Congratulations! You can now import the piece from the settings page.**
# Contribute
Learn how to contribute a piece to the main repository.
* Build and test your piece.
* Open a pull request from your repository to the main fork.
* A maintainer will review your work closely.
* Once the pull request is approved, it will be merged into the main branch.
* Your piece will be available within a few minutes.
* An automatic GitHub action will package it and create an npm package on npmjs.com.
# Overview
Learn the different ways to publish your own piece on activepieces.
## Methods
* [Contribute Back](/developers/sharing-pieces/contribute): Publish your piece by contributing it back to the main repository.
* [Community](/developers/sharing-pieces/community): Publish your piece on npm directly and share it with the community.
* [Private](/developers/sharing-pieces/private): Publish your piece on activepieces privately.
# Private
Learn how to share your pieces privately.
This guide assumes you have already created a piece in a private fork of our repository and would like to package it as a file and upload it.
Friendly Tip: There is a CLI to easily upload it to your platform. Please check out [Piece CI/CD](../misc/pieces-ci-cd).
Build the piece using the following command. Make sure to replace `${name}` with your piece name.
```bash
npx nx build pieces-${name}
```
Then pack your pieces as an npm package. Make sure to replace `${name}` with your piece name.
```bash
cd dist/packages/pieces/${name} && npm pack
```
Upload the generated tarball inside `dist/packages/pieces/${name}` from Activepieces Platform Admin -> Pieces.
![Manage Pieces](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/install-piece.png)
# Chat Completion
Learn how to use chat completion AI in actions
The following snippet shows how to use chat completion to get a response from an AI model.
```typescript
const ai = AI({ provider: context.propsValue.provider, server: context.server });

const response = await ai.chat.text({
  model: context.propsValue.model,
  messages: [
    {
      role: AIChatRole.USER,
      content: "Can you provide examples of TypeScript code formatting?",
    },
  ],
  /**
   * Controls the creativity of the AI response.
   * A higher value will make the AI more creative and a lower value will make it more deterministic.
   */
  creativity: 0.7,
  /**
   * The maximum number of tokens to generate in the completion.
   */
  maxTokens: 100,
});
```
# Function Calling
Learn how to use function calling AI in actions
### Chat-based Function Calling
The code snippet below shows how to use a function call to extract structured data directly from a text input:
```typescript
const chatResponse = await ai.chat.function({
  model: context.propsValue.model,
  messages: [
    {
      role: AIChatRole.USER,
      content: context.propsValue.text,
    },
  ],
  functions: [
    {
      name: 'extract_structured_data',
      description: 'Extract the following data from the provided text.',
      arguments: [
        { name: 'customerName', type: 'string', description: 'The customer\'s name.', isRequired: true },
        { name: 'orderId', type: 'string', description: 'Unique order identifier.', isRequired: true },
        { name: 'purchaseDate', type: 'string', description: 'Date of purchase (YYYY-MM-DD).', isRequired: false },
        { name: 'totalAmount', type: 'number', description: 'Total transaction amount in dollars.', isRequired: false },
      ],
    },
  ],
});
```
### Image-based Function Calling
To extract structured data from an image, use this function call:
```typescript
const imageResponse = await ai.image.function({
  model: context.propsValue.imageModel,
  image: context.propsValue.imageData,
  functions: [
    {
      name: 'extract_structured_data',
      description: 'Extract the following data from the image text.',
      arguments: [
        { name: 'customerName', type: 'string', description: 'The customer\'s name.', isRequired: true },
        { name: 'orderId', type: 'string', description: 'Unique order identifier.', isRequired: true },
        { name: 'purchaseDate', type: 'string', description: 'Date of purchase (YYYY-MM-DD).', isRequired: false },
        { name: 'totalAmount', type: 'number', description: 'Total transaction amount in dollars.', isRequired: false },
      ],
    },
  ],
});
```
# Image AI
Learn how to use image AI in actions
The following snippet shows how to use image generation to create an image using AI.
```typescript
const ai = AI({
  provider: context.propsValue.provider,
  server: context.server,
});

const response = await ai.image.generate({
  // The model to use for image generation
  model: context.propsValue.model,
  // The prompt to guide the image generation
  prompt: context.propsValue.prompt,
  // The resolution of the generated image
  size: "1024x1024",
  // Any advanced options for the image generation
  advancedOptions: {},
});
```
# Overview
The AI Toolkit to build AI pieces tailored for specific use cases that work with many AI providers
**What it provides:**
* 🔐 **Centralized Credentials Management**: Admin manages credentials, end users use without hassle.
* 🌐 **Support for Multiple AI Providers**: OpenAI, Anthropic, Google, LLAMA, and many open-source models.
* 💬 **Support for Various AI Capabilities**: Chat, 🖼️ Image, 🎤 Voice, and more.
![Unified AI SDK](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/unified-ai.png)
# Customize Pieces
This documentation explains how to customize access to pieces depending on projects.
You can tag pieces in bulk using the **Admin Console**.
![Bulk Tag](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/tag-pieces.png)
We need to specify the tags of pieces in the token; check how to generate the token in [provision-users](./provision-users).
You should specify the `pieces` claim like this:
```json
{
  // Other claims
  "piecesFilterType": "ALLOWED",
  "piecesTags": [ "free" ]
}
```
Each time the token is used in the frontend, it will sync all pieces with these tags to the project.
The project's pieces list will **exactly match** all pieces with these tags at the moment of using the iframe.
# Embed Builder
This documentation explains how to embed the Activepieces iframe inside your application and customize it.
## Configure SDK
Adding the embedding SDK script will initialize an object in your window called `activepieces`, which has a method called `configure` that you should call after the container has been rendered.
The SDK script tags shouldn't contain the `async` or `defer` attributes.
These steps assume you have already generated a JWT token from the backend. If not, please check the [provision-users](./provision-users) page.
`configure` returns a promise which is resolved after authentication is done.
Please check the [navigation](./navigation.mdx) section, as it's very important to understand how navigation works and how to supply an auto-sync experience.
**Configure Parameters:**
| Parameter Name | Required | Type | Description |
| ----------------------------------- | -------- | -------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| prefix                              | ❌        | string                     | Some customers have an embedding prefix. For example, if the prefix is `/automation` and the Activepieces URL is `/flows`, the full URL would be `/automation/flows`. |
| instanceUrl | ✅ | string | The url of the instance hosting Activepieces, could be [https://cloud.activepieces.com](https://cloud.activepieces.com) if you are a cloud user. |
| jwtToken | ✅ | string | The jwt token you generated to authenticate your users to Activepieces. |
| embedding.containerId | ❌ | string | The html element's id that is going to be containing Activepieces's iframe. |
| embedding.builder.disableNavigation | ❌ | boolean | Hides the folder name and back button in the builder, by default it is false. |
| embedding.builder.hideLogo | ❌ | boolean | Hides the logo in the builder's header, by default it is false. |
| embedding.builder.hideFlowName | ❌ | boolean | Hides the flow name and flow actions dropdown in the builder's header, by default it is false. |
| embedding.dashboard.hideSidebar | ❌ | boolean | Controls the visibility of the sidebar in the dashboard, by default it is false. |
| embedding.hideFolders               | ❌        | boolean                    | Hides all things related to folders in both the flows table and builder, by default it is false. |
| navigation.handler | ❌ | `({route:string}) => void` | If defined the callback will be triggered each time a route in Activepieces changes, you can read more about it [here](/embedding/navigation) |
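The parameters above can be collected into a typed object before calling `configure`. The interface below is illustrative only, mirroring the table rather than the SDK's actual exported types:

```typescript
// Illustrative types derived from the parameter table; the SDK's real
// type names may differ.
interface ConfigureParams {
  prefix?: string;
  instanceUrl: string;
  jwtToken: string;
  embedding?: {
    containerId?: string;
    builder?: {
      disableNavigation?: boolean;
      hideLogo?: boolean;
      hideFlowName?: boolean;
    };
    dashboard?: { hideSidebar?: boolean };
    hideFolders?: boolean;
    navigation?: { handler?: (event: { route: string }) => void };
  };
}

// Example configuration: embed into #container and hide the dashboard sidebar.
const params: ConfigureParams = {
  instanceUrl: 'https://cloud.activepieces.com',
  jwtToken: 'GENERATED_JWT_TOKEN',
  embedding: {
    containerId: 'container',
    dashboard: { hideSidebar: true },
  },
};
```

In the browser, you would then call `window.activepieces.configure(params)` and await the returned promise before interacting with the iframe.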
# Create Connections
**Requirements:**
* Activepieces version 0.34.5 or higher
* SDK version 0.3.2 or higher
You can use the embedded SDK to create connections.
Follow the instructions in the [Embed Builder](./embed-builder).
After initializing the SDK, you will have access to a property called `activepieces` inside your `window` object. Call its `connect` method to open a new connection dialog.
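Here is a hypothetical sketch of driving `connect` and handling its result shape (the piece name and wrapper function are examples, not part of the SDK):

```typescript
// The result shape the connect promise resolves to, per the docs.
interface ConnectResult {
  connection?: { id: string; name: string };
}

// Hypothetical wrapper: open the dialog for a piece and return the created
// connection's name, or null if the user closed the dialog.
async function createSlackConnection(
  connect: (req: { pieceName: string; connectionName?: string }) => Promise<ConnectResult>,
): Promise<string | null> {
  const result = await connect({ pieceName: '@activepieces/piece-slack' });
  // connection is undefined when the user closes the dialog.
  return result.connection?.name ?? null;
}
```

In the browser, you would pass `window.activepieces.connect` as the `connect` argument.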
**Connect Parameters:**
| Parameter Name | Required | Type | Description |
| -------------- | -------- | ------ | ---------------------------------------------------------- |
| pieceName | ✅ | string | The name of the piece you want to create a connection for. |
| connectionName | ❌ | string | The name of the connection |
**Connect Result**
The `connect` method returns a `promise` that resolves to the following:
```ts
{
  connection?: {
    id: string,
    name: string
  }
}
```
`connection` is undefined if the user closes the dialog without creating a connection.
If the `connectionName` parameter is defined, the promise could instead get rejected with an error message telling you why `connectionName` is invalid:
```ts
{
  error: string
}
```
When `connectionName` is defined, the connection name input in the dialog will be disabled.
You can use the `connections` piece in the builder to retrieve the created connection using its name.
![Connections in Builder](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/connections-piece.png)
![Connections in Builder](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/connections-piece-usage.png)
# Navigation
By default, navigating within your embedded instance of Activepieces doesn't affect the client's browser history or viewed URL. Activepieces only provides a **handler** that triggers on every route change in the **iframe**.
## Automatically Sync URL
You can use the following snippet when configuring the SDK, which will implement a handler that syncs the Activepieces iframe with your browser:
The snippet also listens for the `popstate` event (the user navigating back) and syncs the route back to the iframe using `activepieces.navigate`, while the handler updates the browser's URL on every route change.
```js
activepieces.configure({
  prefix: "/",
  instanceUrl: 'INSTANCE_URL',
  jwtToken: "GENERATED_JWT_TOKEN",
  embedding: {
    containerId: "container",
    builder: {
      disableNavigation: false,
      hideLogo: false,
      hideFlowName: false
    },
    dashboard: {
      hideSidebar: false
    },
    hideFolders: false,
    navigation: {
      handler: ({ route }) => {
        if (!window.location.href.endsWith(route)) {
          window.history.pushState({}, "", window.location.origin + route);
        }
      }
    }
  },
});

window.addEventListener("popstate", () => {
  const route = activepieces.extractActivepiecesRouteFromUrl({ vendorUrl: window.location.href });
  activepieces.navigate({ route });
});
```
## Navigate Method
Calling `activepieces.navigate({ route: '/flows' })` tells the embedded SDK where to navigate.
Here is the list of routes the SDK can navigate to:
| Route | Description |
| ----------------- | ------------------------------ |
| `/flows` | Flows table |
| `/flows/{flowId}` | Opens up a flow in the builder |
| `/runs` | Runs table |
| `/runs/{runId}` | Opens up a run in the builder |
| `/connections` | Connections table |
# Overview
Understanding how embedding works
This section provides an overview of how to embed the Activepieces builder in your application and automatically provision the user.
The embedding process involves the following steps:
1. Generate a JSON Web Token (JWT) to identify your customer and pass it to the frontend.
2. Use the Activepieces SDK and the JWT to embed the Activepieces builder as an iframe, and customize it using the SDK.
In case you need to gather connections in a custom place in your application, you can do this with the SDK. Find more info [here](./embed-connections.mdx).
# Provision Users
Automatically authenticate your SaaS users to your Activepieces instance
## Overview
In Activepieces, there are **Projects** and **Users**. Each project is provisioned to correspond to a workspace, project, or team in your SaaS, and each user is mapped to a respective user in Activepieces.
To achieve this, the backend will generate a signed token that contains all the necessary information to automatically create a user and project. If the user or project already exists, it will skip the creation and log in the user directly.
You can generate a signing key by going to **Platform Settings -> Signing Keys -> Generate Signing Key**.
This will generate a public and private key pair. The public key will be used by Activepieces to verify the signature of the JWT tokens you send. The private key will be used by you to sign the JWT tokens.
Please store your private key in a safe place, as it will not be stored in Activepieces.
The signing key will be used to generate JWT tokens for the currently logged-in user on your website. The token is then sent to the Activepieces iframe as a query parameter to authenticate the user and exchange it for a longer-lived token.
To generate these tokens, you will need to add code in your backend to generate the token using the RS256 algorithm, so the JWT header would look like the following. To obtain the `SIGNING_KEY_ID`, refer to the signing key table and locate the value in the first column.
```json
{
  "alg": "RS256",
  "typ": "JWT",
  "kid": "SIGNING_KEY_ID"
}
```
The signed tokens must include these claims in the payload:
```json
{
  "version": "v3",
  "externalUserId": "user_id",
  "externalProjectId": "user_project_id",
  "firstName": "John",
  "lastName": "Doe",
  "email": "john@example.com",
  "role": "EDITOR",
  "piecesFilterType": "NONE",
  "exp": 1856563200
}
```
| Claim | Description |
| ----------------- | ----------------------------------------------------------------------------------- |
| externalUserId | Unique identification of the user in **your** software |
| externalProjectId | Unique identification of the user's project in **your** software |
| firstName | First name of the user |
| lastName | Last name of the user |
| email | Email address of the user |
| role | Role of the user in the Activepieces project (e.g., **EDITOR**, **VIEWER**) |
| exp | Expiry timestamp for the token (Unix timestamp) |
| piecesFilterType | Customize the project pieces, check [customize pieces](/embedding/customize-pieces) |
| piecesTags | Customize the project pieces, check [customize pieces](/embedding/customize-pieces) |
| tasks | Customize the task limit, check the section below |
You can use any JWT library to generate the token. Here is an example using the jsonwebtoken library in Node.js:
**Friendly Tip #1**: You can also use this [tool](https://dinochiesa.github.io/jwt/) to generate a quick example.
**Friendly Tip #2**: Make sure the expiry time is very short, as it's a temporary token and will be exchanged for a longer-lived token.
```javascript Node.js
const jwt = require('jsonwebtoken');

// JWT NumericDates specified in seconds:
const currentTime = Math.floor(Date.now() / 1000);
let token = jwt.sign(
  {
    version: "v3",
    externalUserId: "user_id",
    externalProjectId: "user_project_id",
    firstName: "John",
    lastName: "Doe",
    role: "EDITOR",
    email: "john@example.com",
    piecesFilterType: "NONE",
    exp: currentTime + (5 * 60), // 5 minutes from now
  },
  process.env.ACTIVEPIECES_SIGNING_KEY,
  {
    algorithm: "RS256",
    header: {
      kid: signingKeyID, // Include the "kid" in the header
    },
  }
);
```
Once you have generated the token, please check the embedding docs to know how to embed the token in the iframe.
# SDK Changelog
A log of all notable changes to Activepieces SDK
### 12/04/2024 (3.0)
**Breaking Change**: Automatic URL sync has been removed. Instead, Activepieces now provides a callback handler method. Please read [Embedding Navigation](./navigation) for more information.
* feat(embed-sdk): add custom navigation handler ([#4500](https://github.com/activepieces/activepieces/pull/4500))
* feat(embed-sdk): allow passing a predefined name for connection in connect method ([#4485](https://github.com/activepieces/activepieces/pull/4485))
* docs(embed-sdk): add changelog ([#4503](https://github.com/activepieces/activepieces/pull/4503))
# Delete Connection
DELETE /v1/app-connections/{id}
Delete an app connection
# List Connections
GET /v1/app-connections/
# Connection Schema
# Upsert Connection
POST /v1/app-connections
Upsert an app connection based on the app name
# Get Flow Run
GET /v1/flow-runs/{id}
Get Flow Run
# List Flow Runs
GET /v1/flow-runs
List Flow Runs
# Flow Run Schema
# Create Flow Template
POST /v1/flow-templates
Create a flow template
# Delete Flow Template
DELETE /v1/flow-templates/{id}
Delete a flow template
# Get Flow Template
GET /v1/flow-templates/{id}
Get a flow template
# List Flow Templates
GET /v1/flow-templates
List flow templates
# Flow Template Schema
# Create Flow
POST /v1/flows
Create a flow
# Delete Flow
DELETE /v1/flows/{id}
Delete a flow
# Get Flow
GET /v1/flows/{id}
Get a flow by id
# List Flows
GET /v1/flows
List flows
# Flow Schema
# Apply Flow Operation
POST /v1/flows/{id}
Apply an operation to a flow
# Create Folder
POST /v1/folders
Create a new folder
# Delete Folder
DELETE /v1/folders/{id}
Delete a folder
# Get Folder
GET /v1/folders/{id}
Get a folder by id
# List Folders
GET /v1/folders
List folders
# Folder Schema
# Update Folder
POST /v1/folders/{id}
Update an existing folder
# Pull
POST /v1/git-repos/pull
Pull all changes from the git repository and overwrite any conflicting changes in the project.
# Git Repos Schema
# Overview
At the moment, API keys are generated from the Platform Dashboard to manage multiple projects, which is only available in the Platform and Enterprise editions.
Please contact [sales@activepieces.com](mailto:sales@activepieces.com) for more information.
### Authentication:
The API uses "API keys" to authenticate requests. You can view and manage your API keys from the Platform Dashboard. After creating the API keys, you can pass the API key as a Bearer token in the header.
Example:
`Authorization: Bearer {API_KEY}`
### Pagination
All endpoints use seek pagination. To paginate through the results, provide `limit` and `cursor` as query parameters.
The API response will have the following structure:
```json
{
  "data": [],
  "next": "string",
  "previous": "string"
}
```
* **`data`**: Holds the requested results or data.
* **`next`**: Provides a starting cursor for the next set of results, if available.
* **`previous`**: Provides a starting cursor for the previous set of results, if applicable.
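Walking every page by following the `next` cursor can be sketched like this. The `fetchPage` callback stands in for an HTTP call carrying the `limit` and `cursor` query parameters, and treating an absent cursor as `null` at the end is an assumption:

```typescript
// Shape of a paginated API response as described above.
interface Page<T> {
  data: T[];
  next: string | null;
  previous: string | null;
}

// Follow the `next` cursor page by page until it is exhausted,
// collecting every item along the way.
async function fetchAll<T>(
  fetchPage: (cursor: string | null) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | null = null;
  do {
    const page = await fetchPage(cursor);
    all.push(...page.data);
    cursor = page.next;
  } while (cursor);
  return all;
}
```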
# Install Piece
POST /v1/pieces
Add a piece to a platform
# Piece Schema
# Delete Project Member
DELETE /v1/project-members/{id}
# List Project Member
GET /v1/project-members
# Project Member Schema
# Create Project
POST /v1/projects
# List Projects
GET /v1/projects
# Project Schema
# Update Project
POST /v1/projects/{id}
# Delete User Invitation
DELETE /v1/user-invitations/{id}
# List User Invitations
GET /v1/user-invitations
# User Invitation Schema
# Send User Invitation (Upsert)
POST /v1/user-invitations
Send a user invitation to a user. If the user already has an invitation, the invitation will be updated.
# Building Flows
A flow consists of two parts: a trigger and actions
## Trigger
The flow's starting point determines its frequency of execution. There are various types of triggers available, such as a Schedule Trigger, a Webhook Trigger, or an Event Trigger based on a specific service.
## Action
Actions come after the trigger and control what occurs when the flow is activated, like running code or communicating with other services.
In a real-life scenario:
![Flow Parts](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/flow-parts.png)
# Debugging Runs
Ensuring your business automations are running properly
You can monitor each run that results from an enabled flow:
1. Go to the Dashboard, click on **Runs**.
2. Find the run that you're looking for, and click on it.
3. You will see the builder in a view-only mode, each step will show a ✅ or a ❌ to indicate its execution status.
4. Click on any of these steps, you will see the **input** and **output** in the **Run Details** panel.
The debugging experience looks like this:
![Debugging Business Automations](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/using-activepieces-debugging.png)
# Technical limits
Technical limits for Activepieces execution
### Overview
These limits apply to the **Activepieces Cloud** and can be configured via environment variables for self-hosted instances.
### Flow Limits
* **Execution Time**: Each flow has a maximum execution time of **600 seconds (10 minutes)**. Flows exceeding this limit will be marked as a timeout.
* **Memory Usage**: During execution, a flow should not use more than **128 MB of RAM**.
**Friendly Tip #1:** Flow runs in a paused state, such as Wait for Approval or Delay, do not count toward the 600 seconds.
**Friendly Tip #2:** The execution time limit can be worked around by splitting the flows into multiple ones, such as by having one flow call another flow using a webhook, or by having each flow process a small batch of items.
### File Storage Limits
The files from actions or triggers are stored in the database / S3 to support retries from certain steps.
* **Maximum File Size**: 10 MB
### Data Storage Limits
Some pieces utilize the built-in Activepieces key store, such as the Store Piece and Queue Piece.
The storage limits are as follows:
* **Maximum Key Length**: 128 characters
* **Maximum Value Size**: 512 KB
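A hypothetical pre-flight check against these limits before calling `context.store.put` (the platform enforces the limits itself; the helper and constants are illustrative, and size is approximated by serialized length rather than exact encoded bytes):

```typescript
// Documented storage limits.
const MAX_KEY_LENGTH = 128;         // characters
const MAX_VALUE_BYTES = 512 * 1024; // 512 KB

// Return an error description if the entry would exceed a limit,
// or null when it fits.
function validateStoreEntry(key: string, value: unknown): string | null {
  if (key.length > MAX_KEY_LENGTH) {
    return `key exceeds ${MAX_KEY_LENGTH} characters`;
  }
  if (JSON.stringify(value).length > MAX_VALUE_BYTES) {
    return `value exceeds ${MAX_VALUE_BYTES} bytes`;
  }
  return null;
}
```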
# Passing Data
Using data from previous steps in the current one
## Data flow
Any Activepieces flow is a vertical diagram that **starts with a trigger step** followed by **any number of action steps**.
Steps are connected vertically. Data flows from parent steps to the children. Children steps have access to the output data of the parent steps.
## Example Steps
This flow has 3 steps, they can access data as follows:
* **Step 1** is the main data producer to be used in the next steps. Data produced by Step 1 will be accessible in Steps 2 and 3. Some triggers don't produce data though, like Schedules.
* **Step 2** can access data produced by Step 1. After execution, this step will also produce data to be used in the next step(s).
* **Step 3** can access data produced by Steps 1 and 2 as they're its parent steps. This step can produce data but since it's the last step in the flow, it can't be used by other ones.
## Data to Insert Panel
To use data from a previous step in your current step, place your cursor in any input and the **Data to Insert** panel will pop up.
This panel shows the accessible steps and their data. You can expand the data items to view their content, and you can click the items to insert them in your current settings input.
If an item in this panel has a caret (⌄) to the right, it means you can click on the item to expand its child properties. You can select the parent item or its properties as you need.
When you insert data from this panel, it gets inserted at the cursor's position in the input. This means you can combine static text and dynamic data in any field.
We generally recommend that you expand the items before inserting them to understand the type of data they contain and whether they're the right fit for the input you're filling.
## Testing Steps to Generate Data
We require you to test steps before accessing their data. This approach protects you from selecting the wrong data and breaking your flows after publishing them.
If a step is not tested and you try to access its data, you will see the following message:
To fix this, go to the step and use the Generate Sample Data panel to test it. Steps use different approaches for testing. These are the common ones:
* **Load Data:** Some triggers will let you load data from your connected account without having to perform any action in that account.
* **Test Trigger:** Some triggers will require you to head to your connected account and fire the trigger in order to generate sample data.
* **Send Data:** Webhooks require you to send a sample request to the webhook URL to generate sample data.
* **Test Action:** Action steps will let you run the action in order to generate sample data.
Follow the instructions in the Generate Sample Data panel to know how your step should be tested. Some triggers will also let you Use Mock Data, which will generate static sample data from the piece. We recommend that you test the step instead of using mock data.
This is an example for generating sample data for a trigger using the **Load Data** button:
## Advanced Tips
### Switching to Dynamic Values
Dropdowns and some other input types don't let you select data from previous steps. If you'd like to bypass this and use data from previous steps instead, switch the input into a dynamic one using this button:
### Accessing data by path
If you can't find the data you're looking for in the **Data to Insert** panel but you'd like to use it, you can write a JSON path instead.
Use the following syntax to write JSON paths:
`{{step_slug.path.to.property}}`
The `step_slug` can be found by moving your cursor over any of your flow steps, it will show to the right of the step.
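Conceptually, a path like `{{step_slug.path.to.property}}` resolves by walking the step's output object one segment at a time. Here is a hypothetical sketch of that resolution (the builder does this for you; the function and sample data are illustrative):

```typescript
// Resolve a dot-separated path like "trigger.body.user.name" against
// a map of step outputs, returning undefined for missing segments.
function resolvePath(
  outputs: Record<string, unknown>,
  path: string,
): unknown {
  return path.split('.').reduce<unknown>((current, segment) => {
    if (current !== null && typeof current === 'object') {
      return (current as Record<string, unknown>)[segment];
    }
    return undefined;
  }, outputs);
}
```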
# Publishing Flows
Make your flow work by publishing your updates
The changes you make won't take effect right away, to avoid disrupting the flow that's already published. To apply your changes, simply click the Publish button once you're done.
![Flow Parts](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/publish-flow.png)
# Version History
Learn how flow versioning works in Activepieces
Activepieces keeps track of all published flows and their versions. Here’s how it works:
1. You can edit a flow as many times as you want in **draft** mode.
2. Once you're done with your changes, you can publish it.
3. The published flow will be **immutable** and cannot be edited.
4. If you try to edit a published flow, Activepieces will create a new **draft** (if there is none) and copy the **published** version into it.
This means you can always go back to a previous version and edit the flow in draft mode without affecting the published version.
![Flow History](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/flow-history.png)
As you can see in the screenshot above, the yellow dot refers to DRAFT and the green dot refers to PUBLISHED.
# 🥳 Welcome to Activepieces
Your friendliest open source all-in-one automation tool, designed to be extensible.
Learn how to work with Activepieces
Browse available pieces
Learn how to install Activepieces
How to Build Pieces and Contribute
# 🔥 Why Activepieces is Different:
* **💖 Loved by Everyone**: Intuitive interface and great experience for both technical and non-technical users with a quick learning curve.
![](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/templates.gif)
* **🌐 Open Ecosystem:** All pieces are open source and available on npmjs.com, **60% of the pieces are contributed by the community**.
* **🛠️ Pieces are written in TypeScript**: Pieces are npm packages in TypeScript, offering full customization with the best developer experience, including **hot reloading** for **local** piece development on your machine. 😎
![](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/create-action.png)
* **🤖 AI-Ready**: Native AI pieces let you experiment with various providers, or create your own agents using our AI SDK, and there is Copilot to help you build flows inside the builder.
* **🏢 Enterprise-Ready**: Developers set up the tools, and anyone in the organization can use the no-code builder. Full customization from branding to control.
* **🔒 Secure by Design**: Self-hosted and network-gapped for maximum security and control over your data.
* **🧠 Human in the Loop**: Delay execution for a period of time or require approval. These are just pieces built on top of the piece framework, and you can build many pieces like that. 🎨
* **💻 Human Input Interfaces**: Built-in support for human input triggers like "Chat Interface" 💬 and "Form Interface" 📝
# Product Principles
## 🌟 Keep It Simple
* Design the product to be accessible for everyone, regardless of their background and technical expertise.
* The code is in a monorepository under one service, making it easy to develop, maintain, and scale.
* Keep the technology stack simple to achieve massive adoption.
* Keep the software unopinionated and unlock niche use cases by making it extensible through pieces.
## 🧩 Keep It Extensible
* The automation pieces framework has minimal abstraction and allows you to extend it for any use case.
* All contributions are welcome. The core is open source, and commercial code is available.
# Customer Support
Talking to customers might seem like a daunting task if you're not used to it. But guess what? Activepieces customers and community are some of the coolest and nicest people you'll ever chat with. Plus, you'll get to work with some really smart folks!
Don't believe me? Just check out our pull requests and see how many customers have contributed to the codebase. It's pretty awesome!
## How to talk to customers?
* Chat with them like you're talking to a friend. No need to sound like a robot. For example:
* ✅ "Hey there! How can I help you today?"
* ❌ "Greetings. How may I assist you with your inquiry?"
* ✅ "No worries, we'll get this sorted out together!"
* ❌ "Please hold while I process your request."
* Reply quickly! People love fast responses. Even if you don't know the answer right away, let them know you'll get back to them with the info. This is the fastest way to make customers happy; everyone likes to be heard.
* Explain the issue clearly, be honest, and don't be defensive. We're all about open source and transparency here – it's part of our culture. For example:
* ✅ "I'm sorry, I forgot to follow up on this. Let's get it sorted out now."
* ❌ "I apologize for the delay, there were unforeseen circumstances."
## How to handle bugs and feature requests?
### Case 1: Quick, Easy, or Urgent Issues
* Understand the issue and how urgent it is.
* Fix it yourself and open a PR right away. It leaves a great impression!
### Case 2: Issues Needing More Time
* Always create a GitHub issue and send it to the customer.
* Assess the issue and determine its urgency.
* Leave a comment on the GitHub issue with an estimated completion time.
### Case 3: Feature Requests
* Always create a GitHub issue for the feature request, and send it to the customer.
* Evaluate the request and how it aligns with our product vision.
* Add it to our roadmap and discuss it with the team.
### Case 4: Reassigning Issues
* If you're not the right person to answer the question, reassign the issue to the right person.
# How We Work?
Activepieces works in one-week sprints. Priorities change fast, so sprints have to be short enough to adapt.
## Sprints
Sprints are shared publicly on our GitHub account, giving everyone visibility into what we are working on.
* There should be a GitHub issue for the sprint set up in advance that outlines the changes.
* Each *individual* should come prepared with specific suggestions for what they will work on over the next sprint. **If you're in an engineering role, no one will dictate what to build – it is up to you to drive this.**
* Teams generally meet once a week to pick the **priorities** together.
* Everyone in the team should attend the sprint planning.
* Anyone can comment on the sprint issue before or after the sprint.
## Pull Requests
When it comes to code review, we have a few guidelines to ensure efficiency:
* Create a pull request in draft state as soon as possible.
* Be proactive and review other people’s pull requests. Don’t wait for someone to ask for your review; it’s your responsibility.
* Assign only one reviewer to your pull request.
* **It is the responsibility of the PR owner to draft the test scenarios within the PR description. Upon review, the reviewer may assume that these scenarios have been tested and provide additional suggestions for scenarios.**
* **Large, incomplete features should be broken down into smaller tasks and continuously merged into the main branch.**
## Planning is everyone's job.
Every engineer is responsible for discovering bugs/opportunities and bringing them up in the sprint to convert them into actionable tasks.
# On-Call
The on-call rotation is a simple strategy to ensure there is always someone available to fix issues for users. Each engineer is responsible for one week, and the rotation is handled by the team.
## Why On-Call?
We need to ensure there is **exactly one person** at the same time who is the main point of contact for the users and the **first responder** for the issues. It's also a great way to learn about the product and the users and have some fun.
You can listen to [Queen - Under Pressure](https://www.youtube.com/watch?v=a01QQZyl-_I) while on-call, it's fun and motivating.
If you ever feel burned out in the middle of your rotation, please reach out to the team and we will help with the rotation or take over the responsibility.
## When you are on-call
The primary objective of being on-call is to triage issues and assist users. It is not about fixing the issues or coding missing features. Delegation is key whenever possible.
You are responsible for the following:
* Respond to Slack messages as soon as possible, referring to the [customer support guidelines](./customer-support.mdx).
* Check [community.activepieces.com](https://community.activepieces.com) for any new issues or to learn about existing issues.
**Friendly Tip #1**: always escalate to the team if you are unsure what to do.
## On-Call Schedule
| Week | Engineer |
| --------------------------------- | ---------------------------------------------------------------- |
| 25th September - 1st October 2024 | [@abuaboud](https://github.com/abuaboud) |
| 2nd - 9th October 2024 | [@abuaboud](https://github.com/abuaboud) |
| 9th - 13th October 2024 | [@AbdulTheActivePiecer](https://github.com/AbdulTheActivePiecer) |
| 13th - 20th October 2024 | [@hazemadelkhalel](https://github.com/hazemadelkhalel) |
| 20th - 27th October 2024 | [@anasbarg](https://github.com/anasbarg) |
| 27th October - 3rd November 2024 | [@abuaboud](https://github.com/abuaboud) |
# Pre-Releases
Pre-releases are versions of the software that are released before the final version. They are used to test new features and bug fixes before they are released to the public. Pre-releases are typically labeled with a version number that includes a pre-release identifier, such as `official` or `rc`.
## Types of Releases
There are several types of releases that can be used to indicate the stability of the software:
* **Official**: Official releases are considered to be stable and are close to the final release.
* **Release Candidate (RC)**: Release candidates are feature-complete versions that have been tested by a larger group of users. They are typically used for final testing before the final release.
## Why Use Pre-Releases
We do a pre-release when we ship hotfixes, bug fixes, or small and beta features.
## How to Release a Pre-Release
To release a pre-release version of the software, follow these steps:
1. **Create a new branch**: Create a new branch from the `main` branch. The branch name should be `release/vX.Y.Z` where `X.Y.Z` is the version number.
2. **Increase the version number**: Update the `package.json` file with the new version number.
3. **Open a Pull Request**: Open a pull request from the new branch to the `main` branch. Assign the `pre-release` label to the pull request.
4. **Check the Changelog**: Check the [Activepieces Releases](https://github.com/activepieces/activepieces/releases) page to see if there are any new features or bug fixes that need to be included in the pre-release. Make sure all PRs are labeled correctly so they show in the correct auto-generated changelog. If not, assign the labels and rerun the changelog by removing the "pre-release" label and adding it again to the PR.
5. Go to [https://github.com/activepieces/activepieces/actions/workflows/release-rc.yml](https://github.com/activepieces/activepieces/actions/workflows/release-rc.yml) and run it on the release branch to build the rc image.
6. **Merge the Pull Request**: Merge the pull request to the `main` branch.
7. **Publish the Release Notes**: Publish the release notes for the new version.
# Our Compensation
The packages include three factors for the salary:
* **Role**: The specific position and responsibilities of the employee.
* **Location**: The geographical area where the employee is based.
* **Level**: The seniority and experience level of the employee.
Salaries are fixed and based on levels and seniority, not negotiation. This ensures fair pay for everyone.
Salaries are updated based on market trends and the company's performance. It's easier to justify raises when the business is great.
# Our Hiring Process
Engineers are the majority of the Activepieces team, and we are always looking for highly talented product engineers.
Here, you'll face a real challenge from Activepieces. We'll guide you through it to see how you solve problems.
We'll chat about your past experiences and how you design products. It's like having a friendly conversation where we reflect on what you've done before.
You'll do a paid task for a short time (1-2 days). These tasks help us understand how well we work together.
## Interviewing Tips
Every interview should make us say **HELL YES**. If not, we'll kindly pass.
**Avoid Bias:** Get opinions from others to make fair decisions.
**Speak Up Early:** If you're unsure about something, ask or test it right away.
# Our Roles & Levels
**Product Engineers** are full stack engineers who handle both the engineering and product side, delivering features end-to-end.
### Our Levels
We break out seniority into three levels, **L1 to L3**.
### L1 Product Engineers
They tend to be early-career.
* They get more management support than folks at other levels.
* They focus on continuously absorbing new information about our users and how to be effective at **Activepieces**.
* They aim to be increasingly autonomous as they gain more experience here.
### L2 Product Engineers
They are generally responsible for running a project start-to-finish.
* They independently decide on the implementation details.
* They work with **Stakeholders** / **teammates** / **L3s** on the plan.
* They have personal responsibility for the **“how”** of what they’re working on, but share responsibility for the **“what”** and **“why”**.
* They make consistent progress on their work by continuously defining the scope, incorporating feedback, trying different approaches and solutions, and deciding what will deliver the most value for users.
### L3 Product Engineers
Their scope is bigger than coding: they lead a product area, make key product decisions, and guide the team with strong leadership skills.
* **Planning**: They help **L2s** figure out the next priorities to focus on and guide **L1s** in determining the right sequence of work to get a project done.
* **Day-to-Day Work**: They might be hands-on with the day-to-day work of the team, providing support and resources to their teammates as needed.
* **Customer Communication**: They handle direct communication with customers regarding planning and product direction, ensuring that customer needs and feedback are incorporated into the development process.
### How to Level Up
There is no formal process, but it happens at the end of **each year** and is based on two things:
1. **Manager Review**: Managers look at how well the engineer has performed and grown over the year.
2. **Peer Review**: Colleagues give feedback on how well the engineer has worked with the team.
This helps make sure promotions are fair and based on merit.
# Our Team Structure
We are big believers in small teams of 10x engineers that outperform other team setups.
## No product management by default
Engineers decide what to build. If you need help, feel free to reach out to the team for other opinions or help.
## No Process by default
We trust the engineers' judgment to make the call whether this code is risky and requires external approval or if it's a fix that can be easily reversed or fixed with no big impact on the end user.
## They Love Users
When engineers love the users, they ship fast and don't over-engineer, because they understand the requirements very well. They usually have empathy, which means they don't complicate things for everyone else.
## Pragmatic & Speed
Engineering planning sometimes seems sexy from a technical perspective, but being pragmatic means making decisions in a timely manner, taking baby steps, and iterating fast rather than planning for the long run. It's easy to reverse wrong decisions early on without investing too much time.
## Starts With Hiring
We hire very **slowly**. We are always looking for highly talented engineers. We love to hire people with a broader skill set and flexibility, low egos, and who are builders at heart.
We found that working with strong engineers is one of the strongest reasons to retain employees, and this would allow everyone to be free and have less process.
# Activepieces Handbook
Welcome to the Activepieces Handbook!
This guide serves as a complete resource for understanding our organization. Inside, you'll find detailed sections covering various aspects of our internal processes and policies.
# Engine
The Engine file contains the following types of operations:
* **Extract Piece Metadata**: Extracts metadata when installing new pieces.
* **Execute Step**: Executes a single test step.
* **Execute Flow**: Executes a flow.
* **Execute Property**: Executes dynamic dropdowns or dynamic properties.
* **Execute Trigger Hook**: Executes actions such as OnEnable, OnDisable, or extracting payloads.
* **Execute Auth Validation**: Validates the authentication of the connection.
The engine takes the flow JSON with an engine token scoped to this project and implements the API provided for the piece framework, such as:
* Storage Service: A simple key/value persistent store for the piece framework.
* File Service: A helper to store files either locally or in a database, such as for testing steps.
* Fetch Metadata: Retrieves metadata of the current running project.
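As an illustration of what these services look like to piece code, here is a minimal sketch; the interface name and signatures below are hypothetical, not the real piece-framework API:

```typescript
// Hypothetical shape of the key/value store the engine provides to pieces.
// (Illustrative only; the real piece-framework API may differ.)
interface StorageService {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

// A minimal in-memory implementation, handy for unit-testing piece logic
// without a running engine.
class InMemoryStorage implements StorageService {
  private readonly data = new Map<string, string>();

  async get(key: string): Promise<string | null> {
    return this.data.get(key) ?? null;
  }

  async put(key: string, value: string): Promise<void> {
    this.data.set(key, value);
  }
}

// A polling trigger might persist its last-seen cursor between runs:
async function demo(): Promise<string | null> {
  const storage = new InMemoryStorage();
  await storage.put("lastPollTimestamp", "2024-01-01T00:00:00Z");
  return storage.get("lastPollTimestamp");
}
```

Swapping the in-memory map for a database- or S3-backed store is what makes the same piece code work unchanged across deployments.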
# Overview
This page describes the main components of Activepieces, focusing mainly on workflow execution.
## Components
![Architecture](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/architecture.png)
**Activepieces:**
* **App**: The main application that organizes everything from APIs to scheduled jobs.
* **Worker**: Polls for new jobs and executes the flows with the engine, ensuring proper sandboxing, and sends results back to the app through the API.
* **Engine**: TypeScript code that parses flow JSON and executes it. It is compiled into a single JS file.
* **UI**: Frontend written in Angular.
**Third Party**:
* **Postgres**: The main database for Activepieces.
* **Redis**: This is used to power the queue using [BullMQ](https://docs.bullmq.io/).
## Reliability & Scalability
Postgres and Redis availability is outside the scope of this documentation, as many cloud providers already implement best practices to ensure their availability.
* **Webhooks**:\
All webhooks are sent to the Activepieces app, which performs basic validation and adds them to the queue. In case of a spike, webhooks simply accumulate in the queue until workers catch up.
* **Polling Trigger**:\
All recurring jobs are added to Redis. In case of a failure, the missed jobs will be executed again.
* **Flow Execution**:\
Workers poll jobs from the queue. In the event of a spike, the flow execution will still work but may be delayed depending on the size of the spike.
To scale Activepieces, you typically need to increase the replicas of either workers, the app, or the Postgres database. A small Redis instance is sufficient as it can handle thousands of jobs per second and rarely acts as a bottleneck.
## Repository Structure
The repository is structured as a monorepo using the NX build system, with TypeScript as the primary language. It is divided into several packages:
```
.
├── packages
│   ├── react-ui
│   ├── server
│   │   ├── api
│   │   ├── worker
│   │   └── shared
│   ├── ee
│   ├── engine
│   ├── pieces
│   └── shared
```
* `react-ui`: This package contains the user interface, implemented using the React framework.
* `server-api`: This package contains the main application written in TypeScript with the Fastify framework.
* `server-worker`: This package contains the logic of accepting flow jobs and executing them using the engine.
* `server-shared`: This package contains the logic shared between the worker and the app.
* `engine`: This package contains the logic for flow execution within the sandbox.
* `pieces`: This package contains the implementation of triggers and actions for third-party apps.
* `shared`: This package contains shared data models and helper functions used by the other packages.
* `ee`: This package contains features that are only available in the paid edition.
# Stack & Tools
## Language
Activepieces uses **TypeScript** as its one and only language.
Unifying the language makes it possible to break data models and features into packages that can be shared across components (worker / frontend / backend).
It also means learning fewer tools and perfecting them across all packages.
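For example, a data model can be declared once and imported by both the backend and the frontend. The type below is a hypothetical sketch of the pattern; the real shared models live in the `shared` package:

```typescript
// Hypothetical shared model; illustrative of the pattern, not an actual
// Activepieces type. In the real repo such types live in packages/shared.
export type FlowStatus = "ENABLED" | "DISABLED";

export interface FlowSummary {
  id: string;
  displayName: string;
  status: FlowStatus;
}

// Both server and UI code can construct and type-check the same shape:
const example: FlowSummary = {
  id: "flow_123",
  displayName: "Daily report",
  status: "ENABLED",
};

console.log(example.displayName);
```

Because the compiler checks both sides against the same definition, API payload drift between server and UI is caught at build time rather than at runtime.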
## Frontend
* Web framework/library: [React](https://reactjs.org/)
* Layout/components: [shadcn](https://shadcn.com/) / Tailwind
## Backend
* Framework: [Fastify](https://www.fastify.io/)
* Database: [PostgreSQL](https://www.postgresql.org/)
* Task Queuing: [Redis](https://redis.io/)
* Task Worker: [BullMQ](https://github.com/taskforcesh/bullmq)
## Testing
* Unit & Integration Tests: [Jest](https://jestjs.io/)
* E2E Test: [Playwright](https://playwright.dev/)
## Additional Tools
* Application monitoring: [Sentry](https://sentry.io/welcome/)
* CI/CD: [GitHub Actions](https://github.com/features/actions) / [Depot](https://depot.dev/) / [Kamal](https://kamal-deploy.org/)
* Containerization: [Docker](https://www.docker.com/)
* Linter: [ESLint](https://eslint.org/)
* Logging: [Loki](https://grafana.com/)
* Building: [NX Monorepo](https://nx.dev/)
## Adding New Tool
Adding a new tool isn't a simple choice. A simple choice is one that's easy to do or undo, or one that only affects your work and not others'.
We avoid adding new dependencies to keep setup easy, which increases adoption; more dependencies mean more moving parts to support.
If you're thinking about a new tool, ask yourself these:
* Is this tool open source? How can we give it to customers who use their own servers?
* What does it fix, and why do we need it now?
* Can we use what we already have instead?
These questions only apply to required services for everyone. If this tool speeds up your own work, we don't need to think so hard.
# Workers & Sandboxing
This component is responsible for polling jobs from the app, preparing the sandbox, and executing them with the engine.
## Jobs
There are three types of jobs:
* **Recurring Jobs**: Polling/schedule triggers jobs for active flows.
* **Flow Jobs**: Flows that are currently being executed.
* **Webhook Jobs**: Webhooks that still need to be ingested, as third-party webhooks can map to multiple flows or need mapping.
This documentation will not discuss how the engine works other than stating that it takes the jobs and produces the output. Please refer to [engine](./engine) for more information.
## Sandboxing
Sandboxing in Activepieces refers to the environment in which the engine executes the flow. There are three types of sandboxes, each with different trade-offs:
### No Sandboxing & V8 Sandboxing
The difference between the two modes is in the execution of code pieces. For V8 Sandboxing, we use [isolated-vm](https://www.npmjs.com/package/isolated-vm), which relies on V8 isolation to isolate code pieces.
These are the steps that are used to execute the flow:
If the code doesn't already exist, it is compiled using the TypeScript compiler (tsc) and the necessary npm packages are prepared, if possible.
Pieces are npm packages; we perform a simple check and, if they aren't present, install them with `pnpm`.
A pool of worker threads is kept warm, with the engine running and listening in each. Each thread executes one engine operation and sends back the result upon completion.
#### Security:
In a self-hosted environment, all piece installations are done by the **platform admin**. It is assumed that the pieces are secure, as they have full access to the machine.
Code pieces provided by the end user are isolated using V8, which restricts the user to browser JavaScript instead of Node.js with npm.
#### Performance
Flow execution is as fast as JavaScript can be, although there is some overhead in polling from the queue and preparing the files the first time a flow is executed.
#### Benchmark
TBD
### Kernel Namespaces Sandboxing
This consists of two steps: preparing the sandbox and executing the flow.
#### Prepare the folder
Each flow has a folder with everything required to execute it: the **engine**, **code pieces**, and **npm packages**.
If the code doesn't already exist, it is compiled using the TypeScript compiler (tsc) and the necessary npm packages are prepared, if possible.
Pieces are npm packages; we perform a simple check and, if they aren't present, install them with `pnpm`.
#### Execute Flow using Sandbox
In this mode, we use kernel namespaces to isolate everything (file system, memory, CPU). The folder prepared earlier will be bound as a **Read Only** Directory.
Then we use the command line to spin up the isolation with a new Node.js process, something like this:
```bash
./isolate node path/to/flow.js --- rest of args
```
#### Security
Flow execution is isolated in its own namespaces, which means pieces run in separate processes and namespaces. The user can run bash scripts and use the file system safely, as it is limited and removed after execution. In this mode, the user can use any **npm package** in their code pieces.
#### Performance
This mode is **Slow** and **CPU Intensive**. The reason behind this is the **cold boot** of Node.js, since each flow execution will require a new **Node.js** process. The Node.js process consumes a lot of resources and takes some time to compile the code and start executing.
#### Benchmark
TBD
# Environment Variables
To configure Activepieces, you will need to set some environment variables. There is a file called `.env` at the root directory of our main repo.
When you execute the [tools/deploy.sh](https://github.com/activepieces/activepieces/blob/main/tools/deploy.sh) script in the Docker installation tutorial,
it will produce these values.
## Environment Variables
| Variable | Description | Default Value | Example |
| ------------------------------------ | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------ | ---------------------------------------------------------------------- |
| `AP_CONFIG_PATH` | Optional parameter for specifying the path to store SQLite3 and local settings. | `~/.activepieces` | |
| `AP_CLOUD_AUTH_ENABLED` | Turn off the utilization of Activepieces oauth2 applications | `false` | |
| `AP_DB_TYPE` | The type of database to use. (POSTGRES / SQLITE3) | `SQLITE3` | |
| `AP_EXECUTION_MODE` | You can choose between 'SANDBOXED', 'UNSANDBOXED', 'SANDBOX\_CODE\_ONLY' as possible values. If you decide to change this, make sure to carefully read [https://www.activepieces.com/docs/install/architecture/workers](https://www.activepieces.com/docs/install/architecture/workers) | `UNSANDBOXED` | |
| `AP_FLOW_WORKER_CONCURRENCY` | The number of flows that can be processed at the same time | `10` | |
| `AP_SCHEDULED_WORKER_CONCURRENCY` | The number of scheduled flows that can be processed at the same time | `10` | |
| `AP_ENCRYPTION_KEY` | ❗️ Encryption key used for connections is a 16-character hexadecimal key. You can generate one using the following command: `openssl rand -hex 16`. | `None` | |
| `AP_EXECUTION_DATA_RETENTION_DAYS` | The number of days to retain execution data, logs and events. | `30` | |
| `AP_FRONTEND_URL` | ❗️ Url that will be used to specify redirect url and webhook url. | `None` | [https://demo.activepieces.com/api](https://demo.activepieces.com/api) |
| `AP_JWT_SECRET` | ❗️ Encryption key used for generating JWT tokens is a 32-character hexadecimal key. You can generate one using the following command: `openssl rand -hex 32`. | `None` | |
| `AP_QUEUE_MODE` | The queue mode to use. (MEMORY / REDIS) | `MEMORY` | |
| `AP_QUEUE_UI_ENABLED` | Enable the queue UI (only works with redis) | `true` | |
| `AP_QUEUE_UI_USERNAME` | The username for the queue UI. This is required if `AP_QUEUE_UI_ENABLED` is set to `true`. | None | |
| `AP_QUEUE_UI_PASSWORD` | The password for the queue UI. This is required if `AP_QUEUE_UI_ENABLED` is set to `true`. | None | |
| `AP_TRIGGER_DEFAULT_POLL_INTERVAL` | The default polling interval determines how frequently the system checks for new data updates for pieces with scheduled triggers, such as new Google Contacts. | `5` | |
| `AP_PIECES_SOURCE` | `FILE` for local development, `DB` for database. You can find more information in the [Setting Piece Source](#setting-piece-source) section. | `CLOUD_AND_DB` | |
| `AP_PIECES_SYNC_MODE` | `NONE` for no metadata syncing / `OFFICIAL_AUTO` for automatic syncing of pieces metadata from the cloud | `OFFICIAL_AUTO` | |
| `AP_POSTGRES_DATABASE` | ❗️ The name of the PostgreSQL database | `None` | |
| `AP_POSTGRES_HOST` | ❗️ The hostname or IP address of the PostgreSQL server | `None` | |
| `AP_POSTGRES_PASSWORD` | ❗️ The password for the PostgreSQL, you can generate a 32-character hexadecimal key using the following command: `openssl rand -hex 32`. | `None` | |
| `AP_POSTGRES_PORT` | ❗️ The port number for the PostgreSQL server | `None` | |
| `AP_POSTGRES_USERNAME` | ❗️ The username for the PostgreSQL user | `None` | |
| `AP_POSTGRES_USE_SSL` | Use SSL to connect the postgres database | `false` | |
| `AP_POSTGRES_SSL_CA` | Use SSL Certificate to connect to the postgres database | | |
| `AP_POSTGRES_URL` | Alternatively, you can specify only the connection string (e.g postgres\://user:password\@host:5432/database) instead of providing the database, host, port, username, and password. | `None` | |
| `AP_REDIS_TYPE` | Type of Redis, Possible values are `DEFAULT` or `SENTINEL`. | `DEFAULT` | |
| `AP_REDIS_URL` | If a Redis connection URL is specified, all other Redis properties will be ignored. | `None` | |
| `AP_REDIS_USER` | ❗️ Username to use when connecting to Redis | `None` | |
| `AP_REDIS_PASSWORD` | ❗️ Password to use when connecting to Redis | `None` | |
| `AP_REDIS_HOST` | ❗️ The hostname or IP address of the Redis server | `None` | |
| `AP_REDIS_PORT` | ❗️ The port number for the Redis server | `None` | |
| `AP_REDIS_DB` | The Redis database index to use | `0` | |
| `AP_REDIS_USE_SSL` | Connect to Redis with SSL | `false` | |
| `AP_REDIS_SSL_CA_FILE` | The path to the CA file for the Redis server. | `None` | |
| `AP_REDIS_SENTINEL_HOSTS` | If specified, this should be a comma-separated list of `host:port` pairs for Redis Sentinels. Make sure to set `AP_REDIS_TYPE` to `SENTINEL` | `None` | `sentinel-host-1:26379,sentinel-host-2:26379,sentinel-host-3:26379` |
| `AP_REDIS_SENTINEL_NAME` | The name of the master node monitored by the sentinels. | `None` | `sentinel-host-1` |
| `AP_REDIS_SENTINEL_ROLE` | The role to connect to, either `master` or `slave`. | `None` | `master` |
| `AP_OPENAI_API_KEY` | This is required only if you want to enable code copilot | `None` | |
| `AP_COPILOT_INSTANCE_TYPE` | Possible values are `AZURE_OPENAI`, `OPENAI` | `OPENAI` | |
| `AP_AZURE_OPENAI_ENDPOINT` | This is required only if you want to enable code copilot | `https://{{your-resource}}.openai.azure.com/openai/deployments/{{your-model}}` | |
| `AP_AZURE_OPENAI_API_VERSION` | This is required only if you want to enable code copilot | `2023-06-01-preview` | |
| `AP_TRIGGER_TIMEOUT_SECONDS` | Maximum allowed runtime for a trigger to perform polling in seconds | `None` | |
| `AP_FLOW_TIMEOUT_SECONDS` | Maximum allowed runtime for a flow to run in seconds | `600` | |
| `AP_SANDBOX_PROPAGATED_ENV_VARS` | Environment variables that will be propagated to the sandboxed code. If you are using it for pieces, we strongly suggest keeping everything in the authentication object to make sure it works across AP instances. | `None` | |
| `AP_TELEMETRY_ENABLED` | Collect telemetry information. | `true` | |
| `AP_TEMPLATES_SOURCE_URL` | This is the endpoint we query for templates; remove it and templates will be removed from the UI | `https://cloud.activepieces.com/api/v1/flow-templates` | |
| `AP_WEBHOOK_TIMEOUT_SECONDS` | The default timeout for webhooks. The maximum allowed is 15 minutes. Please note that Cloudflare limits it to 30 seconds. If you are using a reverse proxy for SSL, make sure it's configured correctly. | `30` | |
| `AP_TRIGGER_FAILURE_THRESHOLD` | The maximum number of consecutive trigger failures is 576 by default, which is equivalent to approximately 2 days. | `30` | |
| `AP_PROJECT_RATE_LIMITER_ENABLED` | Enforce rate limits and prevent excessive usage by a single project. | `true` | |
| `AP_MAX_CONCURRENT_JOBS_PER_PROJECT` | The maximum number of active runs a project can have. This is used to enforce rate limits and prevent excessive usage by a single project. | `100` | |
| `AP_S3_ACCESS_KEY_ID` | The access key ID for your S3-compatible storage service. | `None` | |
| `AP_S3_SECRET_ACCESS_KEY` | The secret access key for your S3-compatible storage service. | `None` | |
| `AP_S3_BUCKET` | The name of the S3 bucket to use for file storage. | `None` | |
| `AP_S3_ENDPOINT` | The endpoint URL for your S3-compatible storage service. | `None` | `https://s3.amazonaws.com` |
| `AP_S3_REGION` | The region where your S3 bucket is located. | `None` | `us-east-1` |
| `AP_S3_USE_SIGNED_URLS` | It is used to route traffic to S3 directly. It should be enabled if the S3 bucket is public. | `false` | |
| `AP_MAX_FILE_SIZE_MB` | The maximum allowed file size in megabytes for uploads. | `None` | `10` |
| `AP_FILE_STORAGE_LOCATION` | The location to store files. Possible values are `DB` for storing files in the database or `S3` for storing files in an S3-compatible storage service. | `DB` | |
| `AP_PAUSED_FLOW_TIMEOUT_DAYS` | The maximum allowed pause duration in days for a paused flow; please note it cannot exceed `AP_EXECUTION_DATA_RETENTION_DAYS` | `30` | |
The frontend URL is essential for webhooks and app triggers to work. It must
be accessible to third parties to send data.
### Separate Workers from API
To separate workers from API servers, set `AP_FLOW_WORKER_CONCURRENCY` to zero on the API servers while keeping it non-zero on the worker servers.
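A minimal sketch of the split, assuming two sets of machines sharing the same database and Redis (the value `0` disables flow execution on that node):

```shell
# On the API servers: serve HTTP traffic but execute no flows
export AP_FLOW_WORKER_CONCURRENCY=0

# On the worker servers: keep a non-zero value (20 is the default)
# export AP_FLOW_WORKER_CONCURRENCY=20

echo "flow worker concurrency on this node: $AP_FLOW_WORKER_CONCURRENCY"
```

Apart from this one variable, both sets of machines are typically configured identically.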
### Setting Webhook (Frontend URL)
The default URL is set to the machine's IP address. To ensure proper operation, ensure that this address is accessible or specify an `AP_FRONTEND_URL` environment variable.
One possible solution for this is using a service like ngrok ([https://ngrok.com/](https://ngrok.com/)), which can be used to expose the frontend port (4200) to the internet.
### Setting Piece Source
These are the different options for the `AP_PIECES_SOURCE` environment variable:
1. `FILE`: **Only for Local Development**, this option loads pieces directly from local files. For Production, please consider using other options, as this one only supports a single version per piece.
2. `DB`: This option will only load pieces that are manually installed in the database from "My Pieces" or the Admin Console in the EE Edition. Pieces are loaded from npm, which provides multiple versions per piece, making it suitable for production.
You can also set `AP_PIECES_SYNC_MODE` to `OFFICIAL_AUTO`, which updates the metadata of pieces periodically.
### Redis Configuration
Set the `AP_REDIS_URL` environment variable to the connection URL of your Redis server.
Please note that if a Redis connection URL is specified, all other **Redis properties** will be ignored.
If you don't have a Redis connection URL, you can configure the connection with the following variables instead:
* `AP_REDIS_USER`: The username to use when connecting to Redis.
* `AP_REDIS_PASSWORD`: The password to use when connecting to Redis.
* `AP_REDIS_HOST`: The hostname or IP address of the Redis server.
* `AP_REDIS_PORT`: The port number for the Redis server.
* `AP_REDIS_DB`: The Redis database index to use.
* `AP_REDIS_USE_SSL`: Connect to Redis with SSL.
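If you end up assembling a connection URL yourself, it follows the standard Redis URL scheme. A hedged sketch, with every value a placeholder:

```shell
# All values are hypothetical placeholders -- substitute your own.
user="default"; password="secret"; host="redis.internal"; port="6379"; db="0"

# Use redis:// for plain connections and rediss:// when connecting with SSL
AP_REDIS_URL="redis://${user}:${password}@${host}:${port}/${db}"
echo "$AP_REDIS_URL"
```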
If you are using **Redis Sentinel**, you can set the following environment variables:
* `AP_REDIS_TYPE`: Set this to `SENTINEL`.
* `AP_REDIS_SENTINEL_HOSTS`: A comma-separated list of `host:port` pairs for Redis Sentinels. When set, all other Redis properties will be ignored.
* `AP_REDIS_SENTINEL_NAME`: The name of the master node monitored by the sentinels.
* `AP_REDIS_SENTINEL_ROLE`: The role to connect to, either `master` or `slave`.
* `AP_REDIS_PASSWORD`: The password to use when connecting to Redis.
* `AP_REDIS_USE_SSL`: Connect to Redis with SSL.
* `AP_REDIS_SSL_CA_FILE`: The path to the CA file for the Redis server.
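Put together, a sentinel configuration might look like the following sketch (hosts, master name, and password are placeholders):

```shell
export AP_REDIS_TYPE=SENTINEL
export AP_REDIS_SENTINEL_HOSTS="10.0.0.1:26379,10.0.0.2:26379,10.0.0.3:26379"
export AP_REDIS_SENTINEL_NAME="mymaster"
export AP_REDIS_SENTINEL_ROLE="master"
export AP_REDIS_PASSWORD="secret"

echo "connecting via $AP_REDIS_TYPE to master '$AP_REDIS_SENTINEL_NAME'"
```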
# Hardware Requirements
Specifications for hosting Activepieces
For more information about the architecture, please visit our [architecture](../architecture/overview) page.
### Technical Specifications
Activepieces is designed to be memory-intensive rather than CPU-intensive. A modest instance will suffice for most scenarios, but requirements can vary based on specific use cases.
| Component | Memory (RAM) | CPU Cores | Notes |
| ------------ | ------------ | --------- | ---------------------------------------------------------------------------------------------------------------------------------- |
| PostgreSQL | 1 GB | 1 | |
| Redis | 1 GB | 1 | |
| Activepieces | 8 GB | 2 | For high availability, consider deploying across multiple machines. Set `AP_FLOW_WORKER_CONCURRENCY` to `25` for optimal performance. |
The above recommendations are designed to meet the needs of the majority of use cases.
## Scaling Factors
### Redis
Redis requires minimal scaling as it primarily stores jobs during processing. Activepieces leverages BullMQ, capable of handling a substantial number of jobs per second.
### PostgreSQL
**Scaling Tip:** Since files are stored in the database, you can alleviate the load by configuring S3 storage for file management.
PostgreSQL is typically not the system's bottleneck.
### Activepieces Container
**Scaling Tip:** The Activepieces container is stateless, allowing for seamless horizontal scaling.
* `AP_FLOW_WORKER_CONCURRENCY` and `AP_SCHEDULED_WORKER_CONCURRENCY` dictate the number of concurrent jobs processed for flows and scheduled flows, respectively. By default, these are set to 20 and 10.
## Expected Performance
Activepieces ensures no request is lost; all requests are queued. In the event of a spike, requests will be processed later, which is acceptable as most flows are asynchronous, with synchronous flows being prioritized.
It's hard to predict exact performance because flows can be very different. But running a flow doesn't slow things down, as it runs as fast as regular JavaScript.
(Note: This applies to the `SANDBOX_CODE_ONLY` and `UNSANDBOXED` execution modes, which are recommended and used in self-hosted setups.)
You can anticipate handling over **20 million executions** monthly with this setup.
# Deployment Checklist
Checklist to follow after deploying Activepieces
This tutorial assumes you have already followed the quick start guide using one of the installation methods listed in [Install Overview](../overview).
In this section, we will go through the checklist after using one of the installation methods and ensure that your deployment is production-ready.
You should decide on the sandboxing mode for your deployment based on your use case and whether it is multi-tenant or not. Here is a simplified way to decide:
**Friendly Tip #1**: For multi-tenant setups, use V8/Code Sandboxing.
It is secure and does not require privileged Docker access in Kubernetes.
Privileged Docker is usually not allowed to prevent root escalation threats.
**Friendly Tip #2**: For single-tenant setups, use No Sandboxing. It is faster and does not require privileged Docker access.
More Information at [Sandboxing & Workers](../architecture/workers#sandboxing)
For licensing inquiries regarding the self-hosted enterprise edition, please reach out to `sales@activepieces.com`, as the code and Docker image are not covered by the MIT license.
You can request a trial key from within the app or in the cloud by filling out the form. Alternatively, you can contact sales at [https://www.activepieces.com/sales](https://www.activepieces.com/sales).
Please note that when your trial runs out, all enterprise [features](/about/editions#feature-comparison) will be shut down: any user other than the platform admin will be deactivated, and your private pieces will be deleted, which could cause flows that use them to fail.
The Enterprise Edition only works on a fresh installation, as its database migration scripts differ from the Community Edition's.
The Enterprise Edition must use `PostgreSQL` as the database backend and `Redis` as the queue system.
## Installation
1. Set the `AP_EDITION` environment variable to `ee`.
2. Set `AP_EXECUTION_MODE` to anything other than `UNSANDBOXED`; see the sandboxing tips above.
3. Once your instance is up, activate the license key by going to Platform Admin -> Settings -> License Keys.
![Activation License Key](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/activation-license-key-settings.png)
Setting up HTTPS is highly recommended because many services require webhook URLs to be secure (HTTPS). This helps prevent potential errors.
To set up SSL, you can use any reverse proxy. For a step-by-step guide, check out our example using [Nginx](./setup-ssl).
If you are looking to enable the "ASK AI" feature in the code piece, you need to set the `AP_OPENAI_API_KEY` environment variable to your OpenAI API key.
Run logs and files are stored in the database by default, and for most cases the database is enough.
It's recommended to start with the database and switch to S3 later if needed; no manual migration is required. After switching, expired files in the database will be deleted, and everything will be stored in S3.
Configure the following environment variables:
* `AP_S3_ACCESS_KEY_ID`
* `AP_S3_SECRET_ACCESS_KEY`
* `AP_S3_ENDPOINT`
* `AP_S3_BUCKET`
* `AP_S3_REGION`
* `AP_MAX_FILE_SIZE_MB`
* `AP_FILE_STORAGE_LOCATION` (set to `S3`)
* `AP_S3_USE_SIGNED_URLS`
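For example, a typical S3 configuration could be sketched as follows (bucket, region, and credentials are placeholders):

```shell
export AP_FILE_STORAGE_LOCATION=S3
export AP_S3_BUCKET="activepieces-files"            # placeholder bucket name
export AP_S3_ENDPOINT="https://s3.amazonaws.com"    # or your S3-compatible endpoint
export AP_S3_REGION="us-east-1"
export AP_S3_ACCESS_KEY_ID="YOUR_ACCESS_KEY_ID"
export AP_S3_SECRET_ACCESS_KEY="YOUR_SECRET_ACCESS_KEY"
export AP_MAX_FILE_SIZE_MB=10
export AP_S3_USE_SIGNED_URLS=false  # true only if the bucket is public
```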
**Friendly Tip #1**: If the S3 bucket is public, you can set `AP_S3_USE_SIGNED_URLS` to `true` to route traffic to S3 directly and avoid heavy traffic on your API server.
If you encounter any issues, check out our [Troubleshooting](./troubleshooting) guide.
# Setup App Webhooks
Certain apps like Slack and Square only support one webhook per OAuth2 app. This means that manual configuration is required in their developer portal, and it cannot be automated.
## Slack
**Configure Webhook Secret**
1. Visit the "Basic Information" section of your Slack OAuth settings.
2. Copy the "Signing Secret" and save it.
3. Set the following environment variable in your activepieces environment:
```
AP_APP_WEBHOOK_SECRETS={"@activepieces/piece-slack": {"webhookSecret": "SIGNING_SECRET"}}
```
4. Restart your application instance.
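Because the value of `AP_APP_WEBHOOK_SECRETS` is a JSON object passed through the shell, quoting mistakes are common. A quick sanity check before restarting (pure POSIX shell, nothing extra to install):

```shell
AP_APP_WEBHOOK_SECRETS='{"@activepieces/piece-slack": {"webhookSecret": "SIGNING_SECRET"}}'

# Crude structural check: both the piece key and the webhookSecret field must survive quoting
if echo "$AP_APP_WEBHOOK_SECRETS" | grep -q '"@activepieces/piece-slack"' &&
   echo "$AP_APP_WEBHOOK_SECRETS" | grep -q '"webhookSecret"'; then
  echo "value looks well-formed"
else
  echo "check your quoting" >&2
fi
```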
**Configure Webhook URL**
1. Go to the "Event Subscription" settings in the Slack OAuth2 developer platform.
2. The URL format should be: `https://YOUR_AP_INSTANCE/api/v1/app-events/slack`.
3. When connecting to Slack, use your OAuth2 credentials or update the OAuth2 app details from the admin console (in platform plans).
4. Add the following events to the app:
* `message.channels`
* `reaction_added`
* `message.im`
* `message.groups`
* `message.mpim`
* `app_mention`
# Setup HTTPS
To enable SSL, you can use a reverse proxy. In this case, we will use Nginx as the reverse proxy.
## Install Nginx
```bash
sudo apt-get install nginx
```
## Create Certificate
To proceed with this documentation, it is assumed that you already have a certificate for your domain.
You have the option to use Cloudflare or generate a certificate using Let's Encrypt or Certbot.
Add the certificate to the following paths: `/etc/key.pem` and `/etc/cert.pem`
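If you go the Let's Encrypt route, one possible way to obtain a certificate and place it at those paths is certbot's standalone mode (domain names are placeholders; stop anything listening on port 80 first):

```shell
sudo certbot certonly --standalone -d example.com -d www.example.com

# Copy the issued files to the paths expected by the nginx configuration
sudo cp /etc/letsencrypt/live/example.com/fullchain.pem /etc/cert.pem
sudo cp /etc/letsencrypt/live/example.com/privkey.pem /etc/key.pem
```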
## Setup Nginx
```bash
sudo nano /etc/nginx/sites-available/default
```
```bash
server {
listen 80;
listen [::]:80;
server_name example.com www.example.com;
return 301 https://$server_name$request_uri;
}
server {
listen 443 ssl http2;
listen [::]:443 ssl http2;
server_name example.com www.example.com;
ssl_certificate /etc/cert.pem;
ssl_certificate_key /etc/key.pem;
location / {
proxy_pass http://localhost:8080;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}
```
## Restart Nginx
```bash
sudo systemctl restart nginx
```
## Test
Visit your domain and you should see your application running with SSL.
# Troubleshooting
### Test Flow Button is Not Working
If the Test Flow button is not working, it might be due to an improperly configured websocket in your reverse proxy. Ensure that the websocket is correctly set up; see our [Setup HTTPS](./setup-ssl) example.
### Runs with Internal Errors or Scheduling Issues
The Bull Board is a tool that allows you to check runs for issues.
It is accessible when you are using Redis as the queue system. In production mode, you can access it at `/api/ui`, and in development mode, it is located at `/ui`.
To enable the Bull Board UI, follow these steps:
1. Define the following environment variables:
* `AP_QUEUE_UI_ENABLED`: Set it to `true`.
* `AP_QUEUE_UI_USERNAME`: Set it to your desired username.
* `AP_QUEUE_UI_PASSWORD`: Set it to your desired password.
Make sure to change the username and password to your preferred values.
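As a sketch, the three variables together (the username and password here are placeholders):

```shell
export AP_QUEUE_UI_ENABLED=true
export AP_QUEUE_UI_USERNAME="admin"      # placeholder, choose your own
export AP_QUEUE_UI_PASSWORD="change-me"  # placeholder, choose your own

echo "Bull Board enabled: $AP_QUEUE_UI_ENABLED"
```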
You will be able to access the Bull Board UI at `/api/ui` (production) or `/ui` (development) depending on your environment.
![Bull board](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/bullboard-ui.png)
There are two queues that you can monitor:
* oneTimeJobs: This queue is used for currently pending runs.
* repeatableJobs: This queue is used for polling triggers.
In case you have flows with internal errors, please go to the oneTimeJobs queue and click on the failed run. You can see the error message in the data section and retry the flow by clicking on the retry button.
### Reset Password
If you forgot your password on a self-hosted instance, you can reset it using the following steps:
**Postgres**
1. **Locate PostgreSQL Docker Container**:
* Use a command like `docker ps` to find the PostgreSQL container.
2. **Access the Container**:
* Open a shell inside the PostgreSQL Docker container.
```bash
docker exec -it CONTAINER_ID /bin/bash
```
3. **Open the PostgreSQL Console**:
* Inside the container, open the PostgreSQL console with the `psql` command.
```bash
psql -U postgres
```
4. **Create a Secure Password**:
* Use a tool like [bcrypt.online](https://bcrypt.online/) to generate a bcrypt hash of your new password, using 10 rounds.
5. **Update Your Password**:
* Run the following SQL query within the PostgreSQL console, replacing `HASH_PASSWORD` with the bcrypt hash you generated and `YOUR_EMAIL_ADDRESS` with your email.
```sql
UPDATE public.user SET password='HASH_PASSWORD' WHERE email='YOUR_EMAIL_ADDRESS';
```
**SQLite3**
1. **Open the SQLite3 Shell**:
* Access the SQLite3 database by opening the SQLite3 shell. Replace the path below with the actual location of your SQLite3 database file if it's different.
```bash
sqlite3 ~/.activepieces/database.sqlite
```
2. **Create a Secure Password**:
* Use a tool like [bcrypt.online](https://bcrypt.online/) to generate a bcrypt hash of your new password, using 10 rounds.
3. **Reset Your Password**:
* Once inside the SQLite3 shell, update your password with an SQL query. Replace `HASH_PASSWORD` with the bcrypt hash you generated and `YOUR_EMAIL_ADDRESS` with your email.
```sql
UPDATE user SET password = 'HASH_PASSWORD' WHERE email = 'YOUR_EMAIL_ADDRESS';
```
4. **Exit the SQLite3 Shell**:
* After making the changes, exit the SQLite3 shell by typing:
```bash
.exit
```
# AWS (Pulumi)
Get Activepieces up & running on AWS with Pulumi for IaC
# Infrastructure-as-Code (IaC) with Pulumi
Pulumi is an IaC solution akin to Terraform or CloudFormation that lets you deploy and manage your infrastructure using popular programming languages, e.g. TypeScript (which we'll use), C#, Go, etc.
## Deploy from Pulumi Cloud
If you're already familiar with Pulumi Cloud and have [integrated their services with your AWS account](https://www.pulumi.com/docs/pulumi-cloud/deployments/oidc/aws/#configuring-openid-connect-for-aws), you can use the button below to deploy Activepieces in a few clicks.
The template will deploy the latest Activepieces image that's available on [Docker Hub](https://hub.docker.com/r/activepieces/activepieces).
[![Deploy with Pulumi](https://get.pulumi.com/new/button.svg)](https://app.pulumi.com/new?template=https://github.com/activepieces/activepieces/tree/main/deploy/pulumi)
## Deploy from a local environment
Or, if you're currently using an S3 bucket to maintain your Pulumi state, you can scaffold and deploy Activepieces directly from Docker Hub using the template below in just a few commands:
```bash
$ mkdir deploy-activepieces && cd deploy-activepieces
$ pulumi new https://github.com/activepieces/activepieces/tree/main/deploy/pulumi
$ pulumi up
```
## What's Deployed?
The template is set up to be somewhat flexible, supporting either a development or a more production-ready configuration.
The configuration options that are presented during stack configuration will allow you to optionally add any or all of:
* PostgreSQL RDS instance. Opting out of this will use a local SQLite3 database.
* Single node Redis 7 cluster. Opting out of this will mean using an in-memory cache.
* Fully qualified domain name with SSL. Note that the hosted zone must already be configured in Route 53.
Opting out of this will mean relying on the application load balancer's URL over standard HTTP to access your Activepieces deployment.
For a full list of all the currently available configuration options, take a look at the [Activepieces Pulumi template file on GitHub](https://github.com/activepieces/activepieces/tree/main/deploy/pulumi/Pulumi.yaml).
## Setting up Pulumi for the first time
If you're new to Pulumi then read on to get your local dev environment setup to be able to deploy Activepieces.
### Prerequisites
1. Make sure you have [Node](https://nodejs.org/en/download) and [Pulumi](https://www.pulumi.com/docs/install/) installed.
2. [Install and configure the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).
3. [Install and configure Pulumi](https://www.pulumi.com/docs/clouds/aws/get-started/begin/).
4. Create an S3 bucket which we'll use to maintain the state of all the various services we'll provision for our Activepieces deployment:
```bash
aws s3api create-bucket --bucket pulumi-state --region us-east-1
```
Note: [Pulumi supports two different state management approaches](https://www.pulumi.com/docs/concepts/state/#deciding-on-a-state-backend).
If you'd rather use Pulumi Cloud instead of S3, feel free to skip this step and set up an account with Pulumi.
5. Login to the Pulumi backend:
```bash
pulumi login s3://pulumi-state?region=us-east-1
```
6. Next we're going to use the Activepieces Pulumi deploy template to create a new project and a stack in that project, and then kick off the deploy:
```bash
$ mkdir deploy-activepieces && cd deploy-activepieces
$ pulumi new https://github.com/activepieces/activepieces/tree/main/deploy/pulumi
```
This step will prompt you to create your stack and to populate a series of config options, such as whether or not to provision a PostgreSQL RDS instance or use SQLite3.
Note: When choosing a stack name, use something descriptive like `activepieces-dev`, `ap-prod` etc.
This solution uses the stack name as a prefix for every AWS service created, e.g. your VPC will be named `<stack-name>-vpc`.
7. Nothing left to do now but kick off the deploy:
```bash
pulumi up
```
8. Now choose `yes` when prompted. Once the deployment has finished, you should see a bunch of Pulumi output variables that look like the following:
```json
_: {
activePiecesUrl: "http://.us-east-1.elb.amazonaws.com"
activepiecesEnv: [
. . . .
]
}
```
The config value of interest here is the `activePiecesUrl` as that is the URL for our Activepieces deployment.
If you chose to add a fully qualified domain during your stack configuration, that will be displayed here.
Otherwise you'll see the URL to the application load balancer. And that's it.
Congratulations! You have successfully deployed Activepieces to AWS.
## Deploy a locally built Activepieces Docker image
To deploy a locally built image instead of using the official Docker Hub image, read on.
1. Clone the Activepieces repo locally:
```bash
git clone https://github.com/activepieces/activepieces
```
2. Move into the `deploy/pulumi` folder & install the necessary npm packages:
```bash
cd deploy/pulumi && npm i
```
3. This folder already has two Pulumi stack configuration files ready to go: `Pulumi.activepieces-dev.yaml` and `Pulumi.activepieces-prod.yaml`.
These files already contain all the configurations we need to create our environments. Feel free to have a look & edit the values as you see fit.
Let's continue by creating a development stack that uses the existing `Pulumi.activepieces-dev.yaml` file and kick off the deploy.
```bash
pulumi stack init activepieces-dev && pulumi up
```
Note: Using `activepieces-dev` or `activepieces-prod` for the `pulumi stack init` command is required here as the stack name needs to match the existing stack file name in the folder.
4. You should now see a preview in the terminal of all the services that will be provisioned, before you continue.
Once you choose `yes`, a new image will be built based on the `Dockerfile` in the root of the solution (make sure Docker Desktop is running) and then pushed up to a new ECR, along with provisioning all the other AWS services for the stack.
Congratulations! You have successfully deployed Activepieces into AWS using a locally built Docker image.
## Customising the deploy
All of the current configuration options, as well as the low-level details associated with the provisioned services are fully customisable, as you would expect from any IaC.
For example, if you'd like to have three availability zones instead of two for the VPC, use an older version of Redis or add some additional security group rules for PostgreSQL, you can update all of these and more in the `index.ts` file inside the `deploy` folder.
Or maybe you'd still like to deploy the official Activepieces Docker image instead of a local build, but would like to change some of the services. Simply set the `deployLocalBuild` config option in the stack file to `false` and make whatever changes you'd like to the `index.ts` file.
Checking out the [Pulumi docs](https://www.pulumi.com/docs/clouds/aws/) before doing so is highly encouraged.
# Docker
Single docker image deployment with SQLite3 and Memory Queue
Set up Activepieces as a single Docker container for easy deployment. This is ideal for personal use and testing, with SQLite3 and an in-memory queue.
For production (companies), use PostgreSQL and Redis; refer to the Docker Compose setup.
To get up and running quickly with Activepieces, we will use the Activepieces Docker image. Follow these steps:
## Prerequisites
You need to have [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Docker](https://docs.docker.com/get-docker/) installed on your machine in order to set up Activepieces via Docker Compose.
## Install
### Pull and Run the Docker Image
Pull the Activepieces Docker image and run the container with the following command:
```bash
docker run -d -p 8080:80 -v ~/.activepieces:/root/.activepieces -e AP_QUEUE_MODE=MEMORY -e AP_DB_TYPE=SQLITE3 -e AP_FRONTEND_URL="http://localhost:8080" activepieces/activepieces:latest
```
### Configure Webhook URL (Important for Triggers, Optional If You Have a Public IP)
**Note:** By default, Activepieces will try to use your public IP for webhooks. If you are self-hosting on a personal machine, you must configure the frontend URL so that the webhook is accessible from the internet.
**Optional:** The easiest way to expose your webhook URL on localhost is by using a service like ngrok. However, it is not suitable for production use.
1. Install ngrok
2. Run the following command:
```bash
ngrok http 8080
```
3. Replace the `AP_FRONTEND_URL` environment variable in the command above with the ngrok URL.
![Ngrok](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/docker-ngrok.png)
## Upgrade
Please follow the steps below:
### Step 1: Back Up Your Data (Recommended)
Before proceeding with the upgrade, it is always a good practice to back up your Activepieces data to avoid any potential data loss during the update process.
1. **Stop the Current Activepieces Container:** If your Activepieces container is running, stop it using the following command:
```bash
docker stop activepieces_container_name
```
2. **Backup Activepieces Data Directory:** By default, Activepieces data is stored in the `~/.activepieces` directory on your host machine. Create a backup of this directory to a safe location using the following command:
```bash
cp -r ~/.activepieces ~/.activepieces_backup
```
### Step 2: Update the Docker Image
1. **Pull the Latest Activepieces Docker Image:** Run the following command to pull the latest Activepieces Docker image from Docker Hub:
```bash
docker pull activepieces/activepieces:latest
```
### Step 3: Remove the Existing Activepieces Container
1. **Stop and Remove the Current Activepieces Container:** If your Activepieces container is running, stop and remove it using the following commands:
```bash
docker stop activepieces_container_name
docker rm activepieces_container_name
```
### Step 4: Run the Updated Activepieces Container
Now, run the updated Activepieces container with the latest image using the same command you used during the initial setup. Be sure to replace `activepieces_container_name` with the desired name for your new container.
```bash
docker run -d -p 8080:80 -v ~/.activepieces:/root/.activepieces -e AP_QUEUE_MODE=MEMORY -e AP_DB_TYPE=SQLITE3 -e AP_FRONTEND_URL="http://localhost:8080" --name activepieces_container_name activepieces/activepieces:latest
```
Congratulations! You have successfully upgraded your Activepieces Docker deployment.
# Docker Compose
To get up and running quickly with Activepieces, we will use the Activepieces Docker image. Follow these steps:
## Prerequisites
You need to have [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Docker](https://docs.docker.com/get-docker/) installed on your machine in order to set up Activepieces via Docker Compose.
## Installing
**1. Clone Activepieces repository.**
Use the command line to clone Activepieces repository:
```bash
git clone https://github.com/activepieces/activepieces.git
```
**2. Go to the repository folder.**
```bash
cd activepieces
```
**3. Generate environment variables.**
Run the following command from the command prompt / terminal
```bash
sh tools/deploy.sh
```
If the script doesn't work, you can rename the `.env.example` file in the root directory to `.env` and fill in the necessary information within the file.
**4. Run Activepieces.**
Please note that "docker-compose" (with a dash) is an outdated version of Docker Compose and will not work properly. We strongly recommend downloading and installing Docker Compose V2 from [here](https://docs.docker.com/compose/install/).
```bash
docker compose -p activepieces up
```
## Configure Webhook URL (Important for Triggers, Optional If You Have a Public IP)
**Note:** By default, Activepieces will try to use your public IP for webhooks. If you are self-hosting on a personal machine, you must configure the frontend URL so that the webhook is accessible from the internet.
**Optional:** The easiest way to expose your webhook URL on localhost is by using a service like ngrok. However, it is not suitable for production use.
1. Install ngrok
2. Run the following command:
```bash
ngrok http 8080
```
3. Replace the `AP_FRONTEND_URL` environment variable in `.env` with the ngrok URL.
![Ngrok](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/docker-ngrok.png)
When deploying for production, ensure that you update the database credentials and properly set the environment variables.
Review the [configurations guide](/install/configuration/environment-variables) to make any necessary adjustments.
## Upgrading
To upgrade a Docker Compose installation to a new version, open a terminal in the activepieces repository directory and perform the following steps.
### Automatic Pull
**1. Run the update script**
```bash
sh tools/update.sh
```
### Manual Pull
**1. Pull the new docker compose file**
```bash
git pull
```
**2. Pull the new images**
```bash
docker compose pull
```
**3. Review changelog for breaking changes**
Please review breaking changes in the [changelog](../../about/breaking-changes).
**4. Run the updated docker images**
```bash
docker compose up -d --remove-orphans
```
Congratulations! You have now successfully updated the version.
## Deleting
The following command is capable of deleting all Docker containers and associated data, and therefore should be used with caution:
```bash
sh tools/reset.sh
```
Executing this command removes all Docker containers and the data stored within them, so make sure you understand its consequences before proceeding.
# Easypanel
Run Activepieces with Easypanel 1-Click Install
Easypanel is a modern server control panel. If you [run Easypanel](https://easypanel.io/docs) on your server, you can deploy Activepieces with 1 click on it.
![Deploy to Easypanel](https://easypanel.io/img/deploy-on-easypanel-40.svg)
## Instructions
1. Create a VM that runs Ubuntu on your cloud provider.
2. Install Easypanel using the instructions from the website.
3. Create a new project.
4. Install Activepieces using the dedicated template.
# Elestio
Run Activepieces with Elestio 1-Click Install
You can deploy Activepieces on Elestio using one-click deployment. Elestio handles version updates, maintenance, security, backups, etc. Go ahead and click below to deploy and start using it.
[![Deploy on Elestio](https://elest.io/images/logos/deploy-to-elestio-btn.png)](https://elest.io/open-source/activepieces)
# GCP
To deploy Activepieces on a VM instance or a VM instance group on GCP, first create a VM template.
## Create VM Template
First, choose a machine type (e.g. `e2-medium`).
After configuring the VM Template, you can proceed to click on "Deploy Container" and specify the following container-specific settings:
* Image: activepieces/activepieces
* Run as a privileged container: true
* Environment Variables:
* `AP_QUEUE_MODE`: MEMORY
* `AP_DB_TYPE`: SQLITE3
* `AP_FRONTEND_URL`: [http://localhost:80](http://localhost:80)
* `AP_EXECUTION_MODE`: SANDBOXED
* Firewall: Allow HTTP traffic (for testing purposes only)
Once these details are entered, click the "Deploy" button and wait for the container deployment process to complete.
After a successful deployment, you can access the Activepieces application by visiting the external IP address of the VM on GCP.
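If you prefer the CLI to the console, the same template can be sketched with `gcloud` (the template name is a placeholder, and flags may differ slightly between gcloud versions):

```shell
gcloud compute instance-templates create-with-container activepieces-template \
  --machine-type=e2-medium \
  --container-image=activepieces/activepieces \
  --container-privileged \
  --container-env=AP_QUEUE_MODE=MEMORY,AP_DB_TYPE=SQLITE3,AP_FRONTEND_URL=http://localhost:80,AP_EXECUTION_MODE=SANDBOXED \
  --tags=http-server
```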
## Production Deployment
Please visit [ActivePieces](/install/configuration/environment-variables) for more details on how to customize the application.
# Overview
Introduction to the different ways to install Activepieces
Activepieces Community Edition can be deployed using **Docker**, **Docker Compose**, and **Kubernetes**.
Community Edition is **free** and **open source**.
You can read the difference between the editions [here](../about/editions).
## Recommended Options
Deploy Activepieces as a single Docker container using the SQLite database.
Deploy Activepieces with **Redis** and **PostgreSQL** setup.
## Other Options
* [Easypanel](./options/easypanel): 1-Click Install with the Easypanel template, maintained by the community.
* Elestio: 1-Click Install on Elestio.
* AWS (Pulumi): Install on AWS with Pulumi.
* GCP: Install on GCP as a VM template.
* [PikaPods](https://www.pikapods.com/pods?run=activepieces): Instantly run on PikaPods from $2.9/month.
* RepoCloud: Easily install on RepoCloud using this template, maintained by the community.
* [Zeabur](https://zeabur.com/templates/LNTQDF): 1-Click Install on Zeabur.
## Cloud Edition
This is the fastest option.
# Connection Deleted
# Connection Upserted
# Flow Created
# Flow Deleted
# Flow Run Finished
# Flow Run Started
# Flow Updated
# Folder Created
# Folder Deleted
# Folder Updated
# Overview
This table in the admin console contains all application events. We are constantly adding new events, so the best place to see the events defined in the code is [here](https://github.com/activepieces/activepieces/blob/main/packages/ee/shared/src/lib/audit-events/index.ts).
![Audit Logs](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/audit-logs.png)
# Signing Key Created
# User Email Verified
# User Password Reset
# User Signed In
# User Signed Up
# Environments & Git Sync
The Git Sync feature allows for the creation of an **external backup**, **environments**, and maintaining a **version history**.
### How It Works:
This example explains a simple setup for creating development and production environments. The setup can be extended to include multiple environments and multiple Git branches.
**Requirements:**
* Empty Git Repository
* Two Projects in Activepieces: one for Development and one for Production.
#### 1. Push Flow to the repository
After making changes to the flow, click the arrow next to the flow name, select "Push to Git", add a commit message, and push.
#### 2. Deleting a Flow from the Repository
When you delete a flow from a project that is configured with the development branch, the flow is also automatically deleted from Git.
#### 3. Pull from the Repository
You can trigger a pull from the Git Repository button in Activepieces. This action will replace all flows in the project with those from the Git repository.
Please note that credentials are not synced automatically. You should manually create identical credentials with the same names in both environments.
* All flows that are enabled in production will be updated and republished. If a flow fails to republish, a new version will be created as a draft.
* All flows that are **not** in the git repository will be deleted.
* New flows created in production will be disabled by default.
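A pull can also be triggered directly over the API, using the same endpoint the GitHub action in this section calls. A sketch (replace `INSTANCE_URL` and `PROJECT_ID`, and supply a valid API key):

```shell
# Replace INSTANCE_URL and PROJECT_ID, and export ACTIVEPIECES_API_KEY first.
curl --request POST \
  --url "{INSTANCE_URL}/api/v1/git-repos/pull" \
  --header "Authorization: Bearer $ACTIVEPIECES_API_KEY" \
  --header "Content-Type: application/json" \
  --data '{"projectId": "{PROJECT_ID}"}'
```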
#### Approval Workflow (Optional)
To manage your approval workflow, you can use Git by creating two branches: development and production. Then, you can use standard pull requests as the approval step.
#### GitHub action
This GitHub action can be used to automatically pull changes upon merging.
Don't forget to replace `INSTANCE_URL` and `PROJECT_ID`, and add `ACTIVEPIECES_API_KEY` to the secrets.
```yml
name: Auto Deploy
on:
  workflow_dispatch:
  push:
    branches: [ "main" ]
jobs:
  run-pull:
    runs-on: ubuntu-latest
    steps:
      - name: deploy
        # Use GitHub secrets
        run: |
          curl --request POST \
            --url {INSTANCE_URL}/api/v1/git-repos/pull \
            --header 'Authorization: Bearer ${{ secrets.ACTIVEPIECES_API_KEY }}' \
            --header 'Content-Type: application/json' \
            --data '{
              "projectId": "{PROJECT_ID}"
            }'
```
# Project Permissions
Documentation on project permissions in Activepieces
Activepieces utilizes Role-Based Access Control (RBAC) for managing permissions within projects. Each project consists of multiple flows and users, with each user assigned specific roles that define their actions within the project.
The supported roles in Activepieces are:
* **Admin:**
* View Flows
* Edit Flows
* Publish/Turn On and Off Flows
* View Runs
* Retry Runs
* View Issues
* Resolve Issues
* View Connections
* Edit Connections
* View Project Members
* Add/Remove Project Members
* Configure Git Repo to Sync Flows With
* Push/Pull Flows to/from Git Repo
* **Editor:**
* View Flows
* Edit Flows
* Publish/Turn On and Off Flows
* View Runs
* Retry Runs
* View Connections
* Edit Connections
* View Issues
* Resolve Issues
* View Project Members
* **Operator:**
* Publish/Turn On and Off Flows
* View Runs
* Retry Runs
* View Issues
* View Connections
* Edit Connections
* View Project Members
* **Viewer:**
* View Flows
* View Runs
* View Connections
* View Project Members
* View Issues
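The role-to-permission matrix above can be sketched as plain data. The permission tokens below are illustrative names invented for this sketch, not Activepieces' internal identifiers:

```typescript
// Illustrative permission tokens (hypothetical names, not Activepieces' own).
type Permission =
  | "flows:read" | "flows:write" | "flows:publish"
  | "runs:read" | "runs:retry"
  | "issues:read" | "issues:resolve"
  | "connections:read" | "connections:write"
  | "members:read" | "members:write"
  | "git:configure" | "git:sync";

// Direct transcription of the role lists above.
const rolePermissions: Record<string, Permission[]> = {
  ADMIN: [
    "flows:read", "flows:write", "flows:publish",
    "runs:read", "runs:retry", "issues:read", "issues:resolve",
    "connections:read", "connections:write",
    "members:read", "members:write", "git:configure", "git:sync",
  ],
  EDITOR: [
    "flows:read", "flows:write", "flows:publish",
    "runs:read", "runs:retry", "connections:read", "connections:write",
    "issues:read", "issues:resolve", "members:read",
  ],
  OPERATOR: [
    "flows:publish", "runs:read", "runs:retry", "issues:read",
    "connections:read", "connections:write", "members:read",
  ],
  VIEWER: ["flows:read", "runs:read", "connections:read", "members:read", "issues:read"],
};

// Simple permission check: does the given role grant the permission?
const can = (role: string, p: Permission): boolean =>
  rolePermissions[role]?.includes(p) ?? false;

console.log(can("VIEWER", "flows:write")); // false
```

A lookup like `can("EDITOR", "flows:write")` then answers whether an action should be allowed for a given role.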
# Security & Data Practices
We prioritize security and follow these practices to keep information safe.
## External Systems Credentials
**Storing Credentials**
All credentials are stored with 256-bit encryption keys, and there is no API to retrieve them for the user. They are sent only during processing, after which access is revoked from the engine.
**Data Masking**
We implement a robust data masking mechanism where third-party credentials or any sensitive information are systematically censored within the logs, guaranteeing that sensitive information is never stored or documented.
**OAuth2**
Integrations with third parties are always done using OAuth2, with a limited number of scopes when third-party support allows.
## Vulnerability Disclosure
Activepieces is an open-source project that welcomes contributors to test and report security issues.
For detailed information about our security policy, please refer to our GitHub Security Policy at: [https://github.com/activepieces/activepieces/security/policy](https://github.com/activepieces/activepieces/security/policy)
## Access and Authentication
**Role-Based Access Control (RBAC)**
To manage user access, we utilize Role-Based Access Control (RBAC). Team admins assign roles to users, granting them specific permissions to access and interact with projects, folders, and resources. RBAC allows for fine-grained control, enabling administrators to define and enforce access policies based on user roles.
**Single Sign-On (SSO)**
Implementing Single Sign-On (SSO) serves as a pivotal component of our security strategy. SSO streamlines user authentication by allowing them to access Activepieces with a single set of credentials. This not only enhances user convenience but also strengthens security by reducing the potential attack surface associated with managing multiple login credentials.
**Audit Logs**
We maintain comprehensive audit logs to track and monitor all access activities within Activepieces. This includes user interactions, system changes, and other relevant events. Our meticulous logging helps identify security threats and ensures transparency and accountability in our security measures.
**Password Policy Enforcement**
Users log in to Activepieces using a password known only to them. Activepieces enforces password length and complexity standards. Passwords are not stored; instead, only a secure hash of the password is stored in the database.
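As an illustration of the general technique described above (not Activepieces' actual implementation), salted password hashing with Node's built-in `scrypt` looks like:

```typescript
// Illustrative sketch of salted password hashing; Activepieces' actual
// scheme may differ. Only the salt and hash are ever stored.
import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";

function hashPassword(password: string): string {
  const salt = randomBytes(16);               // unique salt per password
  const hash = scryptSync(password, salt, 32); // memory-hard key derivation
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
}

function verifyPassword(password: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(":");
  const hash = scryptSync(password, Buffer.from(saltHex, "hex"), 32);
  // Constant-time comparison avoids timing side channels.
  return timingSafeEqual(hash, Buffer.from(hashHex, "hex"));
}
```

Verification re-derives the hash from the stored salt and compares in constant time, so the plaintext password never needs to be persisted.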
## Privacy & Data
**Supported Cloud Regions**
Presently, our cloud services are available in Germany as the supported data region.
We have plans to expand to additional regions in the near future.
If you opt for **self-hosting**, the available regions will depend on where you choose to host.
**Policy**
To better understand how we handle your data and prioritize your privacy, please take a moment to review our [Privacy Policy](https://www.activepieces.com/privacy). This document outlines in detail the measures we take to safeguard your information and the principles guiding our approach to privacy and data protection.
# Single Sign-On
## Enforcing SSO
You can enforce SSO by specifying the domain. As part of the SSO configuration, you have the option to disable email and user login. This ensures that all authentication is routed through the designated SSO provider.
![SSO](https://mintlify.s3-us-west-1.amazonaws.com/activepieces/resources/screenshots/sso.png)
## Supported SSO Providers
You can enable various SSO providers, including Google and GitHub, to integrate with your system by configuring SSO.
### Google
### GitHub
### SAML with Okta