
Commit 63a0a3e

Merge branch 'main' into branched-storage-ga
2 parents: ee8d94e + 85ebd7a

2 files changed: 3 additions & 5 deletions


transformations/index.md (2 additions & 4 deletions)

@@ -63,9 +63,7 @@ tables from the input mapping are taken, modified, and produced into the tables
 
 A backend is the engine running the transformation script. It is a database server
 ([Snowflake](https://www.snowflake.com/),
-[Exasol](https://www.exasol.com/),
-[Teradata](https://www.teradata.com/),
-[BigQuery](https://cloud.google.com/bigquery)),
+[BigQuery](https://cloud.google.com/bigquery)),
 or a language interpreter
 ([Python](https://www.python.org/about/),
 [R](https://www.r-project.org/about.html)).
@@ -268,4 +266,4 @@ When triggered
 
 With the [read-only input mapping](/transformations/mappings/#read-only-input-mapping) feature, you can access all buckets (your own or linked) in transformations. Your transformation user
 has read-only access to buckets (and their tables), so you can access such data. So, there is no need to specify standard input mapping
-for your transformations. The name of the backend object (database, schema, etc.) depends on the backend you use, and it contains the bucket ID (not the bucket name).
+for your transformations. The name of the backend object (database, schema, etc.) depends on the backend you use, and it contains the bucket ID (not the bucket name).
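The changed paragraph notes that with a read-only input mapping, the backend object you query is named after the bucket ID, not the bucket's display name. A minimal Python sketch of that naming rule, using a hypothetical helper `readonly_schema_name` and an illustrative bucket ID `in.c-customers` (not taken from the diff; exact naming is backend-specific):

```python
def readonly_schema_name(bucket_id: str) -> str:
    """Return a quoted schema name for a bucket in a read-only input
    mapping (illustrative only; the real naming varies by backend).

    The backend object is derived from the bucket ID (e.g. "in.c-customers"),
    never from the bucket's human-readable display name.
    """
    return f'"{bucket_id}"'

# Example: building a Snowflake-style query against a bucket's table
# without declaring a standard input mapping first.
query = f"SELECT * FROM {readonly_schema_name('in.c-customers')}.\"orders\""
print(query)  # SELECT * FROM "in.c-customers"."orders"
```

The point of the sketch is only the naming convention: a transformation querying the read-only schemas must use the bucket ID, so renaming a bucket in the UI does not change the object name referenced in SQL.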

tutorial/onboarding/architecture-guide/index.md (1 addition & 1 deletion)

@@ -18,7 +18,7 @@ storage, and processes. It provides a structured framework for distributing task
 Each project includes the following key components:
 
 1. **Storage**
-   - Relational databases such as Snowflake, Exasol, and others
+   - Relational databases such as Snowflake, BigQuery, and others
    - Object storage options like S3 or Azure Blob Storage
 2. **Components**
    - Data sources and destinations for loading data into Keboola Storage and exporting to databases, services, or applications

0 commit comments
