Update EXECUTE document
PastorGL committed Aug 11, 2023
1 parent 1274311 commit ff3f989
Showing 3 changed files with 111 additions and 41 deletions.
10 changes: 6 additions & 4 deletions BUILD.md
### How to Build

To build the Data Cooker ETL executable FatJAR artifact, you need Java 11 and Apache Maven.

The minimum supported version of Maven is enforced in the [project file](./pom.xml), so please look into the enforcer plugin section. For Java, [Amazon's Corretto](https://corretto.aws/) is the preferred distribution.

There are two profiles, one targeting the AWS EMR production environment (`EMR`, selected by default) and one for local debugging of ETL processes (`local`), so you have to call
```bash
mvn clean package
```
or
```bash
mvn -Plocal clean package
```
to build the desired flavor of [datacooker-etl-cli.jar](./cli/target/datacooker-etl-cli.jar).

The currently supported version of EMR is 6.9. For local debugging, Ubuntu 22.04 is recommended (either native or inside WSL).

Along with the executable artifact, modular documentation is automatically built from the modules' metadata into the [docs](./cli/docs/) directory, in both HTML ([single-file](./cli/docs/merged.html) and [linked files](./cli/docs/index.html)) and [PDF](./cli/docs/merged.pdf) formats.
138 changes: 103 additions & 35 deletions EXECUTE.md
### Execution Modes

**Data Cooker ETL** provides a handful of different execution modes, batch and interactive, local and remote, in
different combinations.

Refer to the following matrix:

Execution Mode | Batch Script \[Dry\] | Interactive... | ...with AutoExec Script \[Dry\]
------------------------------|----------------------|----------------|---------------------------------
On Spark Cluster | -s \[-d\] | |
Local | -l -s \[-d\] | -R | -R -s \[-d\]
REPL Server On Spark Cluster | | -e | -e -s \[-d\]
REPL Server Local | | -l -e | -l -e -s \[-d\]
REPL Client | | -r | -r -s \[-d\]

Cells with command line keys indicate which keys to use to run Data Cooker ETL in the desired execution mode. Empty
cells indicate unsupported combinations.
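
For instance, a few of these modes spelled out as full command lines (the script file name and host are placeholders):

```bash
# Local batch run of a script (Local / Batch Script)
java -jar datacooker-etl-cli.jar -l -s script.tdl

# Interactive local REPL (Local / Interactive)
java -jar datacooker-etl-cli.jar -R

# REPL Client connecting to a remote REPL Server (REPL Client / Interactive)
java -jar datacooker-etl-cli.jar -r -i repl.example.com
```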

### Command Line in General

To familiarize yourself with the CLI command line, just invoke the artifact with `-h` as the lone argument:

```bash
java -jar datacooker-etl-cli.jar -h
```

If its output is similar to

```
usage: Data Cooker ETL (ver. 3.8.0)
 -h,--help                 Print full list of command line options and
                           exit
 -s,--script <arg>         TDL4 script file. Mandatory for batch modes
 -v,--variablesFile <arg>  Path to variables file, name=value pairs per
                           each line
 -V,--variables <arg>      Pass contents of variables file encoded as
                           Base64
 -l,--local                Run in local batch mode (cluster batch mode
                           otherwise)
 -d,--dry                  -l: Dry run (only check script syntax and
                           print errors to console, if found)
 -m,--driverMemory <arg>   -l: Driver memory, by default Spark uses 1g
 -u,--sparkUI              -l: Enable Spark UI, by default it is disabled
 -L,--localCores <arg>     -l: Set cores #, by default * (all cores)
 -R,--repl                 Run in local mode with interactive REPL
                           interface. Implies -l. -s is optional
 -r,--remoteRepl           Connect to a remote REPL server. -s is
                           optional
 -t,--history <arg>        -R, -r: Set history file location
 -i,--host <arg>           Use specified network address:
                           -e: to listen at (default is all)
                           -r: to connect to (in this case, mandatory
                           parameter)
 -e,--serveRepl            Start REPL server in local or cluster mode. -s
                           is optional
 -p,--port <arg>           -e, -r: Use specified port to listen at or
                           connect to. Default is 9595
```

then everything is OK and working as intended, and you can proceed to build your ETL processes.

To specify an ETL Script, use the `-s <path/to/script.tdl>` argument. To check just the ETL Script syntax without
performing the actual process, use the `-d` switch for a Dry Run in any mode that supports `-s`. Any syntax errors
encountered will be reported to the console.
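
For example, a dry run of a script in local batch mode (the script path is illustrative):

```bash
# Parses ./process.tdl, reports any syntax errors to the console, and exits
java -jar datacooker-etl-cli.jar -l -d -s ./process.tdl
```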

To specify values for script variables, either use `-v <path/to/vars.properties>` to point to a file in Java
properties format, or encode that file's contents as Base64 and pass them via the `-V <Base64string>` argument.
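
For example (file and variable names are illustrative; `base64 -w0` assumes GNU coreutils):

```bash
# vars.properties holds name=value pairs, one per line, e.g.
#   source_path=s3://some-bucket/input
#   run_date=2023-08-11

# Point to the file directly...
java -jar datacooker-etl-cli.jar -l -s ./process.tdl -v ./vars.properties

# ...or pass its contents inline, encoded as Base64
java -jar datacooker-etl-cli.jar -l -s ./process.tdl -V "$(base64 -w0 ./vars.properties)"
```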

### Local Execution

To run Data Cooker ETL in any of the Local modes, you need an executable artifact [built](BUILD.md) with the `local`
profile.

You must also use the `-l` switch. If you want to limit the number of CPU cores available to Spark, use the `-L`
argument. If you want to change the default memory limit of `1G`, use the `-m` argument. For example, `-l -L 4 -m 8G`.

If you want to watch the execution of a lengthy process in the Spark UI, use the `-u` switch to start it up.
Otherwise, no Spark UI will be started.
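
Putting it all together, a local batch run limited to 4 cores and 8G of driver memory, with the Spark UI enabled,
might look like this (the script path is illustrative):

```bash
java -jar datacooker-etl-cli.jar -l -L 4 -m 8G -u -s ./process.tdl
```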

### On-Cluster Execution

If your environment matches the `EMR` profile (which targets EMR 6.9 with Java 11), you may take the artifact
[built](BUILD.md) with that profile, pass it to the cluster with your favorite Spark submitter, and invoke it with
the `-s` and `-v` or `-V` command line switches. The entry class name is `io.github.pastorgl.datacooker.cli.Main`.
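
As a sketch only (the bucket, paths, and submitter options are assumptions to adapt to your own cluster), an EMR
submission might look like:

```bash
spark-submit \
    --class io.github.pastorgl.datacooker.cli.Main \
    s3://your-bucket/artifacts/datacooker-etl-cli.jar \
    -s ./process.tdl \
    -v ./vars.properties
```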

Otherwise, you may first need to tinker with the [commons](./commons/pom.xml) and [cli](./cli/pom.xml) project
manifests and adjust library versions to match your environment. Because no two production Spark setups are exactly
the same, this will be necessary in most cases.

We recommend wrapping submitter calls in some scripting and automating their execution with a CI/CD service.
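
A minimal sketch of such a wrapper (every name here is a placeholder):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Called by a CI/CD job as: run-etl.sh <script.tdl> <vars.properties>
JAR="s3://your-bucket/artifacts/datacooker-etl-cli.jar"

spark-submit --class io.github.pastorgl.datacooker.cli.Main \
    "$JAR" -s "$1" -v "$2"
```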

### REPL Modes

In addition to the standard batch modes, which just execute a single TDL4 Script and then exit, there are interactive
modes with a REPL, useful if you want to debug your processes interactively.

To run in the Local REPL mode, use the `-R` switch. If `-s` is specified, that Script becomes the AutoExec Script,
which will be executed before the REPL prompt is displayed.
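
For example (the AutoExec Script name is illustrative):

```bash
# autoexec.tdl is executed first, then the REPL prompt appears
java -jar datacooker-etl-cli.jar -R -s autoexec.tdl
```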

To run just a REPL Server (either Local with `-l`, or On-Cluster otherwise), use the `-e` switch. If `-s` is
specified, that Script becomes the AutoExec Script, which will be executed before the REST service starts. `-i` and
`-p` control which interface and port to use for REST. The default configuration is `0.0.0.0:9595`.
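
For example, to serve a Local REPL on a non-default interface and port (the address is a placeholder):

```bash
# REPL Server listening on 10.0.0.5:9600, with an optional AutoExec Script
java -jar datacooker-etl-cli.jar -l -e -i 10.0.0.5 -p 9600 -s autoexec.tdl
```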

To run a REPL Client only, use `-r`. You need to specify which Server to connect to using `-i` and `-p`. The default
configuration is `localhost:9595`.
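
For example, to connect to the Server started above:

```bash
# -i is mandatory for -r; -p may be omitted for the default port 9595
java -jar datacooker-etl-cli.jar -r -i 10.0.0.5 -p 9600
```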

Please note that the protocol between Server and Client is currently plain HTTP without security or authentication.
If you intend to use it in a production environment, you should wrap it in a secure tunnel and use some sort of
authenticating proxy.
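
For instance, a plain SSH tunnel (host names are placeholders) keeps the unencrypted HTTP off the open network:

```bash
# Forward local port 9595 to the REPL Server's port through SSH
ssh -N -L 9595:localhost:9595 user@repl-server.example.com

# Then point the Client at the local end of the tunnel
java -jar datacooker-etl-cli.jar -r -i localhost -p 9595
```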

By default, the REPL shell stores command history in your home directory, but if you want to redirect a session's
history to a different location, use the `-t <path/to/file>` switch.

After starting up, you may see some Spark logs, and then the following prompt:

```
=============================================
Data Cooker ETL REPL interactive (ver. 3.8.0)
Type TDL4 statements to be executed in the REPL context in order of input, or a command.
Statement must always end with a semicolon. If not, it'll be continued on a next line.
If you want to type several statements at once on several lines, end each line with \
Type \QUIT; to end session and \HELP; for list of all REPL commands and shortcuts
datacooker> _
```

Follow the instructions and explore the available `\COMMAND`s with the `\HELP COMMAND;` command.

You may freely execute any valid TDL4 statements, view your data, load scripts from files, and even record them
directly in the REPL.

Also, you may use some familiar shell shortcuts (like reverse search with `Ctrl+R` and automatic expansion of recent
commands with `!n`), as well as contextual auto-completion of TDL4 statements with the `TAB` key.

Regarding Spark logs, in the REPL shell they're automatically set to `WARN` level. If you want to switch back to the
default `INFO` level, use
```sql
OPTIONS @log_level='INFO';
```

The third changed file updates the command line option definitions inside `public Configuration()` in the CLI module:

```java
addOption("u", "sparkUI", false, "-l: Enable Spark UI, by default it is disabled");
addOption("L", "localCores", true, "-l: Set cores #, by default * (all cores)");
addOption("R", "repl", false, "Run in local mode with interactive REPL interface. Implies -l. -s is optional");
addOption("r", "remoteRepl", false, "Connect to a remote REPL server. -s is optional");
addOption("t", "history", true, "-R, -r: Set history file location");
addOption("e", "serveRepl", false, "Start REPL server in local or cluster mode. -s is optional");
addOption("i", "host", true, "Use specified network address:\n" +
        "-e: to listen at (default is all)\n" +
        "-r: to connect to (in this case, mandatory parameter)");
```
