update build doc
peacewong committed Jul 11, 2023
1 parent 628ef02 commit 7ee93ab
Showing 1 changed file with 1 addition and 11 deletions.
@@ -531,14 +531,4 @@ object SqoopOnceJobTest extends App {
}
}
```
3. When the test program finishes, the engine is destroyed automatically; no manual cleanup is needed.
## Engines supported by Once mode:

|Name|Configuration items to set|
|:----|:----|
|Flink|1. Available gateway address: LinkisJobClient.config().setDefaultServerUrl("http://xxx:9001");<br>2. Engine type label: addLabel(LabelKeyUtils.ENGINE_TYPE_LABEL_KEY(), "flink-1.12.2")<br>3. Tenant label: .addLabel(LabelKeyUtils.USER_CREATOR_LABEL_KEY(), "hadoop-Streamis")<br>4. Once mode label: .addLabel(LabelKeyUtils.ENGINE_CONN_MODE_LABEL_KEY(), "once")<br>5. Execute user: .addExecuteUser("hadoop")<br>6. Run type: .addJobContent("runType", "sql")<br>7. Execution code: .addJobContent("code", sql)<br>8. Job name: .addSource("jobName", "OnceJobTest")<br>|
|Sqoop|1. Available gateway address: LinkisJobClient.config().setDefaultServerUrl("http://xxx:9001");<br>2. Create service: SimpleOnceJob.builder().setCreateService("Linkis-Client")<br>3. Engine type label: .addLabel(LabelKeyUtils.ENGINE_TYPE_LABEL_KEY, "sqoop-1.4.6")<br>4. Tenant label: .addLabel(LabelKeyUtils.USER_CREATOR_LABEL_KEY, "hadoop-Client")<br>5. Once mode label: .addLabel(LabelKeyUtils.ENGINE_CONN_MODE_LABEL_KEY, "once")<br>6. Startup params: setStartupParams(startUpMap)|
|Spark|1. Available gateway address: LinkisJobClient.config().setDefaultServerUrl("http://xxx:9001");<br>2. Fixed parameters: LinkisJobClient.once()<br> .simple()<br> .builder()<br> .setCreateService("Spark-Test")<br> .setMaxSubmitTime(300000)<br> .setDescription("SparkTestDescription")<br> .addExecuteUser(submitUser)<br> .addJobContent("runType", "jar")<br> .addJobContent("spark.app.main.class", "org.apache.spark.examples.JavaWordCount")<br> .addJobContent("spark.app.args", "hdfs:///tmp/log.log -a 10 -b=12")<br> .addJobContent(<br> "spark.extconf", "spark.a=d\nspark.c=d\nspark.args.start_date=2022-06-14")<br>3. Engine type label: .addLabel("engineType", "spark-2.4.7")<br>4. Tenant label: .addLabel("userCreator", "spark-IDE")<br>5. Once mode label: .addLabel("engineConnMode", "once")<br>6. Startup params: addStartupParam("spark.app.name", "spark-submit-jar-test-xi")<br>7. Resource package: "spark.app.resource", "hdfs:///spark/spark-examples_2.11-2.3.0.2.6.5.0-292.jar")<br>8. Job name: .addSource("jobName", "OnceJobTest")|
|Seatunnel|1. Available gateway address: LinkisJobClient.config().setDefaultServerUrl("http://xxx:9001");<br>2. Engine type label: addLabel(LabelKeyUtils.ENGINE_TYPE_LABEL_KEY(), "seatunnel-2.1.2")<br>3. Tenant label: .addLabel(LabelKeyUtils.USER_CREATOR_LABEL_KEY(), "hadoop-seatunnel")<br>4. Once mode label: .addLabel(LabelKeyUtils.ENGINE_CONN_MODE_LABEL_KEY(), "once")<br>5. Execute user: .addExecuteUser("hadoop")<br>6. Run type: .addJobContent("runType", "spark")<br>7. Execution code: .addJobContent("code", sql)<br>8. Job name: .addSource("jobName", "OnceJobTest")<br>9. .addJobContent("master", "local[4]")<br>10. .addJobContent("deploy-mode", "client")<br>|
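Putting the Flink row above together, a minimal submission sketch might look like the following. This is an assumption-laden illustration, not a verbatim example from the project: the gateway URL, user/creator values, and the `sql` string are placeholders taken from the table, and `submit()`/`waitForCompleted()` follow the pattern of the SqoopOnceJobTest shown earlier; check the method names against your linkis-computation-client version.

```
import org.apache.linkis.computation.client.LinkisJobClient
import org.apache.linkis.computation.client.utils.LabelKeyUtils

object FlinkOnceJobTest extends App {
  // 1. Available gateway address (placeholder; point this at a reachable Linkis gateway)
  LinkisJobClient.config().setDefaultServerUrl("http://xxx:9001")

  val sql = "..." // the Flink SQL to execute (placeholder)

  val onceJob = LinkisJobClient.once().simple().builder()
    .setCreateService("Flink-Test")
    .addLabel(LabelKeyUtils.ENGINE_TYPE_LABEL_KEY(), "flink-1.12.2")    // 2. engine type label
    .addLabel(LabelKeyUtils.USER_CREATOR_LABEL_KEY(), "hadoop-Streamis") // 3. tenant label
    .addLabel(LabelKeyUtils.ENGINE_CONN_MODE_LABEL_KEY(), "once")        // 4. once mode label
    .addExecuteUser("hadoop")               // 5. execute user
    .addJobContent("runType", "sql")        // 6. run type
    .addJobContent("code", sql)             // 7. execution code
    .addSource("jobName", "OnceJobTest")    // 8. job name
    .build()

  onceJob.submit()
  onceJob.waitForCompleted()
  // When the job completes, the engine is destroyed automatically.
}
```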


