How to Compile Flink 1.10.2
This article walks through compiling Flink 1.10.2 in detail. Hopefully it serves as a practical reference.
Building Flink 1.10.2
1. Preparation
Build environment
macOS Catalina 10.15.3
Hadoop: 3.1.1.3.1.5.1-2 (Hortonworks)
flink-1.10.2
flink-shaded-9.0
Download the source packages
You can download the sources from GitHub, or directly from the Apache archive:
Flink on github: https://github.com/apache/flink
https://github.com/apache/flink/archive/release-1.10.2.tar.gz
flink-1.10.2-src.tgz:https://archive.apache.org/dist/flink/flink-1.10.2/flink-1.10.2-src.tgz
flink-shaded-9.0-src.tgz:https://archive.apache.org/dist/flink/flink-shaded-9.0/flink-shaded-9.0-src.tgz
Other related links:
building: https://ci.apache.org/projects/flink/flink-docs-release-1.11/flinkDev/building.html
hadoop integration: https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/hadoop.html
Hadoop integration
For Hadoop integration, we first need to build the flink-shaded-hadoop-2 jar.
In flink-1.10.2's pom.xml you can see that the default flink-shaded version is 9.0, so we download and build flink-shaded-9.0 as well.
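To confirm which flink-shaded version the Flink build expects, you can grep the version property out of the pom. A minimal sketch (the property name `flink.shaded.version` is an assumption based on Flink's build files; in practice, point the grep at the real flink-1.10.2/pom.xml instead of the sample snippet used here):

```shell
# Hypothetical excerpt of flink-1.10.2/pom.xml; substitute the real file
cat > /tmp/pom-snippet.xml <<'EOF'
<properties>
    <flink.shaded.version>9.0</flink.shaded.version>
</properties>
EOF

# Extract the flink-shaded version the build expects
grep -o '<flink.shaded.version>[^<]*' /tmp/pom-snippet.xml | sed 's/.*>//'
```

The printed version tells you which flink-shaded source tree to download and build.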
Add the dependency repo
Our Hadoop build is based on Hortonworks, so before building either project we need to add the HDP repo to pom.xml; otherwise some dependencies cannot be resolved:
```xml
<repository>
    <id>jetty</id>
    <name>jetty Repository</name>
    <url>http://repo.hortonworks.com/content/groups/public/</url>
    <releases>
        <enabled>true</enabled>
        <updatePolicy>daily</updatePolicy>
    </releases>
    <snapshots>
        <enabled>false</enabled>
        <checksumPolicy>warn</checksumPolicy>
    </snapshots>
    <layout>default</layout>
</repository>
```
2. Building flink-shaded-9.0
```shell
# Confirm the Hadoop version we are building against
$ hadoop version
Hadoop 3.1.1.3.1.5.1-2

# Build command (commented alternatives for the HDP 3.1.0.0-78 stack)
# $ mvn clean install -DskipTests -Drat.skip=true -Dhadoop.version=3.1.1.3.1.0.0-78 -Dhive.version=3.1.0.3.1.0.0-78
# $ mvn clean install -DskipTests -Drat.skip=true -Dhadoop.version=3.1.1.3.1.5.1-2
$ mvn clean install -DskipTests -Dhadoop.version=3.1.1.3.1.5.1-2 -Dhive.version=3.1.0.3.1.5.1-2
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] flink-shaded 9.0 ................................... SUCCESS [  1.809 s]
[INFO] flink-shaded-force-shading 9.0 ..................... SUCCESS [  0.532 s]
[INFO] flink-shaded-asm-7 7.1-9.0 ......................... SUCCESS [  0.685 s]
[INFO] flink-shaded-guava-18 18.0-9.0 ..................... SUCCESS [  1.086 s]
[INFO] flink-shaded-netty-4 4.1.39.Final-9.0 .............. SUCCESS [  4.428 s]
[INFO] flink-shaded-netty-tcnative-dynamic 2.0.25.Final-9.0 SUCCESS [  0.694 s]
[INFO] flink-shaded-jackson-parent 2.10.1-9.0 ............. SUCCESS [  0.024 s]
[INFO] flink-shaded-jackson-2 2.10.1-9.0 .................. SUCCESS [  1.266 s]
[INFO] flink-shaded-jackson-module-jsonSchema-2 2.10.1-9.0  SUCCESS [  0.923 s]
[INFO] flink-shaded-hadoop-2 3.1.1.3.1.5.1-2-9.0 .......... SUCCESS [ 14.844 s]
[INFO] flink-shaded-hadoop-2-uber 3.1.1.3.1.5.1-2-9.0 ..... SUCCESS [ 27.032 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 53.525 s
[INFO] Finished at: 2020-12-05T16:32:20+08:00
[INFO] ------------------------------------------------------------------------
```
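After `mvn install`, the uber jar lands in the local Maven repository under a version string that concatenates the Hadoop version and the flink-shaded version. A small sketch of how that artifact name and path are composed (matching the jar copied during packaging later):

```shell
# The installed artifact version is <hadoop.version>-<flink-shaded version>
HADOOP_VER=3.1.1.3.1.5.1-2
SHADED_VER=9.0
JAR="flink-shaded-hadoop-2-uber-${HADOOP_VER}-${SHADED_VER}.jar"
echo "$JAR"

# Expected location in the local Maven repository
echo "$HOME/.m2/repository/org/apache/flink/flink-shaded-hadoop-2-uber/${HADOOP_VER}-${SHADED_VER}/$JAR"
```

If that path is empty after the build, the install step did not run against the Hadoop version you passed on the command line.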
3. Building flink-1.10.2
Build
```shell
# Commented alternative for the HDP 3.1.0.0-78 stack:
# mvn install -DskipTests -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Pvendor-repos -Dscala-2.11 -Dhadoop.version=3.1.1.3.1.0.0-78 -Dhive.version=3.1.0.3.1.0.0-78
$ mvn clean install -DskipTests -Dfast -Pvendor-repos -Dhadoop.version=3.1.1.3.1.5.1-2 -Dscala-2.11 -Dhive.version=3.1.0.3.1.5.1-2
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] force-shading                                                      [jar]
[INFO] flink                                                              [pom]
[INFO] flink-annotations                                                  [jar]
[INFO] flink-shaded-curator                                               [jar]
[INFO] flink-test-utils-parent                                            [pom]
[INFO] flink-test-utils-junit                                             [jar]
[INFO] flink-metrics                                                      [pom]
[INFO] flink-metrics-core                                                 [jar]
[INFO] flink-core                                                         [jar]
[INFO] flink-java                                                         [jar]
[INFO] ... (reactor list of 176 modules truncated) ...
[INFO] flink-dist                                                         [jar]
[INFO] flink-walkthrough-datastream-java                      [maven-archetype]
[INFO] flink-walkthrough-datastream-scala                     [maven-archetype]
[INFO]
[INFO] -------------------< org.apache.flink:force-shading >-------------------
[INFO] Building force-shading 1.10.2                                    [1/176]
[INFO] --------------------------------[ jar ]---------------------------------
```
Packaging
For Hadoop compatibility, manually copy the flink-shaded-hadoop-2-uber-3.1.1.3.1.5.1-2-9.0.jar built in the previous step into flink-1.10.2-bin/flink-1.10.2/lib.
Then package flink-1.10.2/flink-dist/target/flink-1.10.2-bin/flink-1.10.2 into a tar.gz and deploy it to the server.
Note: the Hadoop version should match the cluster's, otherwise you may run into incompatibilities.
```shell
# The final build output lives under flink-1.10.2-bin
$ ll flink-1.10.2/flink-dist/target
total 264176
drwxr-xr-x  3 jiazz  staff    96B 12  5 18:03 antrun
drwxr-xr-x  2 jiazz  staff    64B 12  5 18:04 archive-tmp
-rw-r--r--  1 jiazz  staff   490K 12  5 18:04 bash-java-utils.jar
drwxr-xr-x  5 jiazz  staff   160B 12  5 18:03 classes
drwxr-xr-x  3 jiazz  staff    96B 12  5 18:04 flink-1.10.2-bin
-rw-r--r--  1 jiazz  staff   116M 12  5 18:04 flink-dist_2.11-1.10.2.jar
drwxr-xr-x  3 jiazz  staff    96B 12  5 18:03 generated-test-sources
drwxr-xr-x  3 jiazz  staff    96B 12  5 18:03 maven-archiver
drwxr-xr-x  3 jiazz  staff    96B 12  5 18:03 maven-shared-archive-resources
-rw-r--r--  1 jiazz  staff   206K 12  5 18:03 original-flink-dist_2.11-1.10.2.jar
drwxr-xr-x  4 jiazz  staff   128B 12  5 18:03 temporary
drwxr-xr-x  5 jiazz  staff   160B 12  5 18:03 test-classes

# Add flink-shaded-hadoop-2-uber to lib
$ cp ~/.m2/repository/org/apache/flink/flink-shaded-hadoop-2-uber/3.1.1.3.1.5.1-2-9.0/flink-shaded-hadoop-2-uber-3.1.1.3.1.5.1-2-9.0.jar flink-1.10.2-bin/flink-1.10.2/lib

# Package
$ tar czvf flink-1.10.2-3.1.5.1-2.tar.gz flink-1.10.2-bin/flink-1.10.2/
```
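Before shipping the tarball, it is worth checking that the uber jar actually ended up in lib/. A sketch of that check using a mock directory layout under /tmp (in practice, run the `tar tzf` line against the real flink-1.10.2-3.1.5.1-2.tar.gz):

```shell
# Mock the packaged layout (stand-in for flink-1.10.2-bin/flink-1.10.2)
mkdir -p /tmp/pkg-demo/flink-1.10.2/lib
touch /tmp/pkg-demo/flink-1.10.2/lib/flink-shaded-hadoop-2-uber-3.1.1.3.1.5.1-2-9.0.jar

# Package, then list the tarball and confirm the shaded jar is inside
tar czf /tmp/pkg-demo/flink-demo.tar.gz -C /tmp/pkg-demo flink-1.10.2
tar tzf /tmp/pkg-demo/flink-demo.tar.gz | grep flink-shaded-hadoop-2-uber
```

If the grep prints nothing, the copy step was skipped and the deployed Flink will fail to find the Hadoop classes at runtime.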
Problems encountered during the build
I had already built flink-shaded, but as version 10.0, while the Flink build defaults to 9.0. The fix: download flink-shaded-9.0, build it, and then continue the Flink build; alternatively, change the flink-shaded version referenced by the Flink build to the one you compiled.
```
[ERROR] Failed to execute goal on project flink-hadoop-fs: Could not resolve dependencies for project org.apache.flink:flink-hadoop-fs:jar:1.10.2: Could not find artifact org.apache.flink:flink-shaded-hadoop-2:jar:3.1.1.3.1.5.1-2-9.0 in jetty (http://repo.hortonworks.com/content/groups/public/) -> [Help 1]
[ERROR]
```
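The error above simply means the exact artifact version `3.1.1.3.1.5.1-2-9.0` is missing from the local repository, and the remote repo does not host it either. A quick way to check which flink-shaded-hadoop-2 versions are installed locally, illustrated here with a mock ~/.m2 layout where only the 10.0-suffixed build exists:

```shell
# Mock local repo: only a flink-shaded 10.0 build of the artifact is present
REPO=/tmp/m2-demo/org/apache/flink/flink-shaded-hadoop-2
mkdir -p "$REPO/3.1.1.3.1.5.1-2-10.0"
# In practice: REPO=~/.m2/repository/org/apache/flink/flink-shaded-hadoop-2

NEEDED=3.1.1.3.1.5.1-2-9.0
if [ -d "$REPO/$NEEDED" ]; then
    echo "found $NEEDED"
else
    echo "missing $NEEDED - build flink-shaded-9.0 and mvn install it first"
fi
```

Listing the directory names under that path makes the 9.0-vs-10.0 suffix mismatch obvious at a glance.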