I used the online AWS console to launch my cluster with Apache Spark. I built a fat JAR from my Spark app and uploaded it to an S3 bucket.
When I try to submit it as a Step using a Custom JAR, the process fails.
Any pointers would be greatly appreciated.
Use the EMR bootstrap action to install Spark, and submit the job as a step as described in the documentation: https://github.com/awslabs/emr-bootstrap-actions/blob/master/spark/examples/spark-submit-via-step.md
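Following the approach in that guide, the submission can be sketched as a custom JAR step that runs `spark-submit` on the master node through `script-runner.jar`. This is a sketch, not a drop-in command: the cluster ID, bucket name, JAR name, and main class below are placeholders you would replace with your own, and the Spark install path assumes the bootstrap-action layout described in the linked repo.

```shell
# Sketch only: j-XXXXXXXXXXXXX, my-bucket, my-spark-app.jar, and
# com.example.MySparkApp are placeholders for your own values.
# The step uses script-runner.jar to invoke spark-submit on the master
# node, pointing it at the fat JAR in S3.
aws emr add-steps \
  --cluster-id j-XXXXXXXXXXXXX \
  --steps Type=CUSTOM_JAR,Name="Spark submit",ActionOnFailure=CONTINUE,\
Jar=s3://elasticmapreduce/libs/script-runner/script-runner.jar,\
Args=[/home/hadoop/spark/bin/spark-submit,--deploy-mode,cluster,--master,yarn-cluster,--class,com.example.MySparkApp,s3://my-bucket/my-spark-app.jar]
```

Note that the step's JAR field points at `script-runner.jar`, not at your application JAR: the application JAR is passed as the final argument to `spark-submit`, which is why submitting the fat JAR directly as a Custom JAR step fails.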