bin/spark-shell
Download spark-2.1.0-bin-hadoop2.7.tgz, extract it, change into the Spark root directory, and run bin/spark-shell to start the shell.
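For reference, the whole sequence is just the following (standard tar and cd usage, shown here for completeness rather than taken from the original session):

tar -xzf spark-2.1.0-bin-hadoop2.7.tgz
cd spark-2.1.0-bin-hadoop2.7
bin/spark-shell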
But today it failed with an embarrassingly basic error:
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
[root@sk1 spark-2.1.0-bin-hadoop2.7]# bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/07 22:33:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/07 22:33:38 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
[... the same warning is printed 16 times in total ...]
17/04/07 22:33:38 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.lang.Thread.run(Thread.java:745)
<console>:14: error: not found: value spark
import spark.implicits._
<console>:14: error: not found: value spark
import spark.sql
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
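Note what the log is actually saying: the driver asked for port 0, which means "any free port", so sixteen retries cannot be a port conflict; the bind is failing on the address itself, i.e. on whatever address the local hostname resolves to. The later "not found: value spark" errors are just fallout, since the spark session object is never created once SparkContext fails to initialize. So the first thing to check is how the hostname resolves compared with the addresses actually configured on the machine. A quick way to do that with standard tools (my addition, not part of the original session):

getent hosts $(hostname)   # how the hostname resolves, via /etc/hosts
ifconfig                   # the addresses actually on the interfaces, as below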
[root@sk1 ~]# ifconfig
ens32: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.11.138  netmask 255.255.255.0  broadcast 192.168.11.255
        inet6 fe80::a8bd:a097:8ca9:d22a  prefixlen 64  scopeid 0x20<link>
        ether 00:0c:29:c3:3e:9a  txqueuelen 1000  (Ethernet)
        RX packets 273939  bytes 395373188 (377.0 MiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 16657  bytes 2472671 (2.3 MiB)
        TX errors 0  dropped 0  overruns 0  carrier 0  collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
        inet6 ::1  prefixlen 128  scopeid 0x10<host>
        loop  txqueuelen 1  (Local Loopback)
        RX packets 276  bytes 23980 (23.4 KiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 276  bytes 23980 (23.4 KiB)
        TX errors 0  dropped 0  overruns 0  carrier 0  collisions 0
The machine's actual IP is 192.168.11.138.
[root@sk1 ~]# cat /etc/hosts
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.1.138 sk1
Clearly the hosts entry is wrong: sk1 is mapped to 192.168.1.138, while the machine's real address is 192.168.11.138, so the driver keeps trying to bind to an address that does not exist on this host. Correcting the entry fixes it:
[root@sk1 ~]# vi /etc/hosts
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.11.138 sk1
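Before relaunching, a quick sanity check that the hostname now resolves to the right address does not hurt (again my addition, not from the original session):

getent hosts sk1   # should now print 192.168.11.138
ping -c 1 sk1      # should reach 192.168.11.138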
[root@sk1 spark-2.1.0-bin-hadoop2.7]# bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/07 22:41:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/07 22:41:32 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/04/07 22:41:33 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/04/07 22:41:34 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.11.138:4040
Spark context available as 'sc' (master = local[*], app id = local-1491619281633).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
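As a footnote: if editing /etc/hosts is not an option (say, on a shared machine), Spark can also be pointed at the right address explicitly. A sketch of the two usual knobs, the SPARK_LOCAL_IP environment variable and the spark.driver.bindAddress property (the latter available since Spark 2.1), with the IP from this post filled in:

export SPARK_LOCAL_IP=192.168.11.138
bin/spark-shell

# or per invocation:
bin/spark-shell --conf spark.driver.bindAddress=192.168.11.138

Either way, the root cause here was nothing more than a typo in the hosts file: 192.168.1.138 instead of 192.168.11.138.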