First Error
The first error to come up was a missing JAVA_HOME. I installed both the JDK and the JRE; the JRE alone should be enough, but if it is not, install both:
sudo apt install default-jre
sudo apt install default-jdk
After that I needed to set JAVA_HOME. To find the absolute path of the Java binary, I used the following command:
update-alternatives --config java
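If you prefer to stay in Python, the same path can be found by resolving the update-alternatives symlink with the standard library (a small sketch; it assumes java is already on your PATH):

import os
import shutil

# /usr/bin/java is a symlink managed by update-alternatives; realpath follows
# the chain down to the actual binary.
java_link = shutil.which("java")            # usually /usr/bin/java
print(os.path.realpath(java_link))          # e.g. /usr/lib/jvm/java-14-oracle/bin/java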
With that path in hand, I edited the environment file:
sudo vi /etc/environment
I added a new line pointing JAVA_HOME at the Java installation directory (the path printed by update-alternatives, without the trailing /bin/java):
JAVA_HOME="/usr/lib/jvm/java-14-oracle"
After saving, reload it in the current shell:
source /etc/environment
Run echo $JAVA_HOME to confirm the variable is set.
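To sanity-check the value from Python as well, and to confirm it points at the installation directory rather than at the binary itself, a minimal snippet is enough (it assumes JAVA_HOME is already exported in this shell):

import os

# Fails fast with KeyError if the variable is not visible to this process.
java_home = os.environ["JAVA_HOME"]
print("JAVA_HOME =", java_home)

# If JAVA_HOME points at the right directory, <JAVA_HOME>/bin/java must exist.
print("bin/java exists:", os.path.exists(os.path.join(java_home, "bin", "java")))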
Second Error
With Java sorted out, the next run failed with the following error:
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/10/25 17:43:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/10/25 17:43:23 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
22/10/25 17:43:23 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
22/10/25 17:43:23 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
22/10/25 17:43:23 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
22/10/25 17:43:23 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at java.base/sun.nio.ch.Net.bind0(Native Method)
at java.base/sun.nio.ch.Net.bind(Net.java:459)
at java.base/sun.nio.ch.Net.bind(Net.java:448)
at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:562)
Traceback (most recent call last):
File "/home/anderson/dados/SPARK_STREAMING/client_twitter.py", line 4, in <module>
spark = SparkSession.builder.appName('SparkStreaming').getOrCreate()
File "/home/anderson/.local/lib/python3.10/site-packages/pyspark/sql/session.py", line 269, in getOrCreate
sc = SparkContext.getOrCreate(sparkConf)
File "/home/anderson/.local/lib/python3.10/site-packages/pyspark/context.py", line 483, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "/home/anderson/.local/lib/python3.10/site-packages/pyspark/context.py", line 282, in _do_init
self._jsc = jsc or self._initialize_context(self._conf._jconf)
File "/home/anderson/.local/lib/python3.10/site-packages/pyspark/context.py", line 402, in _initialize_context
return self._jvm.JavaSparkContext(jconf)
File "/home/anderson/.local/lib/python3.10/site-packages/py4j/java_gateway.py", line 1585, in __call__
return_value = get_return_value(
File "/home/anderson/.local/lib/python3.10/site-packages/py4j/protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at java.base/sun.nio.ch.Net.bind0(Native Method)
at java.base/sun.nio.ch.Net.bind(Net.java:459)
at java.base/sun.nio.ch.Net.bind(Net.java:448)
at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:562)
To fix this one, I had to run the following command in the same terminal session that would run client_twitter.py:
export SPARK_LOCAL_IP="127.0.0.1"
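An alternative, if you would rather not depend on an environment variable, is to set the bind address directly on the SparkSession builder inside client_twitter.py; spark.driver.bindAddress is the very setting the error message suggests. A minimal sketch (the spark.driver.host line is optional):

from pyspark.sql import SparkSession

# Bind the driver explicitly to localhost instead of relying on SPARK_LOCAL_IP.
spark = (
    SparkSession.builder
    .appName("SparkStreaming")
    .config("spark.driver.bindAddress", "127.0.0.1")   # address the sparkDriver service binds to
    .config("spark.driver.host", "127.0.0.1")          # optional: address advertised to executors
    .getOrCreate()
)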
If that does not work, add your hostname to the Linux hosts file. To find your hostname:
hostname
Then add it to /etc/hosts:
sudo vi /etc/hosts
127.0.0.1 anderson
"No caso o meu hostname é anderson"
After that, run client_twitter.py again.