Following the instructor's steps, I could not get the df.show() command to run on Windows.
Py4JJavaError Traceback (most recent call last)
Cell In[31], line 1
----> 1 df.show()
File g:\My Drive\sync\menu\courses\alura\formacao-apache-spark-python\.venv\Lib\site-packages\pyspark\sql\classic\dataframe.py:285, in DataFrame.show(self, n, truncate, vertical)
284 def show(self, n: int = 20, truncate: Union[bool, int] = True, vertical: bool = False) -> None:
--> 285 print(self._show_string(n, truncate, vertical))
File g:\My Drive\sync\menu\courses\alura\formacao-apache-spark-python\.venv\Lib\site-packages\pyspark\sql\classic\dataframe.py:303, in DataFrame._show_string(self, n, truncate, vertical)
297 raise PySparkTypeError(
298 errorClass="NOT_BOOL",
299 messageParameters={"arg_name": "vertical", "arg_type": type(vertical).__name__},
300 )
302 if isinstance(truncate, bool) and truncate:
--> 303 return self._jdf.showString(n, 20, vertical)
304 else:
305 try:
File g:\My Drive\sync\menu\courses\alura\formacao-apache-spark-python\.venv\Lib\site-packages\py4j\java_gateway.py:1362, in JavaMember.__call__(self, *args)
1356 command = proto.CALL_COMMAND_NAME +\
1357 self.command_header +\
1358 args_command +\
1359 proto.END_COMMAND_PART
1361 answer = self.gateway_client.send_command(command)
...
at java.base/java.io.DataInputStream.readFully(DataInputStream.java:210)
at java.base/java.io.DataInputStream.readInt(DataInputStream.java:385)
at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:933)
... 26 more
It must be some environment configuration issue, but I still haven't managed to resolve it. I've been trying to fix this for days with no success, and I'm about to give up and move on to another course.
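In case it helps with the diagnosis, below is a minimal sketch of the workaround that is often suggested for this error on Windows: pointing PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON at the venv interpreter before creating the SparkSession, so the workers use the same Python as the notebook. I'm not sure this is the actual cause here, so treat it as an assumption; the app name and sample data are just illustrative.

import os
import sys

# Assumption: the Py4JJavaError comes from Spark launching the wrong Python
# for its worker processes, so both variables point at the notebook's own
# interpreter (the one inside the .venv).
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("teste-show")   # hypothetical name, just for this test
    .master("local[*]")
    .getOrCreate()
)

# Tiny in-memory DataFrame to check whether show() works at all.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letra"])
df.show()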
This tutorial urgently needs to be updated with more recent setup instructions.