When I run the `import findspark` line, the following message appears:
Exception: Unable to find py4j in /content/spark-3.1.2-bin-hadoop2.7.tgz/python, your SPARK_HOME may not be configured correctly
Here is the full code I am using:
!apt-get update -qq
!apt-get install openjdk-8-jdk-headless -qq > /dev/null
!wget -q https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-hadoop2.7.tgz
!tar xf spark-3.1.2-bin-hadoop2.7.tgz
!pip install -q findspark
import os
os.environ["SPARK_HOME"] = "/content/spark-3.1.2-bin-hadoop2.7.tgz"
import findspark
findspark.init()
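The exception itself hints at the likely cause: `SPARK_HOME` is set to the `.tgz` archive rather than to the directory that `tar xf` extracts (which has the same name without the `.tgz` suffix), so findspark cannot find `py4j` under it. A minimal sketch of the corrected assignment, assuming the default Colab paths used above:

```python
import os

archive = "spark-3.1.2-bin-hadoop2.7.tgz"
# tar xf extracts into a directory named like the archive minus ".tgz"
spark_dir = archive[:-len(".tgz")]

# Point SPARK_HOME at the extracted directory, not the archive file
os.environ["SPARK_HOME"] = f"/content/{spark_dir}"

# JAVA_HOME may also need to be set explicitly on Colab; this is the
# usual install path for openjdk-8-jdk-headless on Ubuntu (an assumption)
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
```

With `SPARK_HOME` pointing at `/content/spark-3.1.2-bin-hadoop2.7`, the subsequent `findspark.init()` call should locate `py4j` under `$SPARK_HOME/python`.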