PySpark exception with GraphFrames
Posted: 2019-10-03 15:49:28

I'm building a simple network graph with PySpark and GraphFrames, running on Google Dataproc:
from graphframes import GraphFrame

vertices = spark.createDataFrame([
    ("a", "Alice", 34),
    ("b", "Bob", 36),
    ("c", "Charlie", 30),
    ("d", "David", 29),
    ("e", "Esther", 32),
    ("f", "Fanny", 36),
    ("g", "Gabby", 60)],
    ["id", "name", "age"])

edges = spark.createDataFrame([
    ("a", "b", "friend"),
    ("b", "c", "follow"),
    ("c", "b", "follow"),
    ("f", "c", "follow"),
    ("e", "f", "follow"),
    ("e", "d", "friend"),
    ("d", "a", "friend"),
    ("a", "e", "friend")
], ["src", "dst", "relationship"])

g = GraphFrame(vertices, edges)
Then I try to run label propagation:
result = g.labelPropagation(maxIter=5)
But I get the following error:
Py4JJavaError: An error occurred while calling o164.run.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 19.0 failed 4 times, most recent failure: Lost task 0.3 in stage 19.0 (TID 829, cluster-network-graph-w-12.c.myproject-bi.internal, executor 2): java.lang.ClassNotFoundException: org.graphframes.GraphFrame$$anonfun$5
It looks as if the GraphFrame package is not available, but only when I run label propagation. How can I fix this?
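For context, label propagation itself is a simple community-detection scheme: every vertex starts with its own label and repeatedly adopts the most common label among its neighbors. The following is a minimal pure-Python sketch on the same toy graph, not GraphFrames' Spark implementation (which may break ties and schedule updates differently); here ties are broken by taking the smallest label:

```python
from collections import Counter

# Undirected adjacency built from the question's edge list
# (vertex "g" has no edges, so it does not participate).
edges = [("a", "b"), ("b", "c"), ("c", "b"), ("f", "c"),
         ("e", "f"), ("e", "d"), ("d", "a"), ("a", "e")]
adj = {}
for s, d in edges:
    adj.setdefault(s, set()).add(d)
    adj.setdefault(d, set()).add(s)

# Each vertex starts with its own id as its label; on every iteration
# it adopts the most frequent label among its neighbors.
labels = {v: v for v in adj}
for _ in range(5):  # maxIter=5, as in the question
    new_labels = {}
    for v in adj:
        counts = Counter(labels[n] for n in adj[v])
        top = max(counts.values())
        new_labels[v] = min(l for l, c in counts.items() if c == top)
    labels = new_labels

print(labels)
```

This only illustrates what `g.labelPropagation(maxIter=5)` computes conceptually; the actual error above is a classpath problem, not an algorithmic one.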
【Comments】:

Is your graphframes jar on the runtime path?

【Answer 1】: I solved it with the following configuration:
import pyspark
from pyspark.sql import SparkSession

conf = pyspark.SparkConf().setAll([
    ('spark.jars', 'gs://spark-lib/bigquery/spark-bigquery-latest.jar'),
    ('spark.jars.packages', 'graphframes:graphframes:0.7.0-spark2.3-s_2.11')])

spark = SparkSession.builder \
    .appName('testing bq') \
    .config(conf=conf) \
    .getOrCreate()
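Setting `spark.jars.packages` this way only helps when your driver code creates the SparkSession itself. An alternative sketch (cluster name and region below are placeholders, not from the original post) is to bake the same property into the Dataproc cluster at creation time, so every job on the cluster picks it up:

```shell
# Hypothetical cluster name and region; the "spark:" prefix tells Dataproc
# to write the property into spark-defaults.conf on every node.
gcloud dataproc clusters create my-network-graph-cluster \
    --region us-central1 \
    --properties spark:spark.jars.packages=graphframes:graphframes:0.7.0-spark2.3-s_2.11
```

This is a configuration fragment rather than runnable code; adjust the GraphFrames version to match your cluster's Spark and Scala versions.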
【Discussion】:
【Answer 2】: This seems to be a known issue with GraphFrames on Google Dataproc.

Create a Python file with the following lines, then run it:
from setuptools import setup

setup(name='graphframes',
      version='0.5.10',
      packages=['graphframes', 'graphframes.lib'])
See these issues for details:
https://github.com/graphframes/graphframes/issues/238, https://github.com/graphframes/graphframes/issues/172
【Discussion】: