Getting PSQLException: ERROR: syntax error at or near "SELECT" when using a query instead of tablename in spark jdbc with Postgres
Posted: 2020-03-20 23:06:07

For the following generic SQL:
showTablesSql = """SELECT table_catalog,table_schema,table_name
FROM information_schema.tables
ORDER BY table_schema,table_name"""
the following exception occurs when it is submitted through spark jdbc to postgresql:
py4j.protocol.Py4JJavaError: An error occurred while calling o34.load.
: org.postgresql.util.PSQLException: ERROR: syntax error at or near "SELECT"
Position: 15
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2578)
Here is the code being used:
url = f"jdbc:postgresql://{c['db.host']}/{c['db.name']}?user={c['db.user']}&password={c['db.password']}"
print(url)
empDF = spark.read \
    .format("jdbc") \
    .option("url", url) \
    .option("dbtable", showTablesSql) \
    .option("user", c['db.user']) \
    .option("password", c['db.password']) \
    .load()
Here are the stack trace details:
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
jdbc:postgresql://localhost/bluej?user=bluej&password=mypassword
Traceback (most recent call last):
File "/git/bluej/fusion/python/pointr/bluej/util/sparkmgr.py", line 37, in <module>
tab = readTab(db, tname)
File "/git/bluej/fusion/python/pointr/bluej/util/sparkmgr.py", line 23, in readTab
empDF = spark.read \
File "/shared/spark3/python/pyspark/sql/readwriter.py", line 166, in load
return self._df(self._jreader.load())
File "/shared/spark3/python/lib/py4j-0.10.8.1-src.zip/py4j/java_gateway.py", line 1285, in __call__
File "/shared/spark3/python/pyspark/sql/utils.py", line 98, in deco
return f(*a, **kw)
File "/shared/spark3/python/lib/py4j-0.10.8.1-src.zip/py4j/protocol.py", line 326, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o34.load.
: org.postgresql.util.PSQLException: ERROR: syntax error at or near "SELECT"
Position: 15
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2578)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2313)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:331)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:448)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:369)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:159)
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:109)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:61)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:226)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:339)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:240)
at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:229)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:229)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:179)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:834)
Comments:
Is it possible to temporarily set log_statement = all and find out the exact query being sent to Postgres? It seems strange that there is a syntax error at position 15. (A minimal way to flip that setting is sketched after these comments.)

Yes, that position lands in the middle of the table name. I will give it a try; restarting the pg server now.

Actually, you may not even need to set log_statement = all -- log_min_error_statement should log the failing query for you by default. Just check your Postgres log for the actual query that was received.

I restarted the database with higher logging enabled, and there were entries at startup. But I just re-ran the above query a few times and no entries appeared. Any idea why they would not generate log entries?
I believe you should write the subquery in parentheses, "(SELECT ... )", just as you would in a SQL FROM clause.
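A minimal sketch of the temporary-logging suggestion above, assuming a psycopg2 connection with a superuser role (the connection parameters simply mirror the URL printed in the question):

import psycopg2

# Temporarily log every statement so the exact query Spark sends shows up in the Postgres log.
conn = psycopg2.connect(host="localhost", dbname="bluej", user="bluej", password="mypassword")
conn.autocommit = True  # ALTER SYSTEM cannot run inside a transaction block
with conn.cursor() as cur:
    cur.execute("ALTER SYSTEM SET log_statement = 'all'")
    cur.execute("SELECT pg_reload_conf()")  # apply the change without a restart
conn.close()

Remember to set log_statement back to its previous value (usually 'none') once the offending query has been captured.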
Answer 1:
@BjarniRagnarsson mentioned in the comments that dbtable actually needs to be a subquery. I found some relevant background from the well-respected @zero323:

https://***.com/a/32629170/1056563

Since dbtable is used as the source of the SELECT statement, it has to be in a form that would be valid for an ordinary SQL query. If you want to use a subquery, you should pass the query in parentheses and provide an alias:
USING org.apache.spark.sql.jdbc
OPTIONS (
url "jdbc:postgresql:dbserver",
dbtable "(SELECT * FROM mytable) tmp"
);
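This also explains the "Position: 15" in the error: to resolve the schema (JDBCRDD.resolveTable in the stack trace), Spark wraps whatever is passed as dbtable in a probe query of roughly the form SELECT * FROM <dbtable> WHERE 1=0. A rough illustration with the original, un-parenthesized query:

showTablesSql = """SELECT table_catalog,table_schema,table_name
FROM information_schema.tables
ORDER BY table_schema,table_name"""

# Spark's default-dialect schema probe is roughly: SELECT * FROM <dbtable> WHERE 1=0
probe = f"SELECT * FROM {showTablesSql} WHERE 1=0"
print(probe)
# "SELECT * FROM " is 14 characters long, so the inner SELECT begins at character
# position 15 -- exactly where Postgres reports the syntax error.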
After making the sql a subquery, I can see it is parsed correctly: no data has come back yet, but it most likely will.
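For reference, a minimal sketch of the adjusted read, reusing the url and the c config dict from the question (the tmp alias name is arbitrary):

showTablesSql = """(SELECT table_catalog, table_schema, table_name
                    FROM information_schema.tables
                    ORDER BY table_schema, table_name) AS tmp"""

empDF = spark.read \
    .format("jdbc") \
    .option("url", url) \
    .option("dbtable", showTablesSql) \
    .option("user", c['db.user']) \
    .option("password", c['db.password']) \
    .load()

On Spark 2.4 and later the JDBC source also accepts a query option, so .option("query", ...) with the bare SELECT (no parentheses or alias) should work as well.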