pandas-gbq throws error "Couldn't find field google.protobuf.FileOptions.php_metadata_namespace" when trying to use pandas.read_gbq() function

Posted: 2019-06-13 13:48:33

Question:

I am trying to read some data from an authenticated BigQuery table with an SQL query via the pandas.read_gbq() function. This has worked successfully dozens of times before, yet strangely, since this morning it keeps failing with

KeyError: "Couldn't find field google.protobuf.FileOptions.php_metadata_namespace"

I tried different BigQuery tables with the same result, and also checked the SQL query in the BigQuery console to confirm it is correct.
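As the traceback below shows, the failure happens inside pandas-gbq's import check before any query runs, so it can be reproduced without credentials or a project. A minimal sketch (the helper name is my own):

```python
def check_bigquery_import():
    """Return 'ok' if google-cloud-bigquery imports cleanly, else the error text.

    read_gbq() calls _test_google_api_imports() first, which performs this
    same import; a broken protobuf install surfaces here as a KeyError.
    """
    try:
        from google.cloud import bigquery  # noqa: F401
        return "ok"
    except Exception as exc:
        return f"{type(exc).__name__}: {exc}"

print(check_bigquery_import())
```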

pd.read_gbq(query, projectid,dialect="standard")

---> 16 pd.read_gbq(query, projectid,dialect="standard")

~\Anaconda3\lib\site-packages\pandas\io\gbq.py in read_gbq(query, project_id, index_col, col_order, reauth, verbose, private_key, dialect, **kwargs)
     98         private_key=private_key,
     99         dialect=dialect,
--> 100         **kwargs)
    101 
    102 

~\Anaconda3\lib\site-packages\pandas_gbq\gbq.py in read_gbq(query, project_id, index_col, col_order, reauth, verbose, private_key, auth_local_webserver, dialect, **kwargs)
    802     """
    803 
--> 804     _test_google_api_imports()
    805 
    806     if verbose is not None and SHOW_VERBOSE_DEPRECATION:

~\Anaconda3\lib\site-packages\pandas_gbq\gbq.py in _test_google_api_imports()
     62 
     63     try:
---> 64         from google.cloud import bigquery  # noqa
     65     except ImportError as ex:
     66         raise ImportError(

~\Anaconda3\lib\site-packages\google\cloud\bigquery\__init__.py in <module>
     33 __version__ = get_distribution("google-cloud-bigquery").version
     34 
---> 35 from google.cloud.bigquery.client import Client
     36 from google.cloud.bigquery.dataset import AccessEntry
     37 from google.cloud.bigquery.dataset import Dataset

~\Anaconda3\lib\site-packages\google\cloud\bigquery\client.py in <module>
     45 from google.cloud.bigquery._helpers import _str_or_none
     46 from google.cloud.bigquery._http import Connection
---> 47 from google.cloud.bigquery.dataset import Dataset
     48 from google.cloud.bigquery.dataset import DatasetListItem
     49 from google.cloud.bigquery.dataset import DatasetReference

~\Anaconda3\lib\site-packages\google\cloud\bigquery\dataset.py in <module>
     22 import google.cloud._helpers
     23 from google.cloud.bigquery import _helpers
---> 24 from google.cloud.bigquery.model import ModelReference
     25 from google.cloud.bigquery.table import TableReference
     26 

~\Anaconda3\lib\site-packages\google\cloud\bigquery\model.py in <module>
     25 from google.api_core import datetime_helpers
     26 from google.cloud.bigquery import _helpers
---> 27 from google.cloud.bigquery_v2 import types
     28 
     29 

~\Anaconda3\lib\site-packages\google\cloud\bigquery_v2\__init__.py in <module>
     21 __version__ = pkg_resources.get_distribution("google-cloud-bigquery").version  # noqa
     22 
---> 23 from google.cloud.bigquery_v2 import types
     24 from google.cloud.bigquery_v2.gapic import enums
     25 

~\Anaconda3\lib\site-packages\google\cloud\bigquery_v2\types.py in <module>
     20 from google.api_core.protobuf_helpers import get_messages
     21 
---> 22 from google.cloud.bigquery_v2.proto import model_pb2
     23 from google.cloud.bigquery_v2.proto import model_reference_pb2
     24 from google.cloud.bigquery_v2.proto import standard_sql_pb2

~\Anaconda3\lib\site-packages\google\cloud\bigquery_v2\proto\model_pb2.py in <module>
     15 
     16 
---> 17 from google.cloud.bigquery_v2.proto import (
     18     model_reference_pb2 as google_dot_cloud_dot_bigquery__v2_dot_proto_dot_model__reference__pb2,
     19 )

~\Anaconda3\lib\site-packages\google\cloud\bigquery_v2\proto\model_reference_pb2.py in <module>
     15 
     16 
---> 17 from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
     18 
     19 

~\Anaconda3\lib\site-packages\google\api\annotations_pb2.py in <module>
      8 from google.protobuf import reflection as _reflection
      9 from google.protobuf import symbol_database as _symbol_database
---> 10 from google.protobuf import descriptor_pb2
     11 # @@protoc_insertion_point(imports)
     12 

~\Anaconda3\lib\site-packages\google\protobuf\descriptor_pb2.py in <module>
   1113       message_type=None, enum_type=None, containing_type=None,
   1114       is_extension=False, extension_scope=None,
-> 1115       serialized_options=None, file=DESCRIPTOR),
   1116     _descriptor.FieldDescriptor(
   1117       name='ruby_package', full_name='google.protobuf.FileOptions.ruby_package', index=19,

~\Anaconda3\lib\site-packages\google\protobuf\descriptor.py in __new__(cls, name, full_name, index, number, type, cpp_type, label, default_value, message_type, enum_type, containing_type, is_extension, extension_scope, options, serialized_options, has_default_value, containing_oneof, json_name, file)
    532         return _message.default_pool.FindExtensionByName(full_name)
    533       else:
--> 534         return _message.default_pool.FindFieldByName(full_name)
    535 
    536   def __init__(self, name, full_name, index, number, type, cpp_type, label,

KeyError: "Couldn't find field google.protobuf.FileOptions.php_metadata_namespace"

Comments:

Answer 1:

This looks like a protobuf version problem. I found a similar issue here.

Try uninstalling and reinstalling it:

pip3 uninstall protobuf
pip3 install protobuf==3.5.2

Solution provided by @marctuscher.

Also, here's the list of protobuf versions.
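If you want to verify programmatically which protobuf version ended up installed after the commands above, a small sketch (the helper name and the 3.5.2 pin, taken from the answer, are assumptions, not part of any library API):

```python
from importlib.metadata import version, PackageNotFoundError

def protobuf_matches(pin="3.5.2"):
    """Return True if the installed protobuf distribution matches the pin."""
    try:
        return version("protobuf") == pin
    except PackageNotFoundError:
        # protobuf is not installed at all
        return False

print(protobuf_matches())
```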

Comments:

Sorry for the late reply. Thanks! Uninstalling and reinstalling protobuf fixed the issue.
No problem, glad it helped.
