Downgrade the protobuf package to 3.20.x or lower

By accident I updated protobuf on my Ubuntu VPS. Now some very essential Python scripts don't work anymore. Speed isn't really important. The error message offers two workarounds:

TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

Downgrade the protobuf package, though I'm not sure that's the way forward,

or set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python. But where do I set this? In the Python script?

asked Jul 7 at 14:49

See Changes made in May, 2022 for background.

I'd discourage using PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python as your solution. But if you do want to use it, you need to set (and export) this environment variable in the environment where you run the code that uses the generated sources (both client and server, if applicable).
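For example, on an Ubuntu VPS you could set it in the shell (or in ~/.bashrc, or a systemd unit) before launching the script; the script name below is a placeholder:

```shell
# Export the variable so child processes (your Python script) inherit it.
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python

# Or set it for a single invocation only ("my_script.py" is a placeholder):
# PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python python3 my_script.py
```

You can also set it from inside the Python script itself via `os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"`, but only if that line runs before any `_pb2` module is imported.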

See this thread about the above change.

Here's the protobuf releases.

If you don't want to recompile your protos, you may want to try moving to 3.20.1 but realize this is the end of the line and you're delaying the inevitable...
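A minimal sketch of the downgrade with pip (adjust for conda or whatever environment manager you use):

```shell
# Pin protobuf to the last 3.x release, then verify what got installed.
pip install "protobuf==3.20.1"
python3 -c "import google.protobuf; print(google.protobuf.__version__)"
```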

If you're willing to recompile (and test) your protos, you should consider moving to the 4.x line.
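Regenerating is a matter of upgrading protoc and re-running it over your .proto files; the paths and file name below are placeholders:

```shell
# Check the compiler version first; it must be >= 3.19.0 per the error message.
protoc --version

# Regenerate the Python sources ("proto/" and "gen/" are placeholder paths).
protoc --proto_path=proto --python_out=gen proto/my_service.proto
```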

answered Jul 7 at 17:25


DazWilkin

I'm still getting this error, running under WSL on Windows 10 from a conda environment:

I0529 14:26:50.882265 140336145225536 run_docker.py:113] Mounting /mnt/b/ALPHAFOLDTEST/alphafold -> /mnt/fasta_path_0
I0529 14:26:50.885260 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/uniref90 -> /mnt/uniref90_database_path
I0529 14:26:50.887961 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/mgnify -> /mnt/mgnify_database_path
I0529 14:26:50.889641 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA -> /mnt/data_dir
I0529 14:26:50.892182 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/pdb_mmcif -> /mnt/template_mmcif_dir
I0529 14:26:50.894763 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/pdb_mmcif -> /mnt/obsolete_pdbs_path
I0529 14:26:50.897026 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/pdb70 -> /mnt/pdb70_database_path
I0529 14:26:50.900008 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/uniclust30/uniclust30_2018_08 -> /mnt/uniclust30_database_path
I0529 14:26:50.902319 140336145225536 run_docker.py:113] Mounting /mnt/d/WORKING2/DATA/bfd -> /mnt/bfd_database_path
I0529 14:26:52.410578 140336145225536 run_docker.py:255] Traceback (most recent call last):
I0529 14:26:52.410748 140336145225536 run_docker.py:255] File "/app/alphafold/run_alphafold.py", line 38, in
I0529 14:26:52.410840 140336145225536 run_docker.py:255] from alphafold.model import model
I0529 14:26:52.410923 140336145225536 run_docker.py:255] File "/app/alphafold/alphafold/model/model.py", line 20, in
I0529 14:26:52.410997 140336145225536 run_docker.py:255] from alphafold.model import features
I0529 14:26:52.411071 140336145225536 run_docker.py:255] File "/app/alphafold/alphafold/model/features.py", line 19, in
I0529 14:26:52.411145 140336145225536 run_docker.py:255] from alphafold.model.tf import input_pipeline
I0529 14:26:52.411217 140336145225536 run_docker.py:255] File "/app/alphafold/alphafold/model/tf/input_pipeline.py", line 17, in
I0529 14:26:52.411290 140336145225536 run_docker.py:255] from alphafold.model.tf import data_transforms
I0529 14:26:52.411363 140336145225536 run_docker.py:255] File "/app/alphafold/alphafold/model/tf/data_transforms.py", line 18, in
I0529 14:26:52.411433 140336145225536 run_docker.py:255] from alphafold.model.tf import shape_helpers
I0529 14:26:52.411505 140336145225536 run_docker.py:255] File "/app/alphafold/alphafold/model/tf/shape_helpers.py", line 16, in
I0529 14:26:52.411578 140336145225536 run_docker.py:255] import tensorflow.compat.v1 as tf
I0529 14:26:52.411651 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/init.py", line 41, in
I0529 14:26:52.411723 140336145225536 run_docker.py:255] from tensorflow.python.tools import module_util as _module_util
I0529 14:26:52.411808 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/python/init.py", line 40, in
I0529 14:26:52.411911 140336145225536 run_docker.py:255] from tensorflow.python.eager import context
I0529 14:26:52.411981 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/python/eager/context.py", line 32, in
I0529 14:26:52.412053 140336145225536 run_docker.py:255] from tensorflow.core.framework import function_pb2
I0529 14:26:52.412125 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/core/framework/function_pb2.py", line 16, in
I0529 14:26:52.412198 140336145225536 run_docker.py:255] from tensorflow.core.framework import attr_value_pb2 as tensorflow_dot_core_dot_framework_dot_attr__value__pb2
I0529 14:26:52.412269 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/core/framework/attr_value_pb2.py", line 16, in
I0529 14:26:52.412342 140336145225536 run_docker.py:255] from tensorflow.core.framework import tensor_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__pb2
I0529 14:26:52.412413 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/core/framework/tensor_pb2.py", line 16, in
I0529 14:26:52.412485 140336145225536 run_docker.py:255] from tensorflow.core.framework import resource_handle_pb2 as tensorflow_dot_core_dot_framework_dot_resource__handle__pb2
I0529 14:26:52.412558 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/core/framework/resource_handle_pb2.py", line 16, in
I0529 14:26:52.412630 140336145225536 run_docker.py:255] from tensorflow.core.framework import tensor_shape_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__shape__pb2
I0529 14:26:52.412703 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/tensorflow/core/framework/tensor_shape_pb2.py", line 42, in
I0529 14:26:52.412774 140336145225536 run_docker.py:255] serialized_options=None, file=DESCRIPTOR),
I0529 14:26:52.412915 140336145225536 run_docker.py:255] File "/opt/conda/lib/python3.7/site-packages/google/protobuf/descriptor.py", line 560, in new
I0529 14:26:52.412988 140336145225536 run_docker.py:255] _message.Message._CheckCalledFromGeneratedFile()
I0529 14:26:52.413060 140336145225536 run_docker.py:255] TypeError: Descriptors cannot not be created directly.
I0529 14:26:52.413133 140336145225536 run_docker.py:255] If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
I0529 14:26:52.413205 140336145225536 run_docker.py:255] If you cannot immediately regenerate your protos, some other possible workarounds are:
I0529 14:26:52.413276 140336145225536 run_docker.py:255] 1. Downgrade the protobuf package to 3.20.x or lower.
I0529 14:26:52.413346 140336145225536 run_docker.py:255] 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).
I0529 14:26:52.413418 140336145225536 run_docker.py:255]
I0529 14:26:52.413490 140336145225536 run_docker.py:255] More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
