google / protobuf
Protocol Buffers - Google's data interchange format
Home Page: http://protobuf.dev
License: Other
There is already a PPA for protobuf, https://launchpad.net/ubuntu/+source/protobuf , but it only has version 2.5.0-9.
Having an official, up-to-date PPA for Ubuntu featuring the protoc compiler would be a big deal. Also, under the Releases section, you could publish Debian and perhaps RPM protoc packages, in addition to the already existing Windows one.
I've tried following the instructions for installing protoc, but there is a failure during the make check step. The error I get is:
Making lib/libgtest.a lib/libgtest_main.a in gtest
depbase=`echo src/gtest-all.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
/bin/sh ./libtool --tag=CXX --mode=compile g++ -DHAVE_CONFIG_H -I. -I./build-aux -I. -I./include -D_THREAD_SAFE -DGTEST_HAS_PTHREAD=1 -g -DNDEBUG -MT src/gtest-all.lo -MD -MP -MF $depbase.Tpo -c -o src/gtest-all.lo src/gtest-all.cc &&\
mv -f $depbase.Tpo $depbase.Plo
libtool: compile: g++ -DHAVE_CONFIG_H -I. -I./build-aux -I. -I./include -D_THREAD_SAFE -DGTEST_HAS_PTHREAD=1 -g -DNDEBUG -MT src/gtest-all.lo -MD -MP -MF src/.deps/gtest-all.Tpo -c src/gtest-all.cc -fno-common -DPIC -o src/.libs/gtest-all.o
In file included from src/gtest-all.cc:39:
In file included from ./include/gtest/gtest.h:57:
In file included from ./include/gtest/internal/gtest-internal.h:40:
./include/gtest/internal/gtest-port.h:449:10: fatal error: 'tr1/tuple' file not found
#include <tr1/tuple> // NOLINT
^
1 error generated.
make[3]: *** [src/gtest-all.lo] Error 1
make[2]: *** [check-local] Error 2
make[1]: *** [check-am] Error 2
make: *** [check-recursive] Error 1
Is there any workaround for this?
Dear friends,
Please add my project to your third-party/add-ons list; I call it Friends of ProtoBuf!
The project is located at https://github.com/protobufel/protobuf-el , very near you at GitHub :)
I've implemented the following, 100% compatible with ProtoBuf 2.6.1:
In addition, there is a very efficient file-scanning and multi-root resource-processing general-purpose library, and a Java extended-regex implementation.
I know this list of add-ons is on your Google Pages, so pardon me for bothering you here :)
Looking forward to being at your service, if I can.
Sincerely,
your protobufel friend
David Tesler
As per the spec, and to retain FileDescriptorProto's canonical nature, protoc should validate, but not change, the FieldDescriptorProto's default value from the original proto. Also, in Java, if one manually creates a FileDescriptorProto, its defaults are not changed. Different languages convert the defaults differently, so the produced/serialized protos will differ.
In addition, making things more canonical and compatible across different language realms makes them easier and faster to process, and much easier to implement and maintain.
Hello,
I accidentally had my C++ binary dynamically linked with both -lprotobuf and -lprotobuf-lite. After my commits (pull request #43), I noticed that the binary executes google::protobuf::internal::InitializeDefaultRepeatedFields twice and crashes after the second call to google::protobuf::internal::DestroyDefaultRepeatedFields (once for protobuf and once for protobuf-lite). Because both operate on the same memory (the global variables are overwritten by the second initialization), there is a double free...
Is this a valid scenario, having both libraries linked in? Or should I fix this crash? It can be fixed easily by using GoogleOnceInit, but I don't know whether it is worth fixing...
What do you think?
diff --git a/src/google/protobuf/extension_set.cc b/src/google/protobuf/extension_set.cc
index 274554b..579ae31 100644
--- a/src/google/protobuf/extension_set.cc
+++ b/src/google/protobuf/extension_set.cc
@@ -1618,10 +1618,11 @@ PROTOBUF_DEFINE_DEFAULT_REPEATED(bool)
#undef PROTOBUF_DEFINE_DEFAULT_REPEATED
+GOOGLE_PROTOBUF_DECLARE_ONCE(repeated_fields_init);
+
struct StaticDefaultRepeatedFieldsInitializer {
StaticDefaultRepeatedFieldsInitializer() {
- InitializeDefaultRepeatedFields();
- OnShutdown(&DestroyDefaultRepeatedFields);
+ GoogleOnceInit(&repeated_fields_init, &InitializeDefaultRepeatedFields);
}
} static_repeated_fields_initializer;
@@ -1644,6 +1645,7 @@ void InitializeDefaultRepeatedFields() {
new RepeatedField<float>;
RepeatedPrimitiveGenericTypeTraits::default_repeated_field_bool_ =
new RepeatedField<bool>;
+ OnShutdown(&DestroyDefaultRepeatedFields);
}
void DestroyDefaultRepeatedFields() {
The following line causes a ./configure failure on MinGW:
# Need to link against rt on Solaris
AC_SEARCH_LIBS([sched_yield], [rt], [], [AC_MSG_FAILURE([sched_yield was not found on your system])])
It's specific to Solaris and should be moved to m4/acx_check_suncc.m4.
Hi all,
I am having issues trying to install the protobuf compiler and am stuck on the first step: running autogen.sh.
Just curious whether there is a quick fix for this. I googled for a fix, and the common consensus is to just install GNU libtool, which I did via Homebrew, but I still get the following problem:
configure.ac:29: error: possibly undefined macro: AC_PROG_LIBTOOL
Here is the whole command line result:
$ ./autogen.sh
Google Test not present. Fetching gtest-1.5.0 from the web...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 638k 100 638k 0 0 650k 0 --:--:-- --:--:-- --:--:-- 650k
+ sed -i -e 's/RuntimeLibrary="5"/RuntimeLibrary="3"/g;
s/RuntimeLibrary="4"/RuntimeLibrary="2"/g;' gtest/msvc/gtest-md.vcproj gtest/msvc/gtest.vcproj gtest/msvc/gtest_main-md.vcproj gtest/msvc/gtest_main.vcproj gtest/msvc/gtest_prod_test-md.vcproj gtest/msvc/gtest_prod_test.vcproj gtest/msvc/gtest_unittest-md.vcproj gtest/msvc/gtest_unittest.vcproj
+ autoreconf -f -i -Wall,no-obsolete
main::scan_file() called too early to check prototype at /usr/bin/aclocal line 603.
Useless use of /d modifier in transliteration operator at /usr/share/automake-1.10/Automake/Wrap.pm line 60.
Useless use of /d modifier in transliteration operator at /usr/share/automake-1.10/Automake/Wrap.pm line 60.
main::scan_file() called too early to check prototype at /usr/bin/aclocal line 603.
main::scan_file() called too early to check prototype at /usr/bin/aclocal line 603.
configure.ac:29: error: possibly undefined macro: AC_PROG_LIBTOOL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
autoreconf: /usr/bin/autoconf failed with exit status: 1
Both pip-3.4 install protobuf and manual installation with python3 setup.py build fail with an invalid-syntax exception in ez_setup.py.
Tested 2.6.0 using Python 3.4 on Ubuntu 14.04.
Stack trace from the failed manual installation:
rob@rob-pc:~/lib/protobuf/python$ python3 setup.py build
Traceback (most recent call last):
File "setup.py", line 199, in <module>
"Protocol Buffers are Google's data interchange format.",
File "/usr/lib/python3.4/distutils/core.py", line 108, in setup
_setup_distribution = dist = klass(attrs)
File "/usr/local/lib/python3.4/dist-packages/setuptools/dist.py", line 262, in __init__
self.fetch_build_eggs(attrs['setup_requires'])
File "/usr/local/lib/python3.4/dist-packages/setuptools/dist.py", line 287, in fetch_build_eggs
replace_conflicting=True,
File "/usr/local/lib/python3.4/dist-packages/pkg_resources.py", line 631, in resolve
dist = best[req.key] = env.best_match(req, ws, installer)
File "/usr/local/lib/python3.4/dist-packages/pkg_resources.py", line 874, in best_match
return self.obtain(req, installer)
File "/usr/local/lib/python3.4/dist-packages/pkg_resources.py", line 886, in obtain
return installer(requirement)
File "/usr/local/lib/python3.4/dist-packages/setuptools/dist.py", line 338, in fetch_build_egg
return cmd.easy_install(req)
File "/usr/local/lib/python3.4/dist-packages/setuptools/command/easy_install.py", line 613, in easy_install
return self.install_item(spec, dist.location, tmpdir, deps)
File "/usr/local/lib/python3.4/dist-packages/setuptools/command/easy_install.py", line 643, in install_item
dists = self.install_eggs(spec, download, tmpdir)
File "/usr/local/lib/python3.4/dist-packages/setuptools/command/easy_install.py", line 833, in install_eggs
return self.build_and_install(setup_script, setup_base)
File "/usr/local/lib/python3.4/dist-packages/setuptools/command/easy_install.py", line 1055, in build_and_install
self.run_setup(setup_script, setup_base, args)
File "/usr/local/lib/python3.4/dist-packages/setuptools/command/easy_install.py", line 1040, in run_setup
run_setup(setup_script, args)
File "/usr/local/lib/python3.4/dist-packages/setuptools/sandbox.py", line 68, in run_setup
DirectorySandbox(setup_dir).run(runner)
File "/usr/local/lib/python3.4/dist-packages/setuptools/sandbox.py", line 114, in run
return func()
File "/usr/local/lib/python3.4/dist-packages/setuptools/sandbox.py", line 67, in runner
_execfile(setup_script, ns)
File "/usr/local/lib/python3.4/dist-packages/setuptools/sandbox.py", line 43, in _execfile
exec(code, globals, locals)
File "/tmp/easy_install-thdmg8q1/google-apputils-0.4.0/setup.py", line 18, in <module>
sys.stderr.write(
File "/tmp/easy_install-thdmg8q1/google-apputils-0.4.0/ez_setup.py", line 79
except pkg_resources.VersionConflict, e:
^
SyntaxError: invalid syntax
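For context, the failing line in ez_setup.py uses Python 2's comma-based except clause, which Python 3 rejects at compile time. A minimal, standalone sketch (not protobuf code) reproducing the failure and showing the spelling both versions accept:

```python
# Python 2-only spelling, as in the ez_setup.py excerpt above:
py2_src = "try:\n    pass\nexcept ValueError, e:\n    pass\n"

try:
    compile(py2_src, "<py2_excerpt>", "exec")
    print("compiled (running under Python 2)")
except SyntaxError:
    print("SyntaxError (running under Python 3)")

# The 'as' spelling is accepted by Python 2.6+ and all Python 3 versions:
py3_src = "try:\n    pass\nexcept ValueError as e:\n    pass\n"
compile(py3_src, "<fixed_excerpt>", "exec")  # compiles cleanly on both
```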
The string produced by this method cannot be parsed back with merge() due to unescaped newlines.
Shouldn't the escapeDoubleQuotesAndBackslashes() method also replace newlines with \n?
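To illustrate the round-trip problem in a language-neutral way, here is a small Python sketch (hypothetical helpers, not the actual TextFormat code): escaping only quotes and backslashes leaves a raw newline inside the quoted value, which a line-oriented parser cannot reassemble; additionally escaping the newline keeps the value on one line.

```python
def escape_quotes_and_backslashes(s: str) -> str:
    # Mirrors the current behavior described above: newlines pass through untouched.
    return s.replace("\\", "\\\\").replace('"', '\\"')

def escape_for_text_format(s: str) -> str:
    # Proposed behavior: also escape newlines so the quoted value stays on one line.
    return escape_quotes_and_backslashes(s).replace("\n", "\\n")

value = 'say "hi"\nsecond line'
print(escape_quotes_and_backslashes(value))  # raw newline survives; a merge() would see two lines
print(escape_for_text_format(value))         # one line; the newline is now the two characters \n
```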
I had what is likely an all-too-common problem: I added a set_foo() call between ByteSize() and a cached serialize. I suspect this could be detected, at least in a debug mode, by setting a flag in the message and, if it is serialized after a set, either recomputing or erroring out.
Is it a bug, or a feature?
David
The Java generator wraps proto comments in a <pre> tag and also HTML-escapes the comment value.
For example, this proto definition:
// <p>a comment</p>
message MyMessage {
}
is compiled into this Java source:
/**
* Protobuf type {@code chronos.proto.MyMessage}
*
* <pre>
* <p>a comment</p>
* </pre>
*/
public static final class MyMessage extends ...
Please change the generator behavior, or add an option that disables it.
When that option is enabled, proto comments should be converted into javadoc as-is, without wrapping them in any additional tags or performing any escaping.
I downloaded the release package from https://protobuf.googlecode.com/svn/rc/protobuf-2.6.0.tar.gz
(Yeah, for reasons we all know, I cannot build from the git source, because it needs to download gtest from the Google site.)
./configure && make && make install && ldconfig
protoc installed successfully, so I then went on to install the Python module:
cd python
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION_VERSION=2
python setup.py build --cpp_implementation && python setup.py test --cpp_implementation && python setup.py install --cpp_implementation
Indeed, it said all went well, but when I tried my app, it complained about a missing pyext module.
Ah... something is wrong!
Then I took a look at the protobuf egg: it DID NOT have the pyext module.
Am I doing something wrong?
DynamicMessage.setField has a wrong check, preventing setting any list of ENUM values on it!
This is a major bug; it was introduced only in 2.6.0.
Here is the excerpt from the source:
public Builder setField(FieldDescriptor field, Object value) {
verifyContainingType(field);
ensureIsMutable();
if (field.getType() == FieldDescriptor.Type.ENUM) {
verifyEnumType(field, value);
}
...
And verifyEnumType doesn't check for repeated values; here it is:
private void verifyEnumType(FieldDescriptor field, Object value) {
if (value == null) {
throw new NullPointerException();
}
if (!(value instanceof EnumValueDescriptor)) {
throw new IllegalArgumentException(
"DynamicMessage should use EnumValueDescriptor to set Enum Value.");
}
...
If the field is repeated, then its value is an instance of List and will never be an EnumValueDescriptor!
C++11 offers new features that may be of interest to applications using protoc, and it would be nice to include an optional flag to protoc indicating that C++11 can be used, possibly on a language-feature by language-feature basis.
For example, C++11 offers both enum classes (enum class E; C++11 7.2 p2) and explicitly typed enums (enum Foo : int; C++11 7.2 p5), both of which allow forward declaration of enums. By allowing forward declaration, C++ applications can reduce the number of places that must include the .pb.h containing the enum definition.
Protobufs currently presume/guarantee that an enum can be expressed as an int (vis-a-vis https://github.com/google/protobuf/blob/e428862450679aaccd3675d1092f9f5253b7bd43/src/google/protobuf/compiler/cpp/cpp_enum.cc#L231 ), so having https://github.com/google/protobuf/blob/e428862450679aaccd3675d1092f9f5253b7bd43/src/google/protobuf/compiler/cpp/cpp_enum.cc#L78 instead emit enum $classname$ : int {
would allow forward declarations.
Scenario:
Expected:
There are 2 alternatives:
Actual:
The field builder allows mutating it with no warnings, and the oneof case is not changed back to this field!
Proposed Remedy:
Add an extra oneofFieldChanged(Internal.EnumLite | FieldDescriptor) method to the GeneratedMessage.BuilderParent interface, and notify the parent of any oneof field builder's mutations.
When notified, either throw an exception if the oneof is set to a different case, or change the oneof's case to this field's case.
The good news is that this just clarifies the existing behavior and doesn't change any public interfaces or existing techniques. Not implementing this fix will cause numerous, difficult-to-trace logical problems with oneof in such a common scenario.
In addition, due to the wrong bundle-number bug, you have to issue a minor update anyway, so this would be an opportune time for the proposed solution.
Cheers,
David
PS: Here is the full JUnit 4 test case for the scenario as-is:
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.is;
import org.junit.Test;
import protobuf_unittest.UnittestProto.TestAllTypes;
import protobuf_unittest.UnittestProto.TestAllTypes.Builder;
import protobuf_unittest.UnittestProto.TestAllTypes.OneofFieldCase;
public class OneofPropertyChangeTest {
@Test
public void testOneOfPropertyChangeGeneratedBuilder() {
final Builder builder = TestAllTypes.newBuilder().setOneofString("hello");
assertThat(builder.getOneofFieldCase(), is(equalTo(OneofFieldCase.ONEOF_STRING)));
final TestAllTypes.NestedMessage.Builder innerBuilder = builder.getOneofNestedMessageBuilder();
assertThat(builder.getOneofFieldCase(), is(equalTo(OneofFieldCase.ONEOF_NESTED_MESSAGE)));
builder.setOneofString("hello again");
assertThat(builder.getOneofFieldCase(), is(equalTo(OneofFieldCase.ONEOF_STRING)));
/**
* when oneof's field builder property changes, either set oneof to this
* field, or raise exception when other field set on this oneof already!
* At present, the builder allows any of its mutations with no effect on its parent!
*/
innerBuilder.setBb(999);
// the following should not fail, but it does!
assertThat(builder.getOneofFieldCase(), is(equalTo(OneofFieldCase.ONEOF_NESTED_MESSAGE)));
}
}
This repo only has a tag for 2.6.0. I was looking for prior versions so I could do some diffs. Presumably the source is here, but would you mind adding tags for the older versions?
Hi guys,
We have some customizations built on top of protoc 2.4... in fact we have our own svn->git import of protoc's version history up to r495. Of course, the SHAs don't match so our history is completely divergent from what's in github now. That's fine, we can work around that, but we'd like to bring in changes up to 2.5.0 before trying to make the jump to 2.6.0. Any chance you guys could bring in the commits on the 2.5.0 svn tag line, and tag 2.5.0 in git?
Thanks,
Scott
Scenario:
test1.proto:
message Message1 {
extensions 1 to max;
}
There is a possibility that this is a Java ProtoFileDescriptorSet.parseFrom() bug, though. I didn't have time to check.
Cheers,
David
_net_proto2___python appears to no longer exist
./google/protobuf/stubs/platform_macros.h:61:2:
#error Host architecture was not detected as supported by protobuf
Why? There should be a portable fallback for platforms that are not "zomg asm optimized".
In google/protobuf/unittest_custom_options.proto, the message VariousComplexOptions has been serialized differently, though compatibly, in Java and by protoc:
message_type {
name: "VariousComplexOptions"
options {
7595468 {
7593951: 24
}
7633546: "\b\263\017"
7636463: "\b\t"
7636463: "\023\030\026\024"
7636949: "\020\333\a"
7636949: "\370\346\227\035\216\005"
7636949: "\n\003\b\347\005"
7636949: "\n\006\330\205\236\035\317\017"
7636949: "\n\b\222\365\235\035\003\b\330\017"
7636949: "\302\254\227\035\003\b\345\005"
7636949: "\302\254\227\035\006\330\205\236\035\316\017"
7636949: "\302\254\227\035\b\222\365\235\035\003\b\311\020"
7636949: "\032\003\b\301\002"
7636949: "\"\002\be"
7636949: "\"\003\b\324\001"
7646756: "\b*"
7646756: "\330\205\236\035\304\002"
7646756: "\222\365\235\035\003\b\354\006"
7646756: " c"
7646756: " X"
}
}
message_type {
name: "VariousComplexOptions"
options {
7595468 {
7593951: 24
}
7633546: "\b\263\017"
7636463: "\b\t\023\030\026\024"
7636949: "\n\021\b\347\005\222\365\235\035\003\b\330\017\330\205\236\035\317\017\020\333\a\032\003\b\301\002\"\002\be\"\003\b\324\001\302\254\227\035\021\b\345\005\222\365\235\035\003\b\311\020\330\205\236\035\316\017\370\346\227\035\216\005"
7646756: "\b* c X\222\365\235\035\003\b\354\006\330\205\236\035\304\002"
}
}
Both will result in the same extensions; however, protoc's serialized/reserialized protos will differ as a result. This violates the reserialization invariant and is troublesome, as the proto is the main means of comparing FileDescriptors.
Either change protoc with regard to building the UnknownFieldSet for a proto's custom options, or change the Java end.
At present, to find whether the MessageOrBuilder is empty one needs to call the following:
messageOrBuilder.getUnknownFields().asMap().isEmpty()
&& messageOrBuilder.getAllFields().isEmpty();
It is cumbersome and expensive (getAllFields() in many cases recreates the fields' map!).
However, to implement this within MessageOrBuilder, all that is needed is a call to the internal map's isEmpty() or similar.
There are so many valid and common cases for an emptiness check, done often and in many places, that this API and performance optimization is crucial, IMHO.
There are two implementation alternatives:
Either implementation alternative will also solve the "is this MessageOrBuilder the default?" question, which is currently resolved differently for GeneratedMessage and DynamicMessage.
Alternative 1 will require casting to these basic classes, so it is not without cons.
This request is not extremely urgent, so it could be implemented any time after 2.6.1.
Here are logs showing what happens when I run: Ctrl+Shift+B (Build Solution) and build libprotobuf.
A commonly occurring error is:
error C2146: syntax error : missing ';' before identifier 'ThreadCache'
At this point in the code:
static __thread ThreadCache thread_cache_;
javax.annotation.Generated has existed since Java 6 (see the relevant javadoc), so it should be safe to generate source code that contains it. The presence of this annotation has no impact on the runtime characteristics of the annotated classes.
When this annotation is present, it greatly helps in configuring different tools related to code analysis, tests, and test coverage.
Both the outer Java class and all inner builder and message classes and interfaces should be annotated with javax.annotation.Generated.
I think it is long overdue to have the separate
This would leverage the language-specific tools, libraries, and domains, allowing for better, easier, and faster maintenance and evolution.
In addition, regarding Java, it is time to drop Java 5 and, maybe, Java 6 support, making a truly portable Maven project with modern tests and updated source.
It is very easy to do; the separation could be done within hours! The benefits, IMHO, are huge.
Cheers,
David
Here is what I get while doing Maven install, excerpt:
[WARNING] Javadoc Warnings
[WARNING] src\main\java\com\google\protobuf\ByteString.java:711: warning - End Delimiter } missing for possible See Tag in comment string: "Creates an {@code InputStream} which can be used to read the bytes.
[WARNING] <p>
[WARNING] The {@link InputStream} returned by this method is guaranteed to be
[WARNING] completely non-blocking. The method {@link InputStream#available()}
[WARNING] returns the number of bytes remaining in the stream. The methods
[WARNING] {@link InputStream#read(byte[]), {@link InputStream#read(byte[],int,int)}
[WARNING] and {@link InputStream#skip(long)} will read/skip as many bytes as are
[WARNING] available.
[WARNING] <p>
[WARNING] The methods in the returned {@link InputStream} might <b>not</b> be
[WARNING] thread safe."
[WARNING] \src\main\java\com\google\protobuf\ByteString.java:509: warning - @link{#copyTo(byte[],int,int,int}. is an unknown tag.
[WARNING] \src\main\java\com\google\protobuf\ByteString.java:509: warning - @code{numberToCopy is an unknown tag.
[WARNING] \src\main\java\com\google\protobuf\Internal.java:275: warning - Tag @link: can't find equals() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:311: warning - Tag @link: can't find equals() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:324: warning - Tag @link: can't find equals() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:245: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:299: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:288: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:355: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:341: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:256: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:264: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
[WARNING] \src\main\java\com\google\protobuf\Internal.java:236: warning - Tag @link: can't find hashCode() in com.google.protobuf.MessageLite
It is crucial to know whether an ExtensionRegistry is empty, for a number of reasons, but it is impossible at the moment. The registry could have been produced by ExtensionRegistry.getEmptyRegistry(), by ExtensionRegistry.newInstance(), by ExtensionRegistry.newInstance().getUnmodifiable(), and so forth.
However, there is no access to the underlying map or its size, and there is also no equals().
I've run into a problem when using Python 3.4 and protobuf 2.6.0.
I get several errors including:
protobuf/python/google/protobuf/internal/decoder.py, line 167
_DecodeVarint = _VarintDecoder((1 << 64) - 1, long)
NameError: name 'long' is not defined
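The root cause, independent of protobuf: the long built-in was removed in Python 3, so any reference to it raises NameError. A common 2/3-compatibility shim (an illustration, not the actual decoder.py fix) aliases it to int:

```python
# Under Python 3, the name `long` no longer exists; define it as `int`.
try:
    long
except NameError:
    long = int  # Python 3's int already has arbitrary precision

# The decoder's 64-bit mask computation now works on both major versions:
mask = (1 << 64) - 1
assert long(mask) == 18446744073709551615
print("mask type:", type(long(mask)).__name__)
```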
I'm seeing the following warnings when building code with all warnings enabled. The diff at the end fixes them. The compiler output is:
In file included from test.pb.cc:5:
In file included from test.pb.h:23:
In file included from google/protobuf/message.h:120:
google/protobuf/descriptor.h:1283:21: error: unused parameter 'filename' [-Werror,-Wunused-parameter]
const string& filename, // File name in which the error occurred.
^
google/protobuf/descriptor.h:1284:21: error: unused parameter 'element_name' [-Werror,-Wunused-parameter]
const string& element_name, // Full name of the erroneous element.
^
google/protobuf/descriptor.h:1285:22: error: unused parameter 'descriptor' [-Werror,-Wunused-parameter]
const Message* descriptor, // Descriptor of the erroneous element.
^
google/protobuf/descriptor.h:1286:21: error: unused parameter 'location' [-Werror,-Wunused-parameter]
ErrorLocation location, // One of the location constants, above.
^
google/protobuf/descriptor.h:1287:21: error: unused parameter 'message' [-Werror,-Wunused-parameter]
const string& message // Human-readable error message.
^
diff src/google/protobuf/message.h /tmp/protobuf/src/google/protobuf/message.h
392,393c392,393
< virtual bool HasOneof(const Message& /*message*/,
< const OneofDescriptor* /*oneof_descriptor*/) const {
---
> virtual bool HasOneof(const Message& message,
> const OneofDescriptor* oneof_descriptor) const {
397,398c397,398
< virtual void ClearOneof(Message* /*message*/,
< const OneofDescriptor* /*oneof_descriptor*/) const {}
---
> virtual void ClearOneof(Message* message,
> const OneofDescriptor* oneof_descriptor) const {}
403,404c403,404
< const Message& /*message*/,
< const OneofDescriptor* /*oneof_descriptor*/) const {
---
> const Message& message,
> const OneofDescriptor* oneof_descriptor) const {
The distribution protobuf-2.6.0.zip (downloaded from the site) has a java/pom.xml which, at line 151, specifies the wrong version, 2.5.0! Here is the line:
com.google.protobuf;version=2.5.0
IMHO, this is a major bug!
Cheers,
David
due to use of INT32_MAX / INT32_MIN, which are non-standard under C99.
At commit 2c9392f the version-info was changed from 8:0:0 to 9:0:0. Was this necessary? Were interfaces changed or deleted? If not, this causes unnecessary churn on Linux distros.
See https://www.gnu.org/software/libtool/manual/html_node/Updating-version-info.html.
Since google-apputils was added as a dependency in 2.6.0, we have a dependency conflict on python-dateutil: google-apputils needs 'python-dateutil>=1.4,<2', but the latest public version is 2.2, which is needed and installed by other packages.
The current workaround is to downgrade python-dateutil to 1.5, but that will probably break something sooner or later. Any chance to loosen the requirements a bit?
Thanks
I have two message types:
message Record {
optional string foo = 1;
optional int32 bar = 2;
extensions 100 to max;
}
message SpecializedRecord {
extend Record {
optional SpecializedRecord parent = 100;
}
optional uint32 special_thing = 1;
}
I'm trying to use reflection to access special_thing by name in C++. First I retrieve parent, and then get the extension message to do reflection on it. I'm having trouble getting the extension from the original message.
This works:
const Reflection* record_reflection = record.GetReflection();
vector<const FieldDescriptor*> fields;
record_reflection->ListFields(record, &fields);
const FieldDescriptor* extension_field = 0;
for (int i = 0; i < fields.size(); ++i) {
if (fields.at(i)->name() == "parent") {
extension_field = fields.at(i);
}
}
CHECK_NOTNULL(extension_field);
This doesn't work:
const Descriptor* record_descriptor = record.GetDescriptor();
const FieldDescriptor* extension_field = record_descriptor->FindFieldByName("parent");
CHECK_NOTNULL(extension_field);
Why?
Visual Studio raises a warning when compiling wire_format_lite_inl.h:383:
for (int i = 0; i < new_entries; ++i) {
Since new_entries is a uint32, I believe the line should instead be:
for (uint32 i = 0; i < new_entries; ++i) {
Hi,
When protobuf is patched to expose the generic atomicops implementation on Clang as in PR #21, the following warnings are generated:
./google/protobuf/stubs/atomicops_internals_generic_gcc.h:86:32: warning: memory order argument to atomic operation is invalid [-Watomic-memory-ordering]
__atomic_store_n(ptr, value, __ATOMIC_ACQUIRE);
^~~~~~~~~~~~~~~~
./google/protobuf/stubs/atomicops_internals_generic_gcc.h:102:31: warning: memory order argument to atomic operation is invalid [-Watomic-memory-ordering]
return __atomic_load_n(ptr, __ATOMIC_RELEASE);
^~~~~~~~~~~~~~~~
According to the gcc documentation at https://gcc.gnu.org/onlinedocs/gcc-4.9.1/gcc/_005f_005fatomic-Builtins.html, these arguments are indeed invalid: __atomic_store_n() cannot be passed __ATOMIC_ACQUIRE, and __atomic_load_n() cannot be passed __ATOMIC_RELEASE.
I asked about this on IRC and got the following response from someone knowledgeable about atomic memory ordering.
"release + load (what the function name indicates) usually isn't a very interesting combination of behaviours."
"I doubt it has many users in protobuf?"
"if you want to get rid of the warnings you can probably add a explicit __atomic_thread_fence(__ATOMIC_RELEASE)
and use __ATOMIC_RELAXED
in the actually intended atomic."
"Alternatively just promote them to SEQ_CST
;)"
Indeed, a git grep of the protobuf repository shows that there aren't actually any users of the Acquire_Store() and Release_Load() functions.
Java's FileDescriptor#internalUpdateFileDescriptor() is very useful; however, it violates FileDescriptor's finality and is therefore a source of potentially difficult-to-catch bugs and security threats. Its "internal" name, along with the JavaDoc warning, doesn't change this fact.
Please don't remove/hide this public method! Instead, introduce seal() and isSealed() methods. Internally, it could be just one boolean isSealed field which, when true, will prevent any updates, including internalUpdateFileDescriptor(), and will also vouch for FileDescriptor's finality.
Here is the excerpt from Descriptors$FieldDescriptor constructor:
private FieldDescriptor(final FieldDescriptorProto proto,
final FileDescriptor file,
final Descriptor parent,
final int index,
final boolean isExtension)
throws DescriptorValidationException {
this.index = index;
this.proto = proto;
fullName = computeFullName(file, parent, proto.getName());
this.file = file;
if (proto.hasType()) {
type = Type.valueOf(proto.getType());
}
if (getNumber() <= 0) {
throw new DescriptorValidationException(this,
"Field numbers must be positive integers.");
}
// Only repeated primitive fields may be packed.
if (proto.getOptions().getPacked() && !isPackable()) {
...
public boolean isPackable() {
return isRepeated() && getLiteType().isPackable();
}
public WireFormat.FieldType getLiteType() {
return table[type.ordinal()];
}
However, if there is an enum or message proto field with an unresolved type (which is okay during cross-linking), then isPackable() will NPE, because type == null in getLiteType()!
This is a serious bug, preventing the build of ANY protos with enum or message fields that have only the TypeName set. protoc does its own cross-linking, so its Type fields are always set; anything else will fail miserably!
The resolution is simple: add type != null to the condition, and, if desired, check again after cross-linking:
if ((type != null) && proto.getOptions().getPacked() && !isPackable()) {
Thanks in advance, and I hope this will be released soon!
Regards,
David
When I install the protobuf Python lib following the README's instructions, 'pyext/cpp_message.py' is never included in the egg.
I just did:
python setup.py build --cpp_implementation
python setup.py install --cpp_implementation
After several tries, I manually listed the file like this in setup.py:
py_modules = [
'google.protobuf.internal.api_implementation',
'google.protobuf.internal.containers',
'google.protobuf.internal.cpp_message',
'google.protobuf.internal.decoder',
'google.protobuf.internal.encoder',
'google.protobuf.internal.enum_type_wrapper',
'google.protobuf.internal.message_listener',
'google.protobuf.internal.python_message',
'google.protobuf.internal.type_checkers',
'google.protobuf.internal.wire_format',
'google.protobuf.pyext.cpp_message',
'google.protobuf.descriptor',
'google.protobuf.descriptor_pb2',
'google.protobuf.compiler.plugin_pb2',
'google.protobuf.message',
'google.protobuf.descriptor_database',
'google.protobuf.descriptor_pool',
'google.protobuf.message_factory',
'google.protobuf.reflection',
'google.protobuf.service',
'google.protobuf.service_reflection',
'google.protobuf.symbol_database',
'google.protobuf.text_encoding',
'google.protobuf.text_format'],
and now it works well with the cpp option.
Are there any special install options I didn't know about, or any instructions I missed?
Thanks.
google/protobuf/stubs/atomicops_internals_solaris.h: No such file or directory
The file has not been included in protobuf-2.6.1.tar.gz
So after:
./configure LDFLAGS=-L$PWD/src/solaris --disable-64bit-solaris --prefix=/alocation/protobuf32-2.6.1
it fails to make.
( uname -a -> SunOS mxcapd11 5.10 Generic_150401-05 i86pc i386 i86pc )
For example:
// Protocol Buffers - Google's data interchange format
// Copyright 2008 Google Inc. All rights reserved.
// http://code.google.com/p/protobuf/
http://code.google.com/p/protobuf/ should be replaced with either https://developers.google.com/protocol-buffers/ or https://github.com/google/protobuf.
Protobuf's 2.6.0 Java library has not been uploaded to the Maven Central repository yet. Can someone please upload it? Thanks!
Unless I'm mistaken, these projects no longer exist?
The recent change in 2.6.0 that partially allows repeated custom options, as seen in google/protobuf/unittest_custom_options.proto, feels like, and IMHO is, a hack for allowing repeated fields in custom options.
Therefore, the right thing to do is either disallow repeated option fields as before, or fully allow the repeated label on option fields.
If both are allowed, re-parsing the FileDescriptorProto in FileDescriptor.buildFrom should result in either a repeated or an optional field/extension being set on the options.
Here is the google/protobuf/unittest_custom_options.proto excerpt:
// Note that we try various different ways of naming the same extension.
message VariousComplexOptions {
option (.protobuf_unittest.complex_opt1).foo = 42;
option (protobuf_unittest.complex_opt1).(.protobuf_unittest.quux) = 324;
option (.protobuf_unittest.complex_opt1).(protobuf_unittest.corge).qux = 876;
option (protobuf_unittest.complex_opt1).foo4 = 99;
option (protobuf_unittest.complex_opt1).foo4 = 88;
option (complex_opt2).baz = 987;
option (complex_opt2).(grault) = 654;
option (complex_opt2).bar.foo = 743;
option (complex_opt2).bar.(quux) = 1999;
option (complex_opt2).bar.(protobuf_unittest.corge).qux = 2008;
option (complex_opt2).(garply).foo = 741;
option (complex_opt2).(garply).(.protobuf_unittest.quux) = 1998;
option (complex_opt2).(protobuf_unittest.garply).(corge).qux = 2121;
option (ComplexOptionType2.ComplexOptionType4.complex_opt4).waldo = 1971;
option (complex_opt2).fred.waldo = 321;
option (complex_opt2).barney = { waldo: 101 };
option (complex_opt2).barney = { waldo: 212 };
option (protobuf_unittest.complex_opt3).qux = 9;
option (complex_opt3).complexoptiontype5.plugh = 22;
option (complexopt6).xyzzy = 24;
}
And here is the same FileDescriptorProto, as seen by TextFormat:
message_type {
name: "VariousComplexOptions"
options {
7595468 {
7593951: 24
}
7633546: "\b\263\017"
7636463: "\b\t"
7636463: "\023\030\026\024"
7636949: "\020\333\a"
7636949: "\370\346\227\035\216\005"
7636949: "\n\003\b\347\005"
7636949: "\n\006\330\205\236\035\317\017"
7636949: "\n\b\222\365\235\035\003\b\330\017"
7636949: "\302\254\227\035\003\b\345\005"
7636949: "\302\254\227\035\006\330\205\236\035\316\017"
7636949: "\302\254\227\035\b\222\365\235\035\003\b\311\020"
7636949: "\032\003\b\301\002"
7636949: "\"\002\be"
7636949: "\"\003\b\324\001"
7646756: "\b*"
7646756: "\330\205\236\035\304\002"
7646756: "\222\365\235\035\003\b\354\006"
7646756: " c"
7646756: " X"
}
}
I believe in 2.5 it was (if one disabled protoc 2.5's error reporting):
message_type {
name: "VariousComplexOptions"
options {
7595468 {
7593951: 24
}
7633546: "\b\263\017"
7636463: "\023\030\026\024"
7636949: "\"\003\b\324\001"
7646756: " X"
}
}
In addition, I think the whole concept of custom options being defined in the same proto as the messages of that proto is self-referencing in a bad way, and requires extra processing for the sake of nothing. Whoever wants custom options can define their own format within a custom string-valued option; alternatively, protobuf could provide a standard message type for custom options, fully defined in the standard descriptor.proto, resulting in much less processing, much less confusion, and a much clearer and cleaner specification.
I intend to expand on this thought/critique in a separate request/issue.
Regards,
David
When objects with the same name are defined in two separate .proto files, the above error is thrown only when using the C++ backend for Python.
In https://github.com/google/protobuf/wiki/Third-Party-Add-ons, please update the link for rpcz to
https://github.com/thesamet/rpcz
Thanks,
Nadav
Consider the following enum defined within a .proto file:
enum CoreDataScope {
local = 1;
file = 2;
global = 3;
invalid = 4;
}
Protoc.exe then generates the following Python code (showing just a snippet):
///////
_COREDATASCOPE = descriptor.EnumDescriptor(
name='CoreDataScope',
full_name='SIES.Meta.CoreDataScope',
filename=None,
file=DESCRIPTOR,
values=[
descriptor.EnumValueDescriptor(
name='local', index=0, number=1,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='file', index=1, number=2,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='global', index=2, number=3,
options=None,
type=None),
descriptor.EnumValueDescriptor(
name='invalid', index=3, number=4,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=12851,
serialized_end=12918,
)
local = 1
file = 2
global = 3
invalid = 4
///////
The above fails in Python due to the use of the reserved keyword "global". I think this is a defect in the Python code generator (protoc.exe).
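The clash can be checked with Python's standard keyword module. This is a minimal sketch (the name list is copied from the CoreDataScope enum above, not generated code):

```python
import keyword

# Enum value names taken from the CoreDataScope enum in the .proto file.
names = ["local", "file", "global", "invalid"]

# "global" is a Python reserved word, so the generated module-level
# assignment `global = 3` raises a SyntaxError when the _pb2 module
# is imported; "local", "file", and "invalid" are merely shadowing
# built-ins, which is legal.
clashes = [n for n in names if keyword.iskeyword(n)]
print(clashes)  # ['global']
```

A possible workaround, assuming you control the .proto file, is to rename the clashing values (e.g. SCOPE_GLOBAL = 3); otherwise the generator would have to mangle names that collide with Python keywords.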
Lots of applications want a stream of protobuf messages in a file or a network stream.
It could be as simple as exposing the internal utility functions to write a varint to a stream. An application could then write a varint length prefix and then the blob of serialized protobuf.
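The framing described above can be sketched in plain Python. This is a hedged sketch, not part of the protobuf Python API (the Java library exposes similar framing as writeDelimitedTo/parseDelimitedFrom); the function names here are made up, and the payloads stand in for msg.SerializeToString() output:

```python
import io

def write_varint(stream, value):
    """Write an unsigned base-128 varint (protobuf wire format)."""
    while True:
        bits = value & 0x7F
        value >>= 7
        if value:
            stream.write(bytes([bits | 0x80]))  # high bit set: more groups follow
        else:
            stream.write(bytes([bits]))
            return

def read_varint(stream):
    """Read an unsigned base-128 varint from the stream."""
    result = shift = 0
    while True:
        b = stream.read(1)
        if not b:
            raise EOFError("truncated varint")
        result |= (b[0] & 0x7F) << shift
        if not b[0] & 0x80:
            return result
        shift += 7

def write_delimited(stream, payload):
    """Length-prefix one serialized message blob."""
    write_varint(stream, len(payload))
    stream.write(payload)

def read_delimited(stream):
    """Read back one length-prefixed message blob."""
    size = read_varint(stream)
    data = stream.read(size)
    if len(data) != size:
        raise EOFError("truncated message")
    return data

buf = io.BytesIO()
write_delimited(buf, b"first")
write_delimited(buf, b"second")
buf.seek(0)
print(read_delimited(buf), read_delimited(buf))  # b'first' b'second'
```

Because each message carries its own length prefix, a reader can recover message boundaries from a file or socket without any out-of-band framing, which is exactly what the issue asks the library to make easy.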