
protobuf.js


Protocol Buffers are a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols, data storage, and more, originally designed at Google.

protobuf.js is a pure JavaScript implementation with TypeScript support for node.js and the browser. It's easy to use, blazingly fast and works out of the box with .proto files!

Installation

node.js

$> npm install protobufjs [--save --save-prefix=~]
var protobuf = require("protobufjs");

The command line utility lives in the protobufjs-cli package and must be installed separately:

$> npm install protobufjs-cli [--save --save-prefix=~]

Note that this library's versioning scheme is not semver-compatible for historical reasons. For guaranteed backward compatibility, always depend on ~6.A.B instead of ^6.A.B (hence the --save-prefix above).
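
For example, a package.json dependency entry following this scheme would look like the sketch below, with A and B replaced by the actual minor and patch version:

    "dependencies": {
      "protobufjs": "~6.A.B"
    }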

Browsers

Development:

<script src="//cdn.jsdelivr.net/npm/[email protected]/dist/protobuf.js"></script>

Production:

<script src="//cdn.jsdelivr.net/npm/[email protected]/dist/protobuf.min.js"></script>

Remember to replace the version tag with the exact release your project depends upon.

The library supports CommonJS and AMD loaders and also exports globally as protobuf.

Distributions

Where bundle size is a factor, there are additional stripped-down versions of the full library (~19kb gzipped) available that exclude certain functionality:

  • When working with JSON descriptors (i.e. generated by pbjs) and/or reflection only, see the light library (~16kb gzipped) that excludes the parser. CommonJS entry point is:

    var protobuf = require("protobufjs/light");
  • When working with statically generated code only, see the minimal library (~6.5kb gzipped) that also excludes reflection. CommonJS entry point is:

    var protobuf = require("protobufjs/minimal");
Distribution   Location
Full           https://cdn.jsdelivr.net/npm/protobufjs/dist/
Light          https://cdn.jsdelivr.net/npm/protobufjs/dist/light/
Minimal        https://cdn.jsdelivr.net/npm/protobufjs/dist/minimal/

Usage

Because JavaScript is a dynamically typed language, protobuf.js introduces the concept of a valid message in order to provide the best possible performance (and, as a side product, proper typings):

Valid message

A valid message is an object (1) not missing any required fields and (2) exclusively composed of JS types understood by the wire format writer.

There are two possible types of valid messages and the encoder is able to work with both of these for convenience:

  • Message instances (explicit instances of message classes with default values on their prototype) always (have to) satisfy the requirements of a valid message by design and
  • Plain JavaScript objects that just so happen to be composed in a way satisfying the requirements of a valid message as well.

In a nutshell, the wire format writer understands the following types:

Field type       Expected JS type (create, encode)    Conversion (fromObject)
s-/u-/int32      number (32 bit integer)              value | 0 if signed
s-/fixed32                                            value >>> 0 if unsigned
s-/u-/int64      Long-like (optimal)                  Long.fromValue(value) with long.js
s-/fixed64       number (53 bit integer)              parseInt(value, 10) otherwise
float            number                               Number(value)
double
bool             boolean                              Boolean(value)
string           string                               String(value)
bytes            Uint8Array (optimal)                 base64.decode(value) if a string
                 Buffer (optimal under node)          Object with non-zero .length is
                 Array.<number> (8 bit integers)      assumed to be buffer-like
enum             number (32 bit integer)              Looks up the numeric id if a string
message          Valid message                        Message.fromObject(value)
  • Explicit undefined and null are considered as not set if the field is optional.
  • Repeated fields are Array.<T>.
  • Map fields are Object.<string,T> with the key being the string representation of the respective value or an 8 characters long binary hash string for Long-likes.
  • Types marked as optimal provide the best performance because no conversion step (i.e. number to low and high bits or base64 string to buffer) is required.
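
For example, both kinds of valid message can be handed to the encoder (a minimal sketch using the AwesomeMessage type from the examples further below):

    var asInstance = AwesomeMessage.create({ awesomeField: "hello" }); // message instance
    var asPlain    = { awesomeField: "hello" };                        // plain object that happens to be valid

    var buf1 = AwesomeMessage.encode(asInstance).finish();
    var buf2 = AwesomeMessage.encode(asPlain).finish();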

Toolset

With that in mind and again for performance reasons, each message class provides a distinct set of methods with each method doing just one thing. This avoids unnecessary assertions / redundant operations where performance is a concern but also forces a user to perform verification (of plain JavaScript objects that might just so happen to be a valid message) explicitly where necessary - for example when dealing with user input.

Note that Message below refers to any message class.

  • Message.verify(message: Object): null|string
    verifies that a plain JavaScript object satisfies the requirements of a valid message and thus can be encoded without issues. Instead of throwing, it returns the error message as a string, if any.

    var payload = "invalid (not an object)";
    var err = AwesomeMessage.verify(payload);
    if (err)
      throw Error(err);
  • Message.encode(message: Message|Object [, writer: Writer]): Writer
    encodes a message instance or valid plain JavaScript object. This method does not implicitly verify the message and it's up to the user to make sure that the payload is a valid message.

    var buffer = AwesomeMessage.encode(message).finish();
  • Message.encodeDelimited(message: Message|Object [, writer: Writer]): Writer
    works like Message.encode but additionally prepends the length of the message as a varint.
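
    For example, mirroring the encode example above (a sketch):

    var buffer = AwesomeMessage.encodeDelimited(message).finish();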

  • Message.decode(reader: Reader|Uint8Array): Message
    decodes a buffer to a message instance. If required fields are missing, it throws a util.ProtocolError with an instance property set to the so far decoded message. If the wire format is invalid, it throws an Error.

    try {
      var decodedMessage = AwesomeMessage.decode(buffer);
    } catch (e) {
        if (e instanceof protobuf.util.ProtocolError) {
          // e.instance holds the so far decoded message with missing required fields
        } else {
          // wire format is invalid
        }
    }
  • Message.decodeDelimited(reader: Reader|Uint8Array): Message
    works like Message.decode but additionally reads the length of the message prepended as a varint.
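
    For example (a sketch; buffer holds a length-prefixed message as produced by encodeDelimited):

    var decodedMessage = AwesomeMessage.decodeDelimited(buffer);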

  • Message.create(properties: Object): Message
    creates a new message instance from a set of properties that satisfy the requirements of a valid message. Where applicable, it is recommended to prefer Message.create over Message.fromObject because it doesn't perform possibly redundant conversion.

    var message = AwesomeMessage.create({ awesomeField: "AwesomeString" });
  • Message.fromObject(object: Object): Message
    converts any non-valid plain JavaScript object to a message instance using the conversion steps outlined within the table above.

    var message = AwesomeMessage.fromObject({ awesomeField: 42 });
    // converts awesomeField to a string
  • Message.toObject(message: Message [, options: ConversionOptions]): Object
    converts a message instance to an arbitrary plain JavaScript object for interoperability with other libraries or storage. The resulting plain JavaScript object might still satisfy the requirements of a valid message depending on the actual conversion options specified, but most of the time it does not.

    var object = AwesomeMessage.toObject(message, {
      enums: String,  // enums as string names
      longs: String,  // longs as strings (requires long.js)
      bytes: String,  // bytes as base64 encoded strings
      defaults: true, // includes default values
      arrays: true,   // populates empty arrays (repeated fields) even if defaults=false
      objects: true,  // populates empty objects (map fields) even if defaults=false
      oneofs: true    // includes virtual oneof fields set to the present field's name
    });

For reference, the following diagram aims to display relationships between the different methods and the concept of a valid message:

Toolset Diagram

In other words: verify indicates that calling create or encode directly on the plain object will succeed (and, respectively, result in a valid message). fromObject, on the other hand, performs conversion from a broader range of plain objects to create valid messages.

Examples

Using .proto files

It is possible to load existing .proto files using the full library, which parses and compiles the definitions to ready-to-use (reflection-based) message classes:

// awesome.proto
syntax = "proto3";
package awesomepackage;

message AwesomeMessage {
    string awesome_field = 1; // becomes awesomeField
}
protobuf.load("awesome.proto", function(err, root) {
    if (err)
        throw err;

    // Obtain a message type
    var AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage");

    // Exemplary payload
    var payload = { awesomeField: "AwesomeString" };

    // Verify the payload if necessary (i.e. when possibly incomplete or invalid)
    var errMsg = AwesomeMessage.verify(payload);
    if (errMsg)
        throw Error(errMsg);

    // Create a new message
    var message = AwesomeMessage.create(payload); // or use .fromObject if conversion is necessary

    // Encode a message to a Uint8Array (browser) or Buffer (node)
    var buffer = AwesomeMessage.encode(message).finish();
    // ... do something with buffer

    // Decode a Uint8Array (browser) or Buffer (node) to a message
    var message = AwesomeMessage.decode(buffer);
    // ... do something with message

    // If the application uses length-delimited buffers, there is also encodeDelimited and decodeDelimited.

    // Maybe convert the message back to a plain object
    var object = AwesomeMessage.toObject(message, {
        longs: String,
        enums: String,
        bytes: String,
        // see ConversionOptions
    });
});

Additionally, promise syntax can be used by omitting the callback, if preferred:

protobuf.load("awesome.proto")
    .then(function(root) {
       ...
    });
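
Since load returns a promise when the callback is omitted, async/await can be used as well (a minimal sketch):

async function main() {
    var root = await protobuf.load("awesome.proto");
    var AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage");
    // ... continue as in the callback example above
}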

Using JSON descriptors

The library utilizes JSON descriptors that are equivalent to a .proto definition. For example, the following is identical to the .proto definition seen above:

// awesome.json
{
  "nested": {
    "awesomepackage": {
      "nested": {
        "AwesomeMessage": {
          "fields": {
            "awesomeField": {
              "type": "string",
              "id": 1
            }
          }
        }
      }
    }
  }
}

JSON descriptors closely resemble the internal reflection structure:

Type (T)           Extends            Type-specific properties
ReflectionObject                      options
Namespace          ReflectionObject   nested
Root               Namespace          nested
Type               Namespace          fields
Enum               ReflectionObject   values
Field              ReflectionObject   rule, type, id
MapField           Field              keyType
OneOf              ReflectionObject   oneof (array of field names)
Service            Namespace          methods
Method             ReflectionObject   type, requestType, responseType, requestStream, responseStream
  • Bold properties are required. Italic types are abstract.
  • T.fromJSON(name, json) creates the respective reflection object from a JSON descriptor
  • T#toJSON() creates a JSON descriptor from the respective reflection object (its name is used as the key within the parent)

Exclusively using JSON descriptors instead of .proto files enables the use of just the light library (the parser isn't required in this case).

A JSON descriptor can either be loaded the usual way:

protobuf.load("awesome.json", function(err, root) {
    if (err) throw err;

    // Continue at "Obtain a message type" above
});

Or it can be loaded inline:

var jsonDescriptor = require("./awesome.json"); // exemplary for node

var root = protobuf.Root.fromJSON(jsonDescriptor);

// Continue at "Obtain a message type" above

Using reflection only

Both the full and the light library include full reflection support. One could, for example, define the .proto definitions seen in the examples above using just reflection:

...
var Root  = protobuf.Root,
    Type  = protobuf.Type,
    Field = protobuf.Field;

var AwesomeMessage = new Type("AwesomeMessage").add(new Field("awesomeField", 1, "string"));

var root = new Root().define("awesomepackage").add(AwesomeMessage);

// Continue at "Create a new message" above
...

Detailed information on the reflection structure is available within the API documentation.

Using custom classes

Message classes can also be extended with custom functionality and it is also possible to register a custom constructor with a reflected message type:

...

// Define a custom constructor
function AwesomeMessage(properties) {
    // custom initialization code
    ...
}

// Register the custom constructor with its reflected type (*)
root.lookupType("awesomepackage.AwesomeMessage").ctor = AwesomeMessage;

// Define custom functionality
AwesomeMessage.customStaticMethod = function() { ... };
AwesomeMessage.prototype.customInstanceMethod = function() { ... };

// Continue at "Create a new message" above

(*) Besides referencing its reflected type through AwesomeMessage.$type and AwesomeMessage#$type, the respective custom class is automatically populated with:

  • AwesomeMessage.create
  • AwesomeMessage.encode and AwesomeMessage.encodeDelimited
  • AwesomeMessage.decode and AwesomeMessage.decodeDelimited
  • AwesomeMessage.verify
  • AwesomeMessage.fromObject, AwesomeMessage.toObject and AwesomeMessage#toJSON

Afterwards, decoded messages of this type are instanceof AwesomeMessage.

Alternatively, it is also possible to reuse and extend the internal constructor if custom initialization code is not required:

...

// Reuse the internal constructor
var AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage").ctor;

// Define custom functionality
AwesomeMessage.customStaticMethod = function() { ... };
AwesomeMessage.prototype.customInstanceMethod = function() { ... };

// Continue at "Create a new message" above

Using services

The library also supports consuming services but it doesn't make any assumptions about the actual transport channel. Instead, a user must provide a suitable RPC implementation, which is an asynchronous function that takes the reflected service method, the binary request and a node-style callback as its parameters:

function rpcImpl(method, requestData, callback) {
    // perform the request using an HTTP request or a WebSocket for example
    var responseData = ...;
    // and call the callback with the binary response afterwards:
    callback(null, responseData);
}

Below is a working example of a TypeScript implementation using the grpc npm package.

const grpc = require('grpc')

const Client = grpc.makeGenericClientConstructor({})
const client = new Client(
  grpcServerUrl,
  grpc.credentials.createInsecure()
)

const rpcImpl = function(method, requestData, callback) {
  client.makeUnaryRequest(
    method.name,
    arg => arg,
    arg => arg,
    requestData,
    callback
  )
}

Example:

// greeter.proto
syntax = "proto3";

service Greeter {
    rpc SayHello (HelloRequest) returns (HelloReply) {}
}

message HelloRequest {
    string name = 1;
}

message HelloReply {
    string message = 1;
}
...
var Greeter = root.lookup("Greeter");
var greeter = Greeter.create(/* see above */ rpcImpl, /* request delimited? */ false, /* response delimited? */ false);

greeter.sayHello({ name: 'you' }, function(err, response) {
    console.log('Greeting:', response.message);
});

Services also support promises:

greeter.sayHello({ name: 'you' })
    .then(function(response) {
        console.log('Greeting:', response.message);
    });

There is also an example for streaming RPC.

Note that the service API is meant for clients. Implementing a server-side endpoint pretty much always requires transport channel (i.e. http, websocket, etc.) specific code with the only common denominator being that it decodes and encodes messages.
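
For illustration only, here is a minimal sketch of such a server-side endpoint over plain HTTP using node's built-in http module, reusing the greeter definitions above (the port and the absence of routing are arbitrary choices):

var http = require("http");
var protobuf = require("protobufjs");

protobuf.load("greeter.proto", function(err, root) {
    if (err) throw err;
    var HelloRequest = root.lookupType("HelloRequest");
    var HelloReply   = root.lookupType("HelloReply");

    http.createServer(function(req, res) {
        var chunks = [];
        req.on("data", function(chunk) { chunks.push(chunk); });
        req.on("end", function() {
            // the transport-specific part: collect the body, then decode / handle / encode
            var request = HelloRequest.decode(Buffer.concat(chunks));
            var reply   = HelloReply.create({ message: "Hello, " + request.name });
            res.end(HelloReply.encode(reply).finish());
        });
    }).listen(8080);
});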

Usage with TypeScript

The library ships with its own type definitions and modern editors like Visual Studio Code will automatically detect and use them for code completion.

The npm package depends on @types/node because of Buffer and @types/long because of Long. If you are not building for node and/or not using long.js, it should be safe to exclude them manually.
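
For example, assuming a standard tsconfig.json setup, one way to exclude them manually is to whitelist the automatically included @types packages (a sketch, not the only option):

{
  "compilerOptions": {
    "types": [] // only packages listed here are picked up automatically
  }
}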

Using the JS API

The API shown above works pretty much the same with TypeScript. However, because everything is typed, accessing fields on instances of dynamically generated message classes requires either using bracket-notation (i.e. message["awesomeField"]) or explicit casts. Alternatively, it is possible to use a typings file generated for its static counterpart.
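
For instance (a sketch; message is an instance of a dynamically resolved message class as in the example below):

const viaBrackets = message["awesomeField"];       // bracket notation
const viaCast     = (message as any).awesomeField; // explicit cast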

import { load } from "protobufjs"; // respectively "./node_modules/protobufjs"

load("awesome.proto", function(err, root) {
  if (err)
    throw err;

  // example code
  const AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage");

  let message = AwesomeMessage.create({ awesomeField: "hello" });
  console.log(`message = ${JSON.stringify(message)}`);

  let buffer = AwesomeMessage.encode(message).finish();
  console.log(`buffer = ${Array.prototype.toString.call(buffer)}`);

  let decoded = AwesomeMessage.decode(buffer);
  console.log(`decoded = ${JSON.stringify(decoded)}`);
});

Using generated static code

If you generated static code to bundle.js using the CLI and its type definitions to bundle.d.ts, then you can just do:

import { AwesomeMessage } from "./bundle.js";

// example code
let message = AwesomeMessage.create({ awesomeField: "hello" });
let buffer  = AwesomeMessage.encode(message).finish();
let decoded = AwesomeMessage.decode(buffer);

Using decorators

The library also includes an early implementation of decorators.

Note that decorators are an experimental feature in TypeScript and that declaration order is important depending on the JS target. For example, @Field.d(2, AwesomeArrayMessage) requires that AwesomeArrayMessage has been defined earlier when targeting ES5.

import { Message, Type, Field, OneOf } from "protobufjs/light"; // respectively "./node_modules/protobufjs/light.js"

export class AwesomeSubMessage extends Message<AwesomeSubMessage> {

  @Field.d(1, "string")
  public awesomeString: string;

}

export enum AwesomeEnum {
  ONE = 1,
  TWO = 2
}

@Type.d("SuperAwesomeMessage")
export class AwesomeMessage extends Message<AwesomeMessage> {

  @Field.d(1, "string", "optional", "awesome default string")
  public awesomeField: string;

  @Field.d(2, AwesomeSubMessage)
  public awesomeSubMessage: AwesomeSubMessage;

  @Field.d(3, AwesomeEnum, "optional", AwesomeEnum.ONE)
  public awesomeEnum: AwesomeEnum;

  @OneOf.d("awesomeSubMessage", "awesomeEnum")
  public which: string;

}

// example code
let message = new AwesomeMessage({ awesomeField: "hello" });
let buffer  = AwesomeMessage.encode(message).finish();
let decoded = AwesomeMessage.decode(buffer);

Supported decorators are:

  • Type.d(typeName?: string)   (optional)
    annotates a class as a protobuf message type. If typeName is not specified, the constructor's runtime function name is used for the reflected type.

  • Field.d<T>(fieldId: number, fieldType: string | Constructor<T>, fieldRule?: "optional" | "required" | "repeated", defaultValue?: T)
    annotates a property as a protobuf field with the specified id and protobuf type.

  • MapField.d<T extends { [key: string]: any }>(fieldId: number, fieldKeyType: string, fieldValueType: string | Constructor<{}>)
    annotates a property as a protobuf map field with the specified id, protobuf key and value type.

  • OneOf.d<T extends string>(...fieldNames: string[])
    annotates a property as a protobuf oneof covering the specified fields.

Other notes:

  • Decorated types reside in protobuf.roots["decorated"] using a flat structure, so no duplicate names.
  • Enums are copied to a reflected enum with a generic name on decorator evaluation because referenced enum objects have no runtime name the decorator could use.
  • Default values must be specified as arguments to the decorator instead of using a property initializer for proper prototype behavior.
  • Property names on decorated classes must not be renamed on compile time (i.e. by a minifier) because decorators just receive the original field name as a string.

ProTip! Not as pretty, but you can use decorators in plain JavaScript as well.

Additional documentation

Protocol Buffers

protobuf.js

Community

Performance

The package includes a benchmark that compares protobuf.js performance to native JSON (as far as this is possible) and Google's JS implementation. On an i7-2600K running node 6.9.1 it yields:

benchmarking encoding performance ...

protobuf.js (reflect) x 541,707 ops/sec ±1.13% (87 runs sampled)
protobuf.js (static) x 548,134 ops/sec ±1.38% (89 runs sampled)
JSON (string) x 318,076 ops/sec ±0.63% (93 runs sampled)
JSON (buffer) x 179,165 ops/sec ±2.26% (91 runs sampled)
google-protobuf x 74,406 ops/sec ±0.85% (86 runs sampled)

   protobuf.js (static) was fastest
  protobuf.js (reflect) was 0.9% ops/sec slower (factor 1.0)
          JSON (string) was 41.5% ops/sec slower (factor 1.7)
          JSON (buffer) was 67.6% ops/sec slower (factor 3.1)
        google-protobuf was 86.4% ops/sec slower (factor 7.3)

benchmarking decoding performance ...

protobuf.js (reflect) x 1,383,981 ops/sec ±0.88% (93 runs sampled)
protobuf.js (static) x 1,378,925 ops/sec ±0.81% (93 runs sampled)
JSON (string) x 302,444 ops/sec ±0.81% (93 runs sampled)
JSON (buffer) x 264,882 ops/sec ±0.81% (93 runs sampled)
google-protobuf x 179,180 ops/sec ±0.64% (94 runs sampled)

  protobuf.js (reflect) was fastest
   protobuf.js (static) was 0.3% ops/sec slower (factor 1.0)
          JSON (string) was 78.1% ops/sec slower (factor 4.6)
          JSON (buffer) was 80.8% ops/sec slower (factor 5.2)
        google-protobuf was 87.0% ops/sec slower (factor 7.7)

benchmarking combined performance ...

protobuf.js (reflect) x 275,900 ops/sec ±0.78% (90 runs sampled)
protobuf.js (static) x 290,096 ops/sec ±0.96% (90 runs sampled)
JSON (string) x 129,381 ops/sec ±0.77% (90 runs sampled)
JSON (buffer) x 91,051 ops/sec ±0.94% (90 runs sampled)
google-protobuf x 42,050 ops/sec ±0.85% (91 runs sampled)

   protobuf.js (static) was fastest
  protobuf.js (reflect) was 4.7% ops/sec slower (factor 1.0)
          JSON (string) was 55.3% ops/sec slower (factor 2.2)
          JSON (buffer) was 68.6% ops/sec slower (factor 3.2)
        google-protobuf was 85.5% ops/sec slower (factor 6.9)

These results are achieved by

  • generating type-specific encoders, decoders, verifiers and converters at runtime
  • configuring the reader/writer interface according to the environment
  • using node-specific functionality where beneficial and, of course
  • avoiding unnecessary operations through splitting up the toolset.

You can also run the benchmark ...

$> npm run bench

and the profiler yourself (the latter requires a recent version of node):

$> npm run prof <encode|decode|encode-browser|decode-browser> [iterations=10000000]

Note that as of this writing, the benchmark suite performs significantly slower on node 7.2.0 compared to 6.9.1 because moths.

Compatibility

  • Works in all modern and not-so-modern browsers except IE8.
  • Because the internals of this package do not rely on google/protobuf/descriptor.proto, options are parsed and presented literally.
  • If typed arrays are not supported by the environment, plain arrays will be used instead.
  • Support for pre-ES5 environments (except IE8) can be achieved by using a polyfill.
  • Support for Content Security Policy-restricted environments (like Chrome extensions without unsafe-eval) can be achieved by generating and using static code instead.
  • If a proper way to work with 64 bit values (uint64, int64 etc.) is required, just install long.js alongside this library. All 64 bit numbers will then be returned as a Long instance instead of a possibly unsafe JavaScript number (see the sketch after this list).
  • For descriptor.proto interoperability, see ext/descriptor
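
As an illustration of the long.js point above (a sketch; bigField is a hypothetical uint64 field):

var message = AwesomeMessage.decode(buffer);
// with long.js installed, 64 bit fields decode to Long instances:
console.log(message.bigField.toString());  // exact decimal representation
console.log(message.bigField.low, message.bigField.high, message.bigField.unsigned);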

Building

To build the library or its components yourself, clone it from GitHub and install the development dependencies:

$> git clone https://github.com/dcodeIO/protobuf.js.git
$> cd protobuf.js
$> npm install

Building the respective development and production versions with their respective source maps to dist/:

$> npm run build

Building the documentation to docs/:

$> npm run docs

Building the TypeScript definition to index.d.ts:

$> npm run types

Browserify integration

By default, protobuf.js integrates into any browserify build-process without requiring any optional modules. Hence:

  • If int64 support is required, explicitly require the long module somewhere in your project as it will be excluded otherwise. This assumes that a global require function is present that protobuf.js can call to obtain the long module.

    If there is no global require function present after bundling, it's also possible to assign the long module programmatically:

    var Long = ...;
    
    protobuf.util.Long = Long;
    protobuf.configure();
  • If you have any special requirements, there is the bundler for reference.

License: BSD 3-Clause License


protobuf.js's Issues

Certain Long values cause "Error: Illegal field id in Message .Foo#decode: 0"

Thanks for your useful library.

I think I might have found a bug:

Given this file: "uint64-bug.proto":

message Foo {
    optional uint32 a = 2;
    required uint32 b = 3;
    required uint64 c = 4;
    required uint64 d = 5;
}

... and this one: "uint64-bug.html":

<html>
<head>

    <script src="http://raw.github.com/dcodeIO/Long.js/master/Long.js"></script>
    <script src="http://raw.github.com/dcodeIO/ByteBuffer.js/master/ByteBuffer.js"></script>
    <script src="http://raw.github.com/dcodeIO/ProtoBuf.js/master/ProtoBuf.js"></script>

    <script type="text/javascript">

        var builder = dcodeIO.ProtoBuf.protoFromFile("uint64-bug.proto");
        var fooCls = builder.build("Foo");
        var foo = new fooCls(2, 0, dcodeIO.Long.fromString("1368057600000"), dcodeIO.Long.fromString("1235455123"));
        var fooEncoded = foo.encode();
        var fooDecoded = fooCls.decode(fooEncoded);

    </script>
</head>
<body></body>
</html>

When opening the html file, the following javascript error occurs:

Error: Illegal field id in Message .Foo#decode: 0

If the second long value is changed from 1235455123 to 123 then the error does not occur.

This occurs in at least Firefox 20.0 and Chrome 26.0.1410.65 on OSX 10.8.3.

Putting into a common namespace two .proto files that both import the same third file throws an error

Consider the following three files:

  • example1.proto
import "example3.proto";

message Test1 {
    required int32 a = 1;
}
  • example2.proto
import "example3.proto";

message Test2 {
  required string b = 2;
}
  • example3.proto
message Test3 {
  required string b = 2;
}

Trying to put example1.proto and example2.proto into a common namespace throws the following:

> var builder = ProtoBuf.protoFromFile("./example1.proto");
undefined
> ProtoBuf.protoFromFile("./example2.proto", builder)
Error: Duplicate name in namespace Namespace : Test3
    at ProtoBuf.Reflect.Namespace.addChild (E:\Downloads\node_modules\protobufjs\ProtoBuf.js:1192:27)
    at ProtoBuf.Builder.Builder.create (E:\Downloads\node_modules\protobufjs\ProtoBuf.js:2530:42)
    at ProtoBuf.Builder.Builder.import (E:\Downloads\node_modules\protobufjs\ProtoBuf.js:2590:26)
    at ProtoBuf.Builder.Builder.import (E:\Downloads\node_modules\protobufjs\ProtoBuf.js:2622:43)
    at Object.ProtoBuf.protoFromString (E:\Downloads\node_modules\protobufjs\ProtoBuf.js:2757:34)
    at Object.ProtoBuf.protoFromFile (E:\Downloads\node_modules\protobufjs\ProtoBuf.js:2790:53)
    at repl:1:11
    at REPLServer.self.eval (repl.js:110:21)
    at Interface.<anonymous> (repl.js:239:12)
    at Interface.EventEmitter.emit (events.js:95:17)

What's wrong with this buffer? (how to decode a protobuf buffer by hand)

Trying to decode:

0a 0d 08 f9 27 12 02 4f 4b 18 8a 8c 06 20 4e

with this message:

message BuyInResponse {
   enum Code {
        OK = 0;
        ERROR = 1;
        AUTH_ERROR = 2;
    }

    repeated PaymentResponseElement response = 1;
}
message PaymentResponseElement {
    optional int64 pnPaymentId = 1;
    optional string messageCode =2;
    optional int64 balanceAfterTransaction = 3;
    optional int32 version = 4;
}

Getting this error:
Error: Illegal wire type for field Message.Field.core.comm.message.int2s.PaymentResponseElement.messageCode: 2 (0 expected)

type bytes.

Hello!
I have this field in my proto file:

optional bytes samples = 130;

I create the object

var msg = new Message({
    samples: '55'
});

then encode it and send it: socket.send(msg.toArrayBuffer());
then receive it: msg = Message.decode(evt.data);
And get msg.samples = c { array=ArrayBuffer, view=DataView, offset=10, ...
How can I get the data (55) out of samples?

Add support for /* comments */

This is a feature request to add support for /* */ style comments. Note that these are supported by protoc, even if the protobuf language guide doesn't seem to mention them.

I have added a unit test for this request in my pull request.

Cheers.

Messages skip "required" fields (=null) if not present in decoded buffer

I found that when the buffer contains fewer fields than the message declares,
the code below will return a message with the missing fields set to null,
which won't verify.
I am new to this lib, am I missing something?

            Message.prototype.decode = function(buffer, length) {
                length = length || -1;
                var start = buffer.offset;
                var msg = new (this.built)();
                while (buffer.offset < start+length || (length == -1 && buffer.remaining() > 0)) {
                    var tag = buffer.readVarint32();
                    var wireType = tag & 0x07,
                        id = tag >> 3;
                    var field = this.getChild(id); // Message.Field only
                    if (!field) {
                        throw(new Error("Illegal field id in "+this.toString(true)+"#decode: "+id));
                    }
                    if (field.repeated && !field.options["packed"]) {
                        msg.add(field.name, field.decode(wireType, buffer));
                    } else {
                        msg.set(field.name, field.decode(wireType, buffer));
                    }
                }
                return msg;
            };

Decoding of messages does not work - only in Node 0.8

The library works fine for me in Node 0.10.x
With Node 0.8.x, there are problems while decoding a message.

--> Very unfortunately, appfog supports only Node 0.8, see
https://docs.appfog.com/languages/node

Error message (only in Node 0.8) is:
...src\node_modules\protobufjs\node_modules\bytebuffer\ByteBuffer.js:164
throw(new Error("Cannot wrap buffer of type "+typeof(buffer)))
^
Error: Cannot wrap buffer of type object
at Function.ByteBuffer.wrap (...src\node_modules\protobufjs\node_modules\bytebuffer\ByteBuffer.js:164:23)
at Function.ProtoBuf.Reflect.Message.build.Message.decode (...src\node_modules\protobufjs\ProtoBuf.js:1513:95)
at ...src\webserver\gtfsClients.js:28:66
at fs.readFile (fs.js:176:14)
at Object.oncomplete (fs.js:297:15)

Running my code (exactly the same one) with Node 0.10 does not cause any problems like this.

int64 fields turn into Objects

I'm trying to encode some objects, but int64 fields insist on turning into Objects with high, low and unsigned properties. The server is not expecting int64 in this format. I have already removed Long.js.

Question! How to encode repeated fields?

message BalanceRequest {
repeated BalanceRequestElement request = 1;
}

message BalanceRequestElement {
optional int32 code = 1;
optional int64 playerId = 2;
optional int32 nCode = 3;
}

I have to send a BalanceRequest, and it has a child, BalanceRequestElement, which must have these properties: code = 88, playerId = 99, nCode = 66

How do I construct this object before encoding? Sorry, I've tried every possible way... but:

Top level enums

First of all, great library! It's great to see a 100% js alternative to the protobuf-for-node library.

I'm working with a .proto file that has top level enum definitions that are shared among two different messages. ProtoBuf.js claims "Illegal top level declaration", but Google's protoc compiles it just fine. I'm somewhat new to protobufs, but I would think if Google's protoc was ok with it, then it would be a good idea to follow suit.

Floats/Double in nested messages are encoded or decoded incorrectly

Hi again,

Thanks for all the support.

Here is a test case for this issue:

bad-floats.proto:

message Foo {
    required Bar bar = 1;
}
message Bar {
    required float baz = 1;
}

bad-floats.html:

<html>
<head>

    <script src="http://raw.github.com/dcodeIO/Long.js/master/Long.js"></script>
    <script src="http://raw.github.com/dcodeIO/ByteBuffer.js/master/ByteBuffer.js"></script>
    <script src="http://raw.github.com/dcodeIO/ProtoBuf.js/master/ProtoBuf.js"></script>

    <script type="text/javascript">

        var builder = dcodeIO.ProtoBuf.protoFromFile("bad-floats.proto");
        var fooCls = builder.build("Foo");
        var barCls = builder.build("Bar");

        var foo = new fooCls(new barCls(4));
        var fooEncoded = foo.encode();
        var fooDecoded = fooCls.decode(fooEncoded);

        console.log("before: " + foo.bar.baz + ", after: " + fooDecoded.bar.baz);

    </script>

</head>
<body></body>
</html>

The html file above prints this to the console when running in Firefox 20 on OS X:

before: 4, after: 2.0553e-320

This is wrong - after should be 4 too.

This issue seems to afflict both float and double type fields, but only fields inside nested messages. Float fields in the top level message do not seem to be affected.

Precompile to pre-built class

I imagine it would be non-trivial to implement, but I was wondering if it would be possible to extend the proto2js compiler to return what builder.build would return, instead of wrapping the JSON with a call to builder.build.

This would make it possible to avoid building the class each time the app is run, as well as give users the option of using a build of protobuf.js that doesn't have the builder, as all classes could be pre-built.

ProtoBuf.js as JSON "template"

I'm interested in using proto files as JSON templates :)
like data from [1:1,2:'First Last',3:'[email protected]']
parsed against ...
message Person {
required int32 id = 1;
required string name = 2;
optional string email = 3;
}

.. so something like

var person = Person.jsonImport( [1:1,2:'First Last',3:'[email protected]'] ) ;

so, is this somehow already possible to hack in? :)

Illegal top level declaration for message definitions with trailing semicolon

Hi,

the following message definition parses with protoc but fails with ProtoBuf.js:

message Test
{
optional uint32 test = 1 [default = 0];
};

Test code:

var ProtoBuf = require("protobufjs");
var builder = ProtoBuf.protoFromFile("test.proto");

Output:

/usr/lib/node_modules/protobufjs/ProtoBuf.js:578
                        throw(new Error("Illegal top level declaration: "+toke
                              ^
Error: Illegal top level declaration: ;
    at ProtoBuf.DotProto.Parser.Parser.parse (/usr/lib/node_modules/protobufjs/ProtoBuf.js:578:31)
    at Object.ProtoBuf.protoFromString (/usr/lib/node_modules/protobufjs/ProtoBuf.js:2671:33)
    at Object.ProtoBuf.protoFromFile (/usr/lib/node_modules/protobufjs/ProtoBuf.js:2713:53)
    at Object.<anonymous> (/tmp/test.js:3:24)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)

Removing the semicolon after the closing brace of the message definition fixes this issue.
Google does not specify if the semicolon is allowed in the language guide but protoc does not error on such definitions.

Extra fields in a message causes decode to throw an error

I'm working with a protobuf schema which is changing over time; additional optional fields are being added to the message definitions, but the parser doesn't seem to be able to handle them.

The Google Protobuf Language Guide (https://developers.google.com/protocol-buffers/docs/proto#updating) explains "messages created by your new code can be parsed by your old code: old binaries simply ignore the new field when parsing." I understand that there is no requirement for this library to obey these guidelines but it would be nice if it could handle this case instead of falling over with an exception.

I'm using ProtoBuf.noparse.js and the error is coming from line 1012: throw(new Error("Illegal field id in "+this.toString(true)+"#decode: "+id));

Is this something that could be added?

Thanks

Check for NaN in number types ?

Here is a small problem I ran into, maybe the checking for numbers needs to be a bit stricter ?

protomsgs = """
package vlm;
message Vec3 {
  required float x = 1;
  required float y = 2;
  required float z = 3;
}
"""
pb = require 'protobufjs'
builder = pb.protoFromString protomsgs

Vec3 = builder.build 'vlm.Vec3'
console.log Vec3

v1 = new Vec3 x: 1, y: 2, z: 'a'
console.log v1

produces

> coffee test.coffee
{ [Function: Vec3] decode: [Function] }
{ x: 1, y: 2, z: NaN }

Hope the coffeescript is self explanatory.

Thanks,
Drew

_

I might be wrong, I'll double check it first.

pull protobuf via HTTP GET?

Apologies if this is a naive question -- I'm not very familiar with either JS or the logic of HTTP GET requests.

I'm trying to pull protobuf data from a server via an HTTP get (implemented via restler). I would like to decode the data as directly as possible. I've already used the builder to load the local .proto file (GTFS-RT, as it happens -- I discovered by brute force the small modifications that I needed to make, noted in another Issue).

But when I give the result of the 'get' to Message.decode, it sees a string and tries to wrap it assuming the UTF-8 format. As a result, any of the incoming bytes that are greater than 127 are interpreted as an "unknown character" and get changed to 65533.

I suspect this is not really a Protobuf.js issue, and that I'm just missing something about how to pull or input the data stream. But I'd appreciate a hint.

Cannot encode field of type "bytes"

The if statement at line 1530 of ProtoBuf.js is (wrongly) satisfied and causes an exception. The modified code below works, at least for encoding "bytes", but I don't understand what the "if" statement is supposed to do.

                if (this.type != ProtoBuf.TYPES["message"] && !(ProtoBuf.Long && value instanceof ProtoBuf.Long) && value instanceof Object) {
                    console.log("1531",this.type);
                    if (this.type != ProtoBuf.TYPES["bytes"] ) {
                        throw(new Error("Illegal value for "+this.toString(true)+": "+value+" (is object)"));
                    };
                }

Encode/decode error when a child message have a long string data

message A
{
.... // some fields
message B
{
... // some fields
"str": "long string"; // over about 70 bytes
}
repeated B b = x;
}

// code:

var buffer = a.encode();
//-----------------------------------------------------------
//decode
var a2= A.decode(buffer); // throw an Exception

If you might need my code and .proto file, please feel free to contact me [email protected]
I like this module. Thanks!

Parsing protos with string and default="" fails

Hi,

I am trying to use a proto file that contains string and/or bytes fields with a default value of "".
Unfortunately this fails, as reproducible with this test:

test.proto:

message Test
{
optional string test = 1 [default = ""];
}

Called from Node:

var ProtoBuf = require("protobufjs");
var builder = ProtoBuf.protoFromFile("test.proto");

Output:

/usr/lib/node_modules/protobufjs/ProtoBuf.js:2484
                                throw(new Error("Not a valid message or enum d
                                      ^
Error: Not a valid message or enum definition: {"name":"Test","fields":[{"rule":"optional","type":"string","name":"test","id":1,"options":{"default":""}}],"enums":[],"messages":[],"options":{}}
    at ProtoBuf.Builder.Builder.create (/usr/lib/node_modules/protobufjs/ProtoBuf.js:2484:39)
    at Object.ProtoBuf.protoFromString (/usr/lib/node_modules/protobufjs/ProtoBuf.js:2673:21)
    at Object.ProtoBuf.protoFromFile (/usr/lib/node_modules/protobufjs/ProtoBuf.js:2712:53)
    at Object.<anonymous> (/tmp/test.js:3:24)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)

Removing default="" or using default="something" works, but I wonder if it is intended that default="" fails when the original protoc does not yield an error in that case.

Support for Extensions

I have a bunch of protos I want to use in JavaScript but they use extensions (and I can't change them) so having this support would mean I could use this library instead of hand-rolling everything. That would be nice, as this looks like an excellent library. Cheers.

Message: Iterate over fields / Retrieve keys

Currently I don't see a way to iterate over a message or retrieve all of its keys. They need to be known.
Is this intended by design to prevent a dirty coding style or is it just not implemented yet?

var msg = Message.decode(data);

// Something like this...
msg.keys().forEach(function(key) {
  console.log(key, msg.get(key));
});
// ...or this...
msg.keys.forEach(function(key) {
  console.log(key, msg.get(key));
});
// ...or this would be handy.
Object.keys(msg).forEach(function(key) {
  console.log(key, msg[key]);
});

Why not support 'extend'?

I'm hitting an issue when using protobuf.js as a protoc plugin under node.js. The proto parser ignores the extend blocks, so the descriptor.proto meta data used to parse the inbound plugin request message (via stdin) is ignoring the extended field options.

I'm wondering if extend could be supported so that the meta-data is extended as expected? I thought I would ask before attempting to make a change and sending a PR.

Thanks

INTERNAL ERROR when set value for int64 field

my message:
message pbStartCombat {
optional int64 id = 1;
optional int32 type = 2;
}

when i call:
var startCombat = new StartCombat();
startCombat.setId(1);
or
var longVar = new dcodeIO.Long(0x00000000, 0x00000001);
startCombat.setId(longVar);

get the same error:
ProtoBuf.min.js:40:Error: [INTERNAL ERROR] Illegal value for Message.Field .pb_55.pbStartCombat.id: 1 (undefined

After I change my message to:
message pbStartCombat {
optional int32 id = 1;
optional int32 type = 2;
}

and call
var startCombat = new StartCombat();
startCombat.setId(1);

It works fine;

repeated message fields breaking the parser

Using a repeated message field inside another message crashes the parser. I'm currently not sure if the error happens while decoding or encoding, but I'll dive deeper into it.

The following sample code triggers the issue under 0.9.9 from npm, using node.js.

test.proto

message Outer {
  repeated Inner inner = 1;
}
message Inner {
  optional uint32 inner_value = 1;
}

test.js

var ProtoBuf = require('protobufjs'),
    builder = ProtoBuf.protoFromFile('./test.proto'),
    Outer = builder.build('Outer')

var outer = new Outer({
  inner: [
    { inner_value: 1 },
    { inner_value: 2 }
  ]
})

Outer.decode(outer.encode())

Use camelCase for object properties

A Message object defines getXYZ and setXYZ methods for each field converted to camelCase – however, constructing a Message object from an object literal is much cleaner than repeatedly calling setters on an empty Message, and using getters offers almost no advantage over accessing the object's properties directly. Consequently, code that uses ProtoBuf.js ends up with a lot of snake_case identifiers, which look quite foreign in JavaScript.

Since this would be a breaking change, it would probably be a good idea to support both camelCase and snake_case for a while. For example, a Message constructor could attempt to convert each key into snake_case, and a Message object could have a set of accessors defined through Object.defineProperty. I would recommend using camelCase for the "default" set, since this would result in a nicer console.log output and I can't imagine a situation where this change would break any code.

On a semi-related note, camelCase is used for properties by the other protobuf module, so making this change might encourage more people to switch.

Question about socket communication

Hi. I have a question

I'm trying to use ProtoBuf.js for socket communication

A dgram socket only accepts Buffer objects.

So, to convert, I used the encode() method followed by toBuffer().

However, I don't know the reversal process.

I tried many methods but could not get the original JSON object back.

Thanks a lot for your work

Negative enum value throws an error

I have an enumeration that looks like this:

    enum LobbyType {
        INVALID = -1;
        MATCH = 0;
        PRACTICE = 1;
        TOURNAMENT = 2;
        COOP_BOT_MATCH = 4;
        TEAM_MATCH = 5;
        SOLO_QUEUE_MATCH = 6;
    }

Parsing it throws the following error:

Error: Illegal enum value id in enum LobbyType: -1

Protobuf explicitly allows negative enumerator values, quoting their Language Guide:

Enumerator constants must be in the range of a 32-bit integer. Since enum values use varint encoding on the wire, negative values are inefficient and thus not recommended.

Google Dart

What are the chances of getting a version of this for google dart?

How to use ProtoBuf in Cocos2d-jsb?

I use ProtoBuf (required files: ByteBuffer.min.js, Long.min.js, ProtoBuf.min.js) in cocos2d-jsb, and requiring ByteBuffer.min.js errors, needing other files (buffer, process).

Please help me: how do I use ProtoBuf in Cocos2d-jsb?

toArrayBuffer question

Hello!
Here is my test code.

var test = new protoObject({
      "id": 130,
      "name": "abc"
});
var buff = test.encode(); 
 var arrayBuffer = buff.toArrayBuffer();
var _uint16Buffer = new Uint16Array(arrayBuffer);

but I get an error when I do new Uint16Array; the log says that arrayBuffer is an invalid argument. Why is arrayBuffer an invalid argument? If I do var _uint8Array = new Uint8Array(arrayBuffer) instead, it is OK. What's wrong here? Isn't arrayBuffer an ArrayBuffer? And if it is, why can't I construct a Uint16Array from it?

Here is my second test code.

var array = new Uint8Array(arrayBuffer);
var str = String.fromCharCode.apply(null, array);
I need to change arrayBuffer to a string and send that string to the server, but the server receives wrong data (the string is right, but the int is wrong).
For example, when I send this struct to the server
var test = new protoObject({
     "id": 130,
     "name": "abc"
}) 

the server receives this struct:
id: 16706
name: "abc"

The id isn't right, but the string data is OK.
Why?

toplevel enums with imports still fail.

import "toplevel.proto";

package My;

enum MyEnum1 {
    ONE = 1;
    TWO = 2;
}

message Test1 {
    required MyEnum num = 0 [default=ONE];
    required MyEnum1 num1 = 1 [default=ONE];
}

"Capacity problem" - or problem in Protobuf logic

Running ProtoBuf.js in real life with real data sometimes results in an error message:

Cannot read uint8 from ByteBuffer(offset=642,markedOffset=-1,length=644,capacity
=644): Capacity overflow.

The problem clearly depends on the data being handled. Anyhow, the reason and the solution unfortunately remain open.

importing inner package protos leads to duplicate names

Consider these .proto files. Relevant are all files starting with playlist4. They depend on each other, so they import each other and use the same namespace.

When trying to build them, everything starts crashing down on the first import statement.
The import-less files (playlist4meta.proto and playlist4issues.proto) work well. But as soon as files are loaded, that depend on other files in their package I get errors like Error: Duplicate name in namespace Namespace .spotify.playlist4.proto: ListChecksum.

ListChecksum is defined in playlist4meta.proto. Upon importing it in playlist4ops.proto, which intends to use exactly this type, it is redeclared and provokes the error.

Is this a bug or am I just too stupid?

Some background information: I'm trying to update @TooTallNate's node-spotify-web to use your great ProtoBuf.js.

Workaround for eval?

I'm working on a Chrome Packaged App that uses this lib, and apparently Chrome does not allow eval to be run from inside their new Chrome Packaged Apps.

var Message = eval("0, (function "+T.name+"() { ProtoBuf.Builder.Message.call(this); this.__construct.apply(this, arguments); })");
// Any better way to create a named function? This is so much nicer for debugging with util.inspect()

I've tried a few different solutions, new Function(...) is also viewed as an eval, and I haven't had much luck with other attempts.

I'm basically stuck here, any ideas for a work-around?

Validating against a proto file

This is more of a question than an issue.

I am currently able to read a proto file and apply it against a byte stream to populate an object that the byte stream represents.

What I am wondering is: does ProtoBuf.js provide an API to view the data structure the protobuf file represents, so that I can highlight which fields the byte stream did not actually populate?

For example, I have a message

    message Test {
        optional string name = 1;
        optional Child child = 2;
    }

  message Child {
        optional string name = 1;
    }

And I chose to populate 'child' but not 'name' in the byte stream - is it possible to get ProtoBuf.js to return an object that contains:

  • The field 'name', the fact that it is a string, and in addition that it has not been populated by the byte stream
  • The field child, the fact that it is a Child type and what it was populated with.

Feature Request: IE 8 Support

I have been attempting to use ProtoBuf.js in IE 8, without much success. I notice that the demo app for ProtoBuf.js, http://www.esoccer.me/#game also doesn't seem to work in IE 8.

The first problem I discovered is that ProtoBuf.js contains this line:

var Message = eval("(function "+T.name+"() { this.__construct.apply(this, arguments); })");

... which apparently IE doesn't like: It can be fixed by changing it to:

var Message = eval("0, (function "+T.name+"() { this.__construct.apply(this, arguments); })");

(see http://stackoverflow.com/questions/7201502/javascript-eval-behavior-in-ie8).

The second problem is determining which set of shims to use. I am currently using https://github.com/inexorabletash/polyfill/blob/master/polyfill.js and https://github.com/inexorabletash/polyfill/blob/master/typedarray.js but I am running into issues that I do not yet understand. This combination of shims does work in IE 9.

If ProtoBuf.js could somehow be made to work on IE 8, that would be great, but I understand if you don't want to support such a PITA. If I do discover how to make it work, I will post back here.

parser only and support for IE browsers(before IE10)

This is a request to add a lite version of ProtoBuf.js which contains the parser only and supports IE browsers.
Sometimes one may just need to parse a .proto file, like me.
I removed all the encode/decode functions and added some fixes to support IE browsers (IE9 and before). Tested in IE8, and it works great.
So, would it be OK to add this parser-only version?

does not work with gtfs-realtime.proto

Google defined the so-called GTFS-realtime interface in 2011.
https://developers.google.com/transit/gtfs-realtime/

The interface is implemented using protocol buffers. The proto file can be found here:
https://developers.google.com/transit/gtfs-realtime/gtfs-realtime-proto

I try to use ProtoBuf.js to work with this proto file:

var ProtoBuf = require("protobufjs");
var builder = ProtoBuf.protoFromFile("gtfs-realtime.proto");

This causes an error:
...ProtoBuf.js:572
throw(new Error("Illegal top level declaration: "+toke
^
Error: Illegal top level declaration: syntax
at ProtoBuf.DotProto.Parser.Parser.parse (..ProtoBuf.js:572:31)
at Object.ProtoBuf.protoFromString (..ProtoBuf.js:2592:33)
at Object.ProtoBuf.protoFromFile (..ProtoBuf.js:2634:53)
at Object. (..test.js:6:24)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)

About int64 support

Hi, thanks for the great job. This is exactly what I need. I wonder, is there a plan for int64 to be implemented in the future?

Some way to reflect on a message's fields

Given a message class, it would be useful to know programmatically which fields it has, what their types are and whether they are optional.

Currently, I think this is possible to accomplish through namespace children, but this is not documented and probably not meant to be used externally.

decoding messages from express + socket.io under node

I am probably doing something stupid but I cannot decode even the tiniest bit.

serverside:

var Foo = protobuff.protoFromFile('Foo.proto').build('Foo');
var foo = new Foo('bar');

// ...
console.log(foo.toArrayBuffer() instanceof ArrayBuffer); // true
socket.emit('data', foo.toArrayBuffer());

clientside

socket.on('data', function(data){
    console.log(data, data instanceof ArrayBuffer); // { props... }, false 
    Foo.decode(data); 
    // throws Error: Cannot wrap buffer of type object, Object
});

I have also tried sending / attempting to decode via:

socket.emit('data', foo.encode());
// also... 
bb = new ByteArray();
foo.encode(bb);
socket.emit('data', bb);
socket.emit('data', bb.toHex());

all to no avail. I realise this is not lossless over the socket message - so how else can I coerce the foo object on the server to be sent and decoded on the client? I only need this so I can prototype some proto parsing and extensions...

Incorrect require statement

ProtoBuf.js:32 reads:

ByteBuffer = require("ByteBuffer");

I believe it should be:

ByteBuffer = require("bytebuffer");

This fixes the package when used with browserify.

Treating int64 as string when decoding. (Question)

I'm decoding some messages, and when the .proto message field is of int64 type, the number becomes { low, high, unsigned }. So far so good. Knowing that JavaScript has no native support for int64, I don't intend to do any operations with this number in Node.js. But I need to save this number as it was originally; it can be a string or a numeric sequence. Can you tell me how I can change ProtoBuf.js so that, when decoding, it treats int64 as a string? Is it possible? Where can I change this particular behavior?

About imports implemention

As package "some" define in some.proto import package "common" define in common.proto
May it be used like this:
var commonBuilder = ProtoBuf.protoFromFile("common.proto");
var someBuilder = ProtoBuf.protoFromFile("some.proto", commonBuilder);
var Some = someBuilder.build("some");
var some = new some() ...
