
pyrolite's Introduction

Pyrolite - Pyro5 client library for Java and .NET


Pyrolite is written by Irmen de Jong ([email protected]). This software is distributed under the terms written in the file LICENSE.

Introduction: Pyro

This library allows your Java or .NET program to interface very easily with a Python program, using the Pyro protocol to call methods on remote objects (see https://github.com/irmen/Pyro5).

Pyrolite only implements a part of the client side Pyro library, hence its name 'lite'... For the full Pyro experience (and the ability to host servers and expose these via Pyro) you have to run Pyro itself in Python. But if you don't need Pyro's full feature set, and don't require your Java/.NET code to host Pyro objects but rather only call them, Pyrolite could be a good choice to connect Java or .NET and Python!

Pyro4?

If you are still using Pyro4 and want to use Pyrolite with it, you should stick to an older version of this library (versions 4.xx). The current 5.xx version only supports Pyro5.

On GitHub, use the pyro4-legacy branch.

Installation and usage

Precompiled libraries are available from Maven Central (Java) and NuGet (.NET).

Some Java example code:

import net.razorvine.pyro.*;

NameServerProxy ns = NameServerProxy.locateNS(null);
PyroProxy remoteobject = new PyroProxy(ns.lookup("Your.Pyro.Object"));
Object result = remoteobject.call("pythonmethod", 42, "hello", new int[]{1,2,3});
String message = (String)result;  // cast to the type that 'pythonmethod' returns
System.out.println("result message="+message);
remoteobject.close();
ns.close();
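
If the object's URI is already known (for example because the Pyro daemon printed it), you can also skip the name server and connect to it directly. A minimal sketch, assuming the same Java API as above; the URI below is just a placeholder:

import net.razorvine.pyro.*;

// placeholder URI, as printed by the Python daemon that registered the object
PyroProxy remoteobject = new PyroProxy(new PyroURI("PYRO:your.object.id@localhost:9999"));
Object result = remoteobject.call("pythonmethod", 42);
remoteobject.close();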

Some C# example code:

using Razorvine.Pyro;

using( NameServerProxy ns = NameServerProxy.locateNS(null) )
{
    // this uses the statically typed proxy class:
    using( PyroProxy something = new PyroProxy(ns.lookup("Your.Pyro.Object")) )
    {
        object result = something.call("pythonmethod", 42, "hello", new int[]{1,2,3});
        string message = (string)result;  // cast to the type that 'pythonmethod' returns
        Console.WriteLine("result message="+message);
        result = something.getattr("remote_attribute");
        Console.WriteLine("remote attribute="+result);
    }
    
    // but you can also use it as a dynamic!
    using( dynamic something = new PyroProxy(ns.lookup("Your.Pyro.Object")) )
    {
        object result = something.pythonmethod(42, "hello", new int[]{1,2,3});
        string message = (string)result;  // cast to the type that 'pythonmethod' returns
        Console.WriteLine("result message="+message);
        result = something.remote_attribute;
        Console.WriteLine("remote attribute="+result);
    }
}

More examples can be found in the examples directory. You could also study the unit tests.

"Where is Pickle?"

Until version 5.0, Pyrolite included a pickle protocol implementation that allowed your Java or .NET code to read and write Python pickle files (pickle is Python's serialization format). From 5.0 onwards, this is no longer included because Pyro5 no longer uses pickle.

If you still want to read or write pickled data, have a look at the now separate pickle library: https://github.com/irmen/pickle

Required dependency: Serpent serializer

The serializer used is 'serpent' (a special serialization protocol that I designed for the Pyro library), so this requires the Razorvine.Serpent assembly (.NET) or the net.razorvine serpent artifact (serpent.jar, Java) to be available.

Serpent is a separate project (also by me), so you'll have to install this dependency yourself. You can find it, along with download instructions, at https://github.com/irmen/Serpent

Dealing with exceptions

Pyrolite also maps Python exceptions that may occur in the remote object. It has a rather simplistic approach:

all exceptions, including the Pyro ones (Pyro5.errors.*), are converted to PyroException objects. PyroException is a normal Java or C# exception type, and it will be thrown as a normal exception in your program. The message string is taken from the original exception. The remote traceback string is available on the PyroException object in the _pyroTraceback field.
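
For example, a failing remote call can be handled like this (a minimal sketch, assuming a proxy like the one from the examples above):

import net.razorvine.pyro.*;

try {
    remoteobject.call("method_that_raises");   // any remote call that fails on the Python side
} catch (PyroException e) {
    System.out.println("error: " + e.getMessage());   // message of the original Python exception
    System.out.println(e._pyroTraceback);             // remote Python traceback as a string
}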

pyrolite's People

Contributors

adamsitnik, airhorns, davies, dependabot[bot], irmen, mengxr, stephentoub, torokati44, viirya


pyrolite's Issues

java: Relax the visibility of `Unpickler.NO_RETURN_VALUE` class constant

Related to issue #49. Currently, I'm working around the private access control like this:

// needs: import java.lang.reflect.Field; and import net.razorvine.pickle.Unpickler;
public static Object noReturnValue(){
	try {
		Field field = Unpickler.class.getDeclaredField("NO_RETURN_VALUE");
		if(!field.isAccessible()){
			field.setAccessible(true);
		}
		return field.get(null);
	} catch(Exception e){
		throw new RuntimeException(e);
	}
}

Compare by value when memoizing

I ran across a use case where this check makes me lose all the benefits of memoization.
I'm pickling a data structure where a couple of fairly long strings are repeated many times, so memoizing them should make the resulting stream a lot shorter.
However, the instances are not considered equal, because they are only equal by value, not by reference. Now this doesn't necessarily matter with Strings, and maybe not for many other classes either.
So in my opinion an option where the == check is removed, or replaced by an .equals call, could be considered. To make the difference concrete, see the sketch below.
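
An illustrative sketch using only the standard library (not Pyrolite's actual memo code): an identity-based memo only gets a hit for the exact same object reference, while a value-based memo would also catch distinct but equal strings.

import java.util.HashMap;
import java.util.IdentityHashMap;
import java.util.Map;

public class MemoDemo {
    public static void main(String[] args) {
        String a = new String("some fairly long repeated string");
        String b = new String("some fairly long repeated string");  // equal by value, different reference

        Map<Object, Integer> identityMemo = new IdentityHashMap<>();  // compares keys with ==
        Map<Object, Integer> valueMemo = new HashMap<>();             // compares keys with .equals()

        identityMemo.put(a, 0);
        valueMemo.put(a, 0);

        System.out.println(identityMemo.containsKey(b));  // false: no memo hit, the string is written again
        System.out.println(valueMemo.containsKey(b));     // true: memo hit, only a reference is written
    }
}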

possible decode_escaped problem with unsupported escape sequences

pickling datetime.datetime(2014, 7, 8, 10) gives a pickle with an S opcode containing a "\n" as an escaped character in the string. decode_escaped seems unable to deal with this.
See issue #19 for details

pickling datetime.datetime(2014, 10, 10, 10, 0) gives even more \n's in the string obviously

The byte values that will cause problems are the ones that are encoded by python using one of the standard escape characters rather than a hexadecimal escape:

9 -> \t
10 -> \n
13 -> \r
See the output of

import pprint; pprint.pprint({i:chr(i) for i in range(33)})

Also, what about decode_unicode_escaped (which is called from the load_unicode opcode)? Seems like it has to be able to deal with the same escaped characters...
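
As an illustration of what the decoder has to accept, here is a standalone sketch (not Pyrolite's actual decode_escaped code) that handles both the standard single-character escapes and \xNN hex escapes:

public class EscapeDecodeDemo {
    // decode a protocol-0 STRING payload containing \n, \r, \t, \\, \' and \xNN escapes
    static String decodeEscaped(String s) {
        StringBuilder sb = new StringBuilder(s.length());
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c != '\\') { sb.append(c); continue; }
            char e = s.charAt(++i);
            switch (e) {
                case 'n': sb.append('\n'); break;
                case 'r': sb.append('\r'); break;
                case 't': sb.append('\t'); break;
                case '\\': sb.append('\\'); break;
                case '\'': sb.append('\''); break;
                case 'x':
                    sb.append((char) Integer.parseInt(s.substring(i + 1, i + 3), 16));
                    i += 2;
                    break;
                default: throw new IllegalArgumentException("unknown escape: \\" + e);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // the datetime(2014, 7, 8, 10) payload from above: mixes \xNN escapes with a literal \n escape
        String payload = "\\x07\\xde\\x07\\x08\\n\\x00\\x00\\x00\\x00\\x00";
        System.out.println(decodeEscaped(payload).length());  // 10 bytes of datetime state
    }
}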

DotNet package doesn't work in latest and release 4.15

Tested out with the following steps:

(1) Server. Python. Used python 3.4 and installed pyro4 using pip

import Pyro4

def main():
    daemon = Pyro4.Daemon()                 # make a Pyro daemon
    uri = daemon.register(CenterController)
    print("Ready. Object uri =", uri)       # print the uri so we can use it in the client later
    daemon.requestLoop()

if __name__ == '__main__':
    main()

(2) Client. C#. The URI string comes from the printout of the Python-based server

public class TestURI
{
    public static void Run()
    {

        string uri = "PYRO:obj_0de46bb35aa84b729b796b9937cfe6fe@localhost:59773";

        PyroURI pyURI = new PyroURI(uri);

        using (PyroProxy p = new PyroProxy(pyURI))
        {
            dynamic result = p.call("getDst");
            System.Console.WriteLine(result.toString());
        }
    }
}

Exception:
unhandled exception: System.InvalidCastException: Unable to cast object of type
'System.Collections.Generic.Dictionary`2[System.Object,System.Object]' to type
'System.Collections.Hashtable'.
   at Razorvine.Pyro.PyroProxy._handshake() in E:\projects\PyroLite\Pyrolite-pyrolite-4.15\dotnet\Pyrolite\Pyro\PyroProxy.cs:line 385
   at Razorvine.Pyro.PyroProxy.connect() in E:\projects\PyroLite\Pyrolite-pyrolite-4.15\dotnet\Pyrolite\Pyro\PyroProxy.cs:line 83
   at Razorvine.Pyro.PyroProxy.internal_call(String method, String actual_objectId, UInt16 flags, Boolean checkMethodName, Object[] parameters) in E:\projects\PyroLite\Pyrolite-pyrolite-4.15\dotnet\Pyrolite\Pyro\PyroProxy.cs:line 236
   at Razorvine.Pyro.PyroProxy.call(String method, Object[] arguments) in E:\projects\PyroLite\Pyrolite-pyrolite-4.15\dotnet\Pyrolite\Pyro\PyroProxy.cs:line 184
   at Pyrolite.TestPyroEcho.TestURI.Run() in E:\projects\PyroLite\Pyrolite-pyrolite-4.15\dotnet\Pyrolite.TestPyroEcho\TestURI.cs:line 23
   at Pyrolite.TestPyroEcho.Program.Main(String[] args) in E:\projects\PyroLite\Pyrolite-pyrolite-4.15\dotnet\Pyrolite.TestPyroEcho\Program.cs:line 29

Fix:
PyroProxy.cs
//var response_dict = (Hashtable)handshake_response;
var response_dict = (System.Collections.Generic.Dictionary<Object, Object>)handshake_response;
//_processMetadata(response_dict["meta"] as Hashtable);
_processMetadata(response_dict["meta"] as System.Collections.Generic.Dictionary<Object, Object>);

Conclusion:
The return values of functions are now returned as generic Dictionary objects, and a generic Dictionary is no longer directly castable to a Hashtable.

Unpickling datetimes with timezones is not supported

At the moment Pyrolite doesn't support unpickling datetime.date or datetime.datetime objects if they have a tzinfo property describing their timezone set. I would like to talk about adding support for this feature, but it's a bit tricky because of the different ways you can give a datetime a timezone.

For a datetime like this guy: datetime.datetime(2014, 7, 8, 10), the pickletools.dis output looks like this:

    0: c    GLOBAL     'datetime datetime'
   19: p    PUT        0
   22: (    MARK
   23: S        STRING     '\x07\xde\x07\x08\n\x00\x00\x00\x00\x00'
   65: p        PUT        1
   68: t        TUPLE      (MARK at 22)
   69: p    PUT        2
   72: R    REDUCE
   73: p    PUT        3
   76: .    STOP
highest protocol among opcodes = 0

but for one like this:

import datetime, pickle, pickletools
from dateutil.tz import tzutc

obj = datetime.datetime(2014, 7, 8, 10, 10, 0, 0, tzutc())
pickletools.dis(pickle.dumps(obj))

the pickle stream looks like this:

    0: c    GLOBAL     'datetime datetime'
   19: p    PUT        0
   22: (    MARK
   23: S        STRING     '\x07\xde\x07\x08\n\n\x00\x00\x00\x00'
   63: p        PUT        1
   66: c        GLOBAL     'copy_reg _reconstructor'
   91: p        PUT        2
   94: (        MARK
   95: c            GLOBAL     'dateutil.tz tzutc'
  114: p            PUT        3
  117: c            GLOBAL     'datetime tzinfo'
  134: p            PUT        4
  137: g            GET        4
  140: (            MARK
  141: t                TUPLE      (MARK at 140)
  142: R            REDUCE
  143: p            PUT        5
  146: t            TUPLE      (MARK at 94)
  147: p        PUT        6
  150: R        REDUCE
  151: p        PUT        7
  154: t        TUPLE      (MARK at 22)
  155: p    PUT        8
  158: R    REDUCE
  159: p    PUT        9
  162: .    STOP

which has that reconstructor call and then the tuple for the tzinfo instance which Pyrolite doesn't expect at all and fails to deserialize. Worse yet are tzinfos from pytz:

from pytz import timezone

amsterdam = timezone('Europe/Amsterdam')
pickletools.dis(pickle.dumps(datetime.datetime(2014, 7, 8, 10, 10, 0, 0, amsterdam)))

gives a pickle stream of

    0: c    GLOBAL     'datetime datetime'
   19: p    PUT        0
   22: (    MARK
   23: S        STRING     '\x07\xde\x07\x08\n\n\x00\x00\x00\x00'
   63: p        PUT        1
   66: c        GLOBAL     'pytz _p'
   75: p        PUT        2
   78: (        MARK
   79: S            STRING     'Europe/Amsterdam'
   99: p            PUT        3
  102: I            INT        1200
  108: I            INT        0
  111: S            STRING     'LMT'
  118: p            PUT        4
  121: t            TUPLE      (MARK at 78)
  122: p        PUT        5
  125: R        REDUCE
  126: p        PUT        6
  129: t        TUPLE      (MARK at 22)
  130: p    PUT        7
  133: R    REDUCE
  134: p    PUT        8
  137: .    STOP
highest protocol among opcodes = 0

which has the pytz._p object in there with a string representing the zoneinfo name and the offset as well. You can also (and sadly I've seen code in the wild do this) use pytz's UTC representation like so:

import pytz

utc = pytz.utc
pickletools.dis(pickle.dumps(datetime.datetime(2014, 7, 8, 10, 10, 0, 0, utc)))
    0: c    GLOBAL     'datetime datetime'
   19: p    PUT        0
   22: (    MARK
   23: S        STRING     '\x07\xde\x07\x08\n\n\x00\x00\x00\x00'
   63: p        PUT        1
   66: c        GLOBAL     'pytz _UTC'
   77: p        PUT        2
   80: (        MARK
   81: t            TUPLE      (MARK at 80)
   82: R        REDUCE
   83: p        PUT        3
   86: t        TUPLE      (MARK at 22)
   87: p    PUT        4
   90: R    REDUCE
   91: p    PUT        5
   94: .    STOP
highest protocol among opcodes = 0

which is another somewhat annoying variation. Finally, I believe one could in a very twisted world give custom, extended objects as the tzinfo property that aren't of the base dateutil.tz.tzinfo class or pytz._p class.

My suggestion/plan is to implement support for the second tuple element, and to hardcode an understanding of those two classes which almost everyone uses into the datetime unpickler, and still explode if someone passes a custom class that isn't registered in the java unpickler. If this approach sounds reasonable, I have some further questions:

  • Is there a better way to unpickle that second tuple element than the current approach where we bit shift out members of the byte string? Can I call out to a different piece of the system to get an object I can then inspect to build a java-side Timezone to add to the Date object?
  • In the first example above I see escaped newlines coming out in the pickle stream and I don't see that in any of the tests, any idea why my system might be doing that?
  • Has anyone requested this functionality before or tried this kind of thing?

Thanks for any pointers or help or feedback you can give me!

I discovered this while debugging Apache Spark, so for posterity here's the Spark JIRA: https://issues.apache.org/jira/browse/SPARK-6411

java9 compile fails due to JAXB not available by default

java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter
at net.razorvine.serpent.Serializer.serialize_calendar(Serializer.java:437)

JAXB is no longer available in JDK 9 by default.
Replace JAXB dependency with something else.
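
Assuming the converter is only used to format the calendar as an ISO-8601 string, a JAXB-free replacement could use the java.time API that ships with the JDK. A sketch of the idea (not the actual serpent code):

import java.time.format.DateTimeFormatter;
import java.util.Calendar;
import java.util.GregorianCalendar;

public class CalendarFormatDemo {
    public static void main(String[] args) {
        Calendar cal = Calendar.getInstance();  // in practice a GregorianCalendar

        // JDK 8+ only, no javax.xml.bind needed
        String iso = DateTimeFormatter.ISO_OFFSET_DATE_TIME
                .format(((GregorianCalendar) cal).toZonedDateTime());
        System.out.println(iso);
    }
}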

Flame and "datetime.datetime" module

The current implementation of Flame fails when the module path has more than one level, for example 'datetime.datetime'.

The error prevents the use of the function 'now'.

class Flame(object):
...
    def module(self, name):
        """import a module on the server given by the module name and returns a proxy to it"""
        if importlib:
            importlib.import_module(name)
        else:
            __import__(name)
        return FlameModule(self, name)

Below is an implementation that can help in this case:

import sys
from functools import reduce  # reduce is a builtin in Python 2

import Pyro4.utils.flame

class Flame(Pyro4.utils.flame.Flame):
...
    def module(self, name):
        packages = name.split('.')

        package = __import__(packages[0])
        modules = [package] + packages[1:]

        sys.modules[name] = reduce(lambda a, b: getattr(a, b, a), modules)

        return Pyro4.utils.flame.FlameModule(self, name)

add pickling support for java 8's java.time API

Pickler only supports the old style time/calendar objects.
We probably need to support java 8's java.time API as well.

An interesting issue arises for unpickling: to what objects should we unpickle?
Make it configurable? Autoselect based on JDK version?

Also how do we ensure that the class file still works on JDK 1.7?
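
One possible shape for the pickling half (a sketch only, relying on the existing registerCustomPickler extension point and on Pickler.save being callable from a custom pickler; this is not the implemented solution) would be to convert the java.time value to a Calendar and reuse the existing calendar pickling:

import java.io.IOException;
import java.io.OutputStream;
import java.time.ZonedDateTime;
import java.util.GregorianCalendar;
import net.razorvine.pickle.IObjectPickler;
import net.razorvine.pickle.PickleException;
import net.razorvine.pickle.Pickler;

// sketch: pickle java.time.ZonedDateTime by converting it to a GregorianCalendar
// and letting the existing Calendar pickling logic handle it
class ZonedDateTimePickler implements IObjectPickler {
    public void pickle(Object o, OutputStream out, Pickler currentPickler)
            throws PickleException, IOException {
        ZonedDateTime zdt = (ZonedDateTime) o;
        currentPickler.save(GregorianCalendar.from(zdt));
    }
}

// registration, somewhere during startup:
// Pickler.registerCustomPickler(ZonedDateTime.class, new ZonedDateTimePickler());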

Strong-name sign the C# assembly?

Would it be possible for you to strong-name sign the managed assembly? That's necessary for other strong-named assemblies to be able to reference it.

Asynchronous calls

Hello there,

I'm getting PyroException("result msg out of sync") when calling a normal method asynchronously from Java. Is there any way to bypass the sequenceNr checking, or is this required for mapping each "result msg" to the call that produced it?

Thanks for the great work! :)

java: Class name initialization and retrieval

Is there a specific reason why the constructor ClassDict(String, String) does not initialize the __class__ attribute value? I have a situation where I need to invoke the method ClassDict#__setstate__(Map) with an empty HashMap instance to have a newly constructed ClassDict instance properly initialized.

Also, would it be possible to expose the field ClassDict#classname via a getter method? This method could specify protected access modifier if you don't want to clutter the public API.
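
Until such a getter exists, the classname can be pried out with the same kind of reflection workaround as in the NO_RETURN_VALUE issue above (a sketch, assuming the private field is indeed called classname as mentioned here):

import java.lang.reflect.Field;
import net.razorvine.pickle.objects.ClassDict;

public class ClassDictNameHelper {
    // reflection workaround to read ClassDict's private 'classname' field
    public static String getClassName(ClassDict dict) {
        try {
            Field field = ClassDict.class.getDeclaredField("classname");
            field.setAccessible(true);
            return (String) field.get(dict);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}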

HashSet accidentally converted to a list

import com.google.common.collect.{Maps, Sets}
import net.razorvine.pickle.{Unpickler, Pickler}

object PickleTest extends App {

  val set1 = Sets.newHashSet("value1", "value2")
  val map1 = Maps.newHashMap[String, Any]
  map1.put("pkey", 1)
  map1.put("s", set1)

  val set2 = Sets.newHashSet("value1", "value2")
  val map2 = Maps.newHashMap[String, Any]
  map2.put("pkey", 2)
  map2.put("s", set2)


  val obj = Array(map1, map2)
  val pickler = new Pickler
  val array = pickler.dumps(obj)

  val unpickler = new Unpickler
  val result = unpickler.loads(array).asInstanceOf[Array[_]]
  println(result(0).asInstanceOf[java.util.HashMap[String, Any]].get("s").getClass)   // HashSet  (ok)
  println(result(1).asInstanceOf[java.util.HashMap[String, Any]].get("s").getClass)   // ArrayList (wrong!)
}

java: ByteArrayConstructor should graciously handle existing byte arrays

The method ByteArrayConstructor#construct(Object[]) fails with a ClassCastException if the argument is already a byte array. This situation should be handled by introducing an instanceof type check:

if(args.length == 1){
  if(args[0] instanceof byte[]){
    return args[0];
  }
  // Proceed as usual
}

I've come across pickled xgboost.sklearn.XGBRegressor object instances (using Python 3.4) where the builtins.bytearray is already a byte array. For example, the following commit contains such joblib dump file XGBAuto.pkl: jpmml/jpmml-sklearn@bc022a3

Unpickle array from Python 2.6

In Python 2.6, array is pickled differently than in 2.7, and Pyrolite fails to unpickle it.

We found this issue when testing PySpark with Python 2.6:

Reconstructor looking for method "reconstruct" in an IObjectConstructor(ClassDictConstructor), which only specifies method "construct"

My apologies if this is the wrong place to report this; I've just started experimenting with this and it seems like a bug, but it might be my own user error. I'm having trouble unpickling (protocol 0, though I also have problems with 2) a fairly simple custom Python class into Java.

To get right to the point: load_reduce pops an IObjectConstructor from the pickle stack, which ends up being a Reconstructor instance, and then calls construct on it. In the end, however, this particular construct looks for a reconstruct method on the reduce argument, which doesn't exist in ClassDictConstructor. In fact, the only class anywhere that implements a reconstruct seems to be TimeZoneConstructor. This of course results in a NoSuchMethod exception.

I think I may be missing something, but I don't know what. It seems like ClassDictConstructor is supposed to have a reconstruct method, but it doesn't. I was under the impression from the documentation that instances of custom Python classes would simply be unpickled into a Map of the data members.

Here's the pickle text up to the REDUCE where it dies:

(dp0
I0
ccopy_reg
_reconstructor
p1
(cgmGroup2
TreeNodeData
p2
c__builtin__
object
p3
Ntp4
Rp5

Here's the Python class in question:

class TreeNodeData (object):
    def __init__ (self, xpos, *children):           
        self.children = children                    
        self.parent = None                          
        self.size = 1                               
        self.position = (xpos, None)                
        return                                      

And here's the debug view after I tracked down the problem (screenshot attached to the original issue).

Run python script using Pyrolite

Hi,

I have a simple Python script on my local machine which returns a string. I want to run this script from a Java application and get the return value. I'm trying to do this using Pyrolite. I downloaded the jar files and added them to my Java classpath, but I'm not able to run the script.

I got the below sample code from readme.txt

NameServerProxy ns = NameServerProxy.locateNS(null);
PyroProxy remoteobject = new PyroProxy(ns.lookup("Your.Pyro.Object"));
Object result = remoteobject.call("pythonmethod", 42, "hello", new int[]{1,2,3});
String message = (String)result; // cast to the type that 'pythonmethod' returns
System.out.println("result message="+message);
remoteobject.close();
ns.close();

But this is not working for me. My system configuration is

OS: Windows 8
JDK: jdk1.7.0_51
Python: 2.6

Please help me with this.

Remove HMAC_KEY config item

Remove global HMAC_KEY config item in favor of per-proxy hmac key setting.
Like the change made in Pyro4 for 4.29.

java: cal+tz pickle results in wrong tz offset in python after unpickling

pickling a Calendar with a TimeZone results in a pickle that contains an embedded pytz._p constructor call. Unfortunately it seems that the tz offsets encoded in it are incorrect: the resulting datetime object on the Python side (after unpickling) has a wrong tz offset.

I think this is because the pickle code encodes tz.getRawOffset(), which is wrong when DST is in effect.

Because tz's are pickled without context, I don't see how we can fix this :-( We'd need access to the calendar's date and time to know whether DST is in effect.

Example code:

        Calendar cal = Calendar.getInstance();
        cal.set(Calendar.YEAR, 2015);
        cal.set(Calendar.MONTH, 4);
        cal.set(Calendar.DAY_OF_MONTH, 18);
        cal.set(Calendar.HOUR_OF_DAY, 23);
        cal.set(Calendar.MINUTE, 59);
        cal.set(Calendar.SECOND, 59);
        cal.set(Calendar.MILLISECOND, 0);
        cal.setTimeZone(TimeZone.getTimeZone("Europe/Amsterdam"));

after pickling that cal, unpickling it with Python (2.7) results in a datetime like this:

2015-05-18T23:59:59+01:00
datetime.datetime(2015, 5, 18, 23, 59, 59, tzinfo=<DstTzInfo 'Europe/Amsterdam' CEST+1:00:00 DST>)

whose tz offset is wrong by 1 hour (should be +02:00)

(btw for UTC timezones, the offset is 0, and there is no problem)
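
For what it's worth, the JDK can produce the DST-aware offset for a specific instant; a small standalone sketch contrasting it with getRawOffset (the value that, per the above, ends up in the pickle):

import java.util.Calendar;
import java.util.TimeZone;

public class TzOffsetDemo {
    public static void main(String[] args) {
        TimeZone tz = TimeZone.getTimeZone("Europe/Amsterdam");
        Calendar cal = Calendar.getInstance(tz);
        cal.set(2015, Calendar.MAY, 18, 23, 59, 59);

        System.out.println(tz.getRawOffset() / 3600000.0);                    // 1.0, ignores DST
        System.out.println(tz.getOffset(cal.getTimeInMillis()) / 3600000.0);  // 2.0, DST-aware for that instant
    }
}

As noted above, though, the pickler would still need the calendar's date and time at hand when writing the timezone, which is exactly the missing context.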

Registering custom picklers for a tree of classes

Hi, I like working with the pickle part of Pyrolite for Apache Spark / Cassandra / Python integration. However, as I'm using / touching Scala / Cassandra code, I need to add some custom picklers and constructors. That's obvious of course, but things would be a lot easier, and above all less error prone, if I could register a pickler for an entire tree of classes at once by registering it for a parent class / interface.

To illustrate the issue: in order to support pickling Scala maps, it's easy to write a single simple implementation of net.razorvine.pickle.IObjectPickler. However it has to be registered to every possible specialization of scala.collection.Map. In practice that means registering it for scala.collection.immutable.Map.Map1 and scala.collection.immutable.Map.Map2, etc.

Another example is java.nio.ByteBuffer. The actual objects you'd probably see flying around are of the type java.nio.HeapByteBuffer, which is a package-protected class. So you'd have to register the pickler with something like:

Pickler.registerCustomPickler(Class.forName("java.nio.HeapByteBuffer"), ByteBufferPickler)

while

Pickler.registerCustomPickler(classOf[ByteBuffer], ByteBufferPickler)

should capture just any specialization of ByteBuffer.

Cheers!
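
Until registration by base class or interface is supported, a small helper can at least cut down on the repetition; a sketch using only the existing registerCustomPickler call (ByteBufferPickler is the hypothetical pickler from the Scala example above, and the class names are illustrative):

import net.razorvine.pickle.IObjectPickler;
import net.razorvine.pickle.Pickler;

public class PicklerRegistration {
    // register the same pickler for a list of concrete classes
    public static void registerForAll(IObjectPickler pickler, String... classNames) throws ClassNotFoundException {
        for (String name : classNames) {
            Pickler.registerCustomPickler(Class.forName(name), pickler);
        }
    }
}

// usage, e.g. for the ByteBuffer case above:
// PicklerRegistration.registerForAll(new ByteBufferPickler(),
//         "java.nio.HeapByteBuffer", "java.nio.DirectByteBuffer");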

java 9 test fails

testSTOP(net.razorvine.pickle.test.UnpickleOpcodesTest) Time elapsed: 0.006 sec <<< ERROR!
java.lang.Exception: Unexpected exception, expected<java.lang.ArrayIndexOutOfBoundsException> but was<java.lang.IndexOutOfBoundsException>

implement pyro 4.22's new wire protocol v46

Pyro 4.22 got a new wire protocol, v46. Implement this in Pyrolite. It is NOT backwards compatible.

(note: this can be done separately from the multi-serializer support)

Java pickle of `java.sql.Date` misses offset for Calendar zero-based months

Hi,

The offset for java.util.Calendar's zero-based months is missing when pickling java.sql.Date objects; this results in the month being one less when pickling a java.sql.Date for Python. This is because the Date object is converted to a Calendar object, which uses zero-based months.

Location: Pickler.java#489
Note that this is already correct on line #430 for the Calendar putter.

Steps to reproduce:

  1. Java: Create a java.sql.Date object with for example date 2018-05-06
  2. Pickle and save/send
  3. Unpickle in python
  4. Print the date object, it will output 2018-04-06

I will create a pull request which contains the fix.

Thanks :)
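
For reference, the zero-based month convention that causes the off-by-one can be seen with a couple of lines of plain JDK code (illustrative only, this is not the Pickler code itself):

import java.util.Calendar;

public class ZeroBasedMonthDemo {
    public static void main(String[] args) {
        Calendar cal = Calendar.getInstance();
        cal.setTime(java.sql.Date.valueOf("2018-05-06"));
        // Calendar.MONTH is zero-based: May is reported as 4,
        // so writing it out unadjusted ends up as April on the Python side
        System.out.println(cal.get(Calendar.MONTH));  // 4
    }
}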

JSON serializer with a custom Lua client

I am developing a client in the Lua 3.2 programming language.
Since it will be used in a legacy project, the Lua version has to be 3.2.

By defining a simple API like this

class Math(object):
    def sum(self, a, b):
        return a + b

When calling the API from the Lua client:

local px = proxy('PYRO:obj_5d871322110a4062b25d7a21bc34ae8e@localhost:59249')

px:start_connection()

local a, b = 2, 3
local result = px.sum(2, 3)

if type(result) == 'number' then
    println(format("%d + %d = %d", a, b, result))
else
    tprint(result)
end

Using the json serializer, I have to pass a 'kwargs' parameter to the remote function; otherwise I get an error:

exceptions.KeyError
...
self.loadsCall(data)
4: File "C:\Python27\lib\site-packages\Pyro4\util.py", line 494, in loadsCall kwargs = self.recreate_classes(data["kwargs"])
5: KeyError: 'kwargs'
__exception__: true
args: {
n: 1
1: kwargs
}

When calling the api with a dummy kwargs parameter:
https://github.com/alexsilva/Pyrolite/blob/lua-dev/lua32/core.lua#L55

local data = serializer:dumps({
        object= self.uri.objectid,
        method = methodname,
        params = args,
        kwargs = { x =1 }
    })

I get a new error:

__exception__: true
args: {
n: 1
1: sum() got an unexpected keyword argument 'x'
}

You can see that there is a problem in the json serializer here:
https://github.com/irmen/Pyro4/blob/master/src/Pyro4/util.py#L494

kwargs = self.recreate_classes(data["kwargs"])

This could instead be written as:

if "kwargs" in data:
    kwargs = self.recreate_classes(data["kwargs"])
else:
    kwargs = {}

Because even without declaring 'kwargs' in the function, this is valid Python:

>>> def x():
...     pass
...     
>>> d = {}
>>> x(**d)

java: pickling calendar results in invalid pickle for Python 3.x

Commit 9448bce broke the Calendar pickling to datetime objects.

Pickling this calendar object:

    Calendar cal = Calendar.getInstance();
    cal.set(Calendar.YEAR, 2015);
    cal.set(Calendar.MONTH, 4);
    cal.set(Calendar.DAY_OF_MONTH, 18);
    cal.set(Calendar.HOUR, 23);
    cal.set(Calendar.MINUTE, 59);
    cal.set(Calendar.SECOND, 59);
    cal.set(Calendar.MILLISECOND, 0);
    cal.setTimeZone(TimeZone.getTimeZone("Europe/Amsterdam"));

results in a pickle that makes python 3.x crash when unpickling:
UnicodeDecodeError: 'ascii' codec can't decode byte 0xdf in position 1: ordinal not in range(128)

(works ok in python 2.x)

cause: pickle contains SHORT_BINSTRING opcode which fails decoding in Python3. Python 3 pickles it with a SHORT_BINBYTES opcode instead.

(note: the SHORT_BINSTRING opcode is also used when pickling array.array, but it only encodes the typecode character there and it seems to work fine when unpickling in python3)

java: Unpickler should handle `Opcodes.OBJ`

Came across this issue when unpickling "callable class" instances.

By studying the Python 2.7 and Python 3.4 pickle.py files, I've managed to provide a temporary handler like this:

// needs: java.io.IOException, java.util.List and the net.razorvine.pickle classes;
// noReturnValue() is the reflection helper from the NO_RETURN_VALUE issue above
Unpickler unpickler = new Unpickler(){

	@Override
	protected Object dispatch(short key) throws IOException {
		switch(key){
			case Opcodes.OBJ:
				return load_obj();
			default:
				return super.dispatch(key);
		}
	}

	private Object load_obj(){
		List<Object> args = super.stack.pop_all_since_marker();
		IObjectConstructor constructor = (IObjectConstructor)args.get(0);
		args = args.subList(1, args.size());
		Object object = constructor.construct(args.toArray());
		super.stack.add(object);

		return noReturnValue();
	}
};

Support pickle/unpickle for numpy types

Hello,

I am a PySpark user. PySpark currently uses Pyrolite to pickle/unpickle data in the JVM and send it to/from Python. Here is where it gets used:

https://github.com/apache/spark/blob/6a5a7254dc37952505989e9e580a14543adb730c/sql/core/src/main/scala/org/apache/spark/sql/execution/python/BatchEvalPythonExec.scala#L25

It is a known issue that this doesn't work with numpy types in Python; that is, if the user tries to send a numpy type to the JVM, it throws an exception inside:

https://github.com/irmen/Pyrolite/blob/master/java/src/main/java/net/razorvine/pickle/Unpickler.java#L154

The exception is something like:

Caused by: net.razorvine.pickle.PickleException: expected zero arguments for construction of ClassDict (for numpy.dtype)
	at net.razorvine.pickle.objects.ClassDictConstructor.construct(ClassDictConstructor.java:23)
	at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:707)
	at net.razorvine.pickle.Unpickler.dispatch(Unpickler.java:175)
	at net.razorvine.pickle.Unpickler.load(Unpickler.java:99)
	at net.razorvine.pickle.Unpickler.loads(Unpickler.java:112)
	at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1$$anonfun$apply$5.apply(BatchEvalPythonExec.scala:137)
	at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1$$anonfun$apply$5.apply(BatchEvalPythonExec.scala:136)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:396)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:369)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:370)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:246)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:240)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:803)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:803)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
	at org.apache.spark.scheduler.Task.run(Task.scala:86)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	... 1 more

I am wondering what your thoughts are on supporting numpy types?
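
For context, the extension point such a mapping would presumably hook into already exists: a custom IObjectConstructor can be registered for a given module/class global. A heavily simplified sketch of the mechanism only (not a real numpy.dtype implementation, which is considerably more involved):

import net.razorvine.pickle.IObjectConstructor;
import net.razorvine.pickle.PickleException;
import net.razorvine.pickle.Unpickler;

// sketch: route the 'numpy.dtype' global to a custom constructor instead of the
// default ClassDictConstructor, which throws on the non-empty argument list
class DtypeConstructor implements IObjectConstructor {
    public Object construct(Object[] args) throws PickleException {
        // illustrative only: keep just the first argument (the dtype name string)
        return args.length > 0 ? args[0] : null;
    }
}

// registration, e.g. at startup:
// Unpickler.registerConstructor("numpy", "dtype", new DtypeConstructor());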

add metadata stuff

Pyro 4.40 has new metadata support in the name server.
Add support for this to Pyrolite.

pyrolite and Android: VerifyError

Hello bro, I think your project is very useful.

I was trying to build an interface with your library. It works fine in plain Java, but when I tried it on Android it fails =S.

I get this log:

03-05 22:19:05.039: E/AndroidRuntime(9558): java.lang.VerifyError: net/razorvine/pickle/Pickler

I hope you can help me.

Regards, best!

fix serializer used in handshake

The handshake always assumes the serializer configured in Config.
This is not correct; it should look at the serializer_id in the response message.
