speckleworks / SpeckleCore

Check a brand new Speckle at: https://github.com/specklesystems
Home Page: https://speckle.systems
License: MIT License
Step 0:
Clients, when deserialised, should use the current user's API token. Currently they use the creator's token, since they get serialised with it. The fix will propagate through SpeckleRhino and co., but ideally changes on the dev side will be minimal.
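SpeckleCore itself is C#, but the intended behaviour can be sketched in a few lines of Python (the payload shape and the apiToken field name are assumptions for illustration only):

```python
import json

def deserialise_client(payload: str, current_user_token: str) -> dict:
    """Rebuild a client from its serialised form, discarding the
    creator's API token in favour of the current user's."""
    client = json.loads(payload)
    # The serialised blob still carries the creator's token; never reuse it.
    client["apiToken"] = current_user_token
    return client
```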
The server returns a list of objectIds, but the response field is (confusingly) named Objects.
The Rhino and Grasshopper clients have various update methods that have been tailored to their environment. TODOs:
This is a big, non-urgent issue.
I did some more work on the DSConverter; this is what is still open, and some of it may be SpeckleCore issues. I only ran a quick test, so it has not yet been verified that the geometry matches the input 100%.
Type | Geometry | UserData |
---|---|---|
Point | ok | ok |
Vector | ok | No UserData possible (SetUserData node has error) |
Line | Conversion not called (uses SpecklePolyline class) | ?? |
Rectangle (closed) | Conversion not called (uses SpecklePolyline class); has duplicated start + end point | ?? |
Polyline (open) | ok | ok |
Polyline (closed) | Has duplicated start + end point | ok |
Circle (closed) | Conversion not called (uses SpeckleArc class) | ?? |
Ellipse (closed) | ok | ok |
Arc Circular (open) | ok | ok |
Arc Elliptical (open) | Unsupported? | ?? |
Nurbs (open) | Conversion does not work (Nurbs input unsuitable) | ?? |
Nurbs (closed) | Conversion does not work (Nurbs input unsuitable) | ?? |
Polycurve (open) | (ToDo) | (ToDo) |
Polycurve (closed) | (ToDo) | (ToDo) |
Plane | ok | No UserData possible (SetUserData node has error) |
Mesh | ok | ok |
Nurbs issues

ERROR: Unable to create BSplineCurve from vertices: B-spline knot sequence is decreasing (should be non-decreasing)

NOTE:
- Degree: should be greater than 1 (piecewise-linear spline) and less than 26 (the maximum B-spline basis degree supported by ASM).
- Weights: all weight values (if supplied) should be strictly positive. Weights smaller than 1e-11 will be rejected and the function will fail.
- Knots: the knot vector should be a non-decreasing sequence. Interior knot multiplicity should be no larger than degree + 1 at the start/end knot and degree at an internal knot (this allows curves with G1 discontinuities to be represented). Non-clamped knot vectors are supported, but will be converted to clamped ones, with the corresponding changes applied to the control point/weight data.
- Knot array: the array size must be num_control_points + degree + 1.
What do we change in SpeckleCore, and what needs to be adjusted elsewhere? Essentially, the serialiser freaks out and does not serialise them. Whatever works, as long as we get rid of the mess I made.
Step 0:
Clients should set the accept-encoding: gzip header and properly decompress responses.
Currently the server returns uncompressed data. 🤕
All projects dependent on SpeckleCore should update once this is fixed.
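As a rough sketch of the client-side half (Python purely for illustration; the real clients are C#), the decompression step is trivial once the Content-Encoding response header is checked:

```python
import gzip

def decompress_if_gzipped(body: bytes, content_encoding: str) -> bytes:
    """Clients that send 'Accept-Encoding: gzip' must be prepared to
    inflate the response body when the server honours the header."""
    if content_encoding.lower() == "gzip":
        return gzip.decompress(body)
    # Server sent plain data (the current behaviour); pass it through.
    return body
```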
The SpeckleCore converter should be the single point of contact for conversions; application-specific converters should just add extension methods to their types, which the core converter will look for.
Flow as it is now:
Object -> AppConverter (ie, rhinoconverter) -> checks type & converts | tries to convert to abstract -> converted object
Flow as it can be:
Object -> SpeckleCore.Converter.Convert -> checks for extension method "ToSpeckle" -> if yes, returns the result of that, if not converts to abstract
Simpler, clearer, and fewer switches! Or are we just moving the switching problem around? 🤔
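The proposed flow could be sketched like this (a Python illustration of the dispatch idea; the method and field names are hypothetical, and the real implementation would be C# reflection):

```python
def to_abstract(obj):
    """Placeholder for the generic reflection-based fallback conversion."""
    return {"type": type(obj).__name__, "fields": vars(obj)}

def convert(obj):
    """Single entry point: prefer a type-specific ToSpeckle method,
    fall back to the generic abstract conversion otherwise."""
    to_speckle = getattr(obj, "ToSpeckle", None)
    if callable(to_speckle):
        return to_speckle()
    return to_abstract(obj)
```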
Add the project number, please; it is often used together with the name.
This should not affect deployment usability like #7, so it's a go.
Where x = 10mb deflated, or x is grabbed from the root api endpoint. These errors should nicely bubble up into clients.
For example, the Rhino radial dimension has public properties/fields that nevertheless cannot be set due to Rhino's restrictions, throwing an ominous DocumentCollectedException: "This object cannot be modified because it is controlled by a document."
Fix (in progress): wrap all property/field setters in try { } catch { } and mind the implications for references (some might be null!).
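The fix amounts to a defensive copy loop; here is a Python sketch of the idea (the real converters are C#, with try { } catch { } around each setter):

```python
def apply_properties(target, props: dict) -> list:
    """Copy each property onto the target, skipping any that the host
    application refuses to set (e.g. document-controlled objects).
    Returns the names of the properties that could not be set."""
    failed = []
    for name, value in props.items():
        try:
            setattr(target, name, value)
        except Exception:
            # Host rejected the assignment; keep the existing value.
            failed.append(name)
    return failed
```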
Remove the API calls for stream layers & stream name; they should be replaced with a stream patch.
Step 0:
Getting all streams for a user should be GET <server>/streams.
This fails with a 400 response ("Something went wrong.").
Adding ?omit=objects makes it work; something goes wrong when the server tries to include the objects in the stream list.
Core, server.
For now, adding ?omit=objects makes it work.
Step 0:
We probably need an interface for the interop class that applications using SpeckleView can rely on:
More things to come...
byObject, byLayer?
For streams, objects, clients, users, comments
This would allow objects to be deserialised to native types even if the assembly name differs, as may be the case when producing libraries for various applications that don't share a core lib.
The number of points does not tell you whether the last point is supposed to be connected to the first one. Add a boolean parameter IsClosed, or something similar?
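A sketch of what the suggested flag could look like (Python for illustration; the class shape and names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Polyline:
    # Flat coordinate list [x0, y0, z0, x1, ...]; the count alone cannot
    # tell you whether the loop closes back on the first point.
    points: list = field(default_factory=list)
    is_closed: bool = False

    def segment_count(self) -> int:
        n = len(self.points) // 3
        # A closed polyline has one extra segment joining last to first.
        return n if self.is_closed else max(n - 1, 0)
```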
Concerning the future SpeckleRhino interface, I noticed that there is no code to handle ToNurbsCurve() conversions (in order to retrieve the UserDictionary), as is done for the Grasshopper component here:
https://github.com/speckleworks/SpeckleUserDataUtils/blob/dev/SetUserDataComponent.cs#L67-L74
Therefore I propose adding the same conversion here (I am not sure if this is consistent):
Step 0:
A SpeckleAbstract hash should be unique based on its properties, type, and assembly.
Currently the hash is based only on the property dictionary. This causes collisions when different objects with the same property dictionary are sent.
SpeckleCore
Add the assembly and type to the hashes of SpeckleAbstracts.
Patch
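The patched hash could look something like this sketch (Python and MD5 chosen purely for illustration; the actual hashing scheme in SpeckleCore may differ):

```python
import hashlib
import json

def abstract_hash(assembly: str, type_name: str, properties: dict) -> str:
    """Hash the assembly and type alongside the property dictionary, so
    two different types with identical properties no longer collide."""
    payload = json.dumps(
        {"assembly": assembly, "type": type_name, "properties": properties},
        sort_keys=True,  # stable ordering => stable hash
    )
    return hashlib.md5(payload.encode()).hexdigest()
```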
I noticed that the Add User component exposes full email addresses.
I know this is nasty stuff, but given that users don't have to accept any T&C to use the service, and that a new regulation on data protection will soon take effect in the EU, I think we should be careful with user data.
To make things snappier, and as a nice fallback for offline use. It would be nice to have a 'make available offline' sort of functionality.
Some notes here about what I was suggesting for StreamAdd.
StreamAdd would be useful in the scenario where the user adds new objects and new layers to an existing stream (containing objects & layers); among those, some objects or layers might already exist in the stream.
The steps would be:
1 - Run through all the objects the user intends to add and dismiss those whose Guid matches an existing one in the DB.
2 - Run through all the layers the user intends to add and dismiss those whose Guid matches an existing one in the DB.
3 - Finally, it is up to the user to organise their objects and layers so they match startIndex and objectCount before calling StreamAdd. However, StreamAdd should be aware of the number of objects already in the stream, so the startIndices the user computed over the new object list (say 0-2, 3-8, 9-13) would be shifted: if 8 objects already exist, they become 8-11, 12-17, etc.
Let's hope that this way of operating is not too expensive with a large number of objects.
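The steps above might be sketched as follows (Python, illustrative only; field names like guid and startIndex are assumptions about the payload shape):

```python
def stream_add(existing_count: int, existing_guids: set,
               new_objects: list, new_layers: list):
    """Drop objects already in the stream by guid, then shift each new
    layer's startIndex by the number of objects already stored."""
    # Step 1: dismiss objects whose guid already exists in the DB.
    objects = [o for o in new_objects if o["guid"] not in existing_guids]
    # Step 3: shift the user-computed start indices past existing objects.
    layers = []
    for layer in new_layers:
        shifted = dict(layer)
        shifted["startIndex"] = layer["startIndex"] + existing_count
        layers.append(shifted)
    return objects, layers
```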
The SpeckleAbstract type has a few cool ideas in it, but after the initial frolicking we will defer to several sets of good practice, namely:
[Serializable]
[NonSerialized]
The docs will also need to be updated...
Keep in mind that custom attributes could also be used, along the lines of [SpeckleSafe] to mark safe-for-speckle classes and [SpeckleSkip] to mark fields/properties that should not be serialised.
Step 0:
Deserialising a SpeckleAbstract object which has a field or property of type List<int> should properly populate that property.
Currently it returns the object with an empty list. This is because all numbers are received as type long, which results in an error when adding them to a List<int>.
SpeckleCore
Convert each value explicitly to the element type of the target property/field list before adding it.
Patch
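The explicit conversion amounts to coercing each incoming value to the list's element type, sketched here in Python (the real fix would live in SpeckleCore's C# reflection code):

```python
def populate_typed_list(values, element_type):
    """Deserialisation hands every number back as the widest type
    (long in .NET); coerce each value to the target element type
    before it goes into the typed list."""
    return [element_type(v) for v in values]
```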
Figure out AppVeyor NuGet plumbing.
Because we're no longer checking for the assembly in ToAbstract and only in Serialise, objects that have ToSpeckle methods do not get them called.
Essentially, Serialise and ToAbstract should be combined, keeping the initial checks from Serialise. TODO: check the deserialisation routes.
ref speckleworks/SpeckleGrasshopper#7
Migrating this issue; it was filed in the wrong repo.
Step 0:
SpeckleObject should have a Duplicate() method.
It doesn't.
Projects where you want to duplicate a SpeckleObject.
Try to use SpeckleObject.Duplicate().
Implement Duplicate() for SpeckleObjects.
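A Duplicate() implementation could be as simple as a deep copy; here is a Python sketch (the real class is C#, where a serialise/deserialise round trip would be another option):

```python
import copy

class SpeckleObject:
    def __init__(self, **properties):
        self.properties = properties

    def duplicate(self):
        """Return an independent deep copy, so mutating the duplicate's
        properties never leaks back into the original."""
        return copy.deepcopy(self)
```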
It should probably throw an exception when no internet is available or the API cannot be reached. If the call below fails, there is no catch to handle it:
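A minimal guard, sketched in Python (the actual client is C#; the function and exception names are illustrative):

```python
def get_api_response(fetch):
    """Wrap the network call so connectivity problems surface as a
    clear exception instead of an unhandled crash."""
    try:
        return fetch()
    except OSError as err:
        # No internet, DNS failure, refused connection, etc.
        raise ConnectionError(f"Speckle server unreachable: {err}") from err
```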
Things like Speckle.RhinoConverter and its Speckle.RevitConverter cousin should implement the same ISpeckleConverter interface. This came out of issue 4 on the SpeckleRevit repo, where I realised there wasn't a base class or interface defining what these converters should implement as a minimum.
The same goes for the clients themselves: for example, the Rhino client defines an interface for itself in the client repo (the filename doesn't match its contents, btw @didimitrie @fraguada), but it doesn't look like Speckle.Core defines or has any opinions on what a client should implement.
Provisional list of interfaces required :
ISpeckleConverter
ISpeckleClient
ISpeckleSender & ISpeckleGetter ?
I might be off base here; keen to hear what others think.
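For the sake of discussion, the minimum converter surface might look like this (a Python abstract-base-class sketch; the real interface would of course be C#, and the method names are only suggestions):

```python
from abc import ABC, abstractmethod

class ISpeckleConverter(ABC):
    """Minimum surface any app-specific converter would share."""

    @abstractmethod
    def to_speckle(self, native_object):
        """Convert a host-application object to its Speckle form."""

    @abstractmethod
    def to_native(self, speckle_object):
        """Convert a Speckle object back to the host application's type."""
```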
Step 0:
[ ] Introduce "Name" as a standard property on SpeckleObject
[ ] Rename existing property "Name" on SpeckleBlock to "BlockName"
Step 0:
Users should be able to register.
Registration throws an error from the SpecklePopup.
SpeckleGrasshopper, SpeckleDynamo
Step 0:
Speckle objects should have a transform property, as defined by the specs.
They don't.
Contained to SpeckleCore, as the property isn't used anywhere else yet.
https://github.com/speckleworks/SpeckleCore/blob/master/SpeckleCore/ModelObjects.cs#L122-L151
This is a tricky one:
On the server side, compression should be enabled at the webserver/proxy level (nginx). As such, if implementation details vary (someone does not go through the extra steps of enabling compression, or does not have a proxy), the client will fail.
Can we please get back the SpeckleLine class instead of using SpecklePolyline with 2 points?
Thankssss
Step 0:
The API for converting between Speckle and native types currently hides a dependency on extension methods named ToNative and ToSpeckle that must live in an assembly with Speckle and Converter in its name. Reading the code or documentation is the only way to discover this requirement, and it doesn't offer much flexibility in choosing how converters are consumed by the Speckle API.
Any projects where the current converter is used.
Looking through the repo, this affects:
Just an idea on how to solve this:
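One possibility is an explicit registry instead of name-based assembly scanning; here is a Python sketch of the idea (all names hypothetical, and the real version would be C#):

```python
class ConverterRegistry:
    """Explicit registration instead of scanning assemblies by name:
    callers tell the core which converter handles which native type."""

    def __init__(self):
        self._to_speckle = {}

    def register(self, native_type, converter):
        self._to_speckle[native_type] = converter

    def convert(self, obj):
        converter = self._to_speckle.get(type(obj))
        if converter is None:
            raise LookupError(f"No converter registered for {type(obj).__name__}")
        return converter(obj)
```

This makes the dependency visible at the call site, at the cost of a registration step per host application.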
Step 0:
In progress. Every Speckle object will have a .Scale( double factor ) method that will make unit conversions easier at a pre-geometry level. This will not work for breps, though.
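For a point, the proposed method boils down to multiplying every stored coordinate; a Python sketch (the class shape here is an assumption for illustration):

```python
class SpecklePoint:
    def __init__(self, x: float, y: float, z: float):
        self.value = [x, y, z]

    def scale(self, factor: float):
        """Mirror the proposed .Scale(double factor): multiply every
        coordinate by the unit-conversion factor, in place."""
        self.value = [v * factor for v in self.value]
        return self
```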
This results in no returns, as the server fails to parse them properly.
Wouldn't it make sense to use SpecklePoints instead of double[] Points { get; set; } for curves etc.? The name "Points" might be misleading, as it actually holds point coordinates, not SpecklePoints.