Comments (8)
It's probably an issue with the `save` and `load` methods, as I haven't tested this too extensively (and ran into issues specific to pytest). I'll take a look.
from mat_discover.
Note that some of the attributes are assigned during `predict`, and in the current implementation, mat_discover can't be used on new data due to the use of "precomputed" distance matrices calculated during `predict`. This hasn't been a major hindrance for me since it runs fairly quickly on a GPU, and even on a CPU for moderate-sized datasets (e.g. 10k points). This is on my to-do list, so if it's a major setback for you I can take another stab at it sooner.
Did you run `disc.predict` before running into this error? I probably need to add a useful warning for this in most of the subfunctions like the ones you called. If that's not the issue, then that changes things.
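The kind of guard described above could look something like the following minimal sketch. The class and attribute names (`Discover`, `dens_score`, `group_cross_val`) are illustrative assumptions, not mat_discover's actual API:

```python
class Discover:
    """Illustrative stand-in -- not mat_discover's actual class."""

    def __init__(self):
        self.dens_score = None  # only assigned during predict()

    def predict(self, val_df):
        # placeholder for the real density/peak scoring computation
        self.dens_score = [0.1, 0.2]
        return self.dens_score

    def _require_predicted(self, attr):
        # raise a helpful error instead of an opaque downstream failure
        if getattr(self, attr, None) is None:
            raise AttributeError(
                f"`{attr}` has not been assigned; call `predict()` first."
            )

    def group_cross_val(self, val_df):
        self._require_predicted("dens_score")
        return sum(self.dens_score)
```

Calling `group_cross_val` before `predict` then fails with an explicit message rather than a confusing `NoneType`/`ndarray` error.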
Ok, that was pretty dumb on my side: I didn't define the loaded model like `d = disc.load(path)`. Doing this loads the indicated model, and the methods I mentioned above work with the exception of `cluster_avg`, which gives `TypeError: 'numpy.ndarray' object is not callable`. Running `disc.predict` doesn't seem to affect the `load` method.

Interestingly though, if I try to load a model I fitted on Google Colab, the loading doesn't always work. Most of the time it returns `ModuleNotFoundError: No module named 'crabnet.model'`, even though I download the disc.pkl file as-is from the Colab folder through Drive. I'll try to get some more information about this problem.
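For context on why that `ModuleNotFoundError` can appear: pickle stores the *import path* of each object's class, so a `.pkl` written in an environment where `crabnet.model` existed cannot be loaded where that module is gone or renamed. A toy demonstration (none of this is mat_discover code):

```python
import pickle
import sys
import types

# Build a throwaway module and a class that claims to live in it.
mod = types.ModuleType("old_module")

class Model:
    pass

Model.__module__ = "old_module"   # pretend Model is defined in old_module
Model.__qualname__ = "Model"
mod.Model = Model
sys.modules["old_module"] = mod

blob = pickle.dumps(mod.Model())  # pickle records "old_module.Model"

del sys.modules["old_module"]     # simulate the module being removed/renamed
try:
    pickle.loads(blob)
except ModuleNotFoundError as exc:
    print(exc)  # No module named 'old_module'
```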
> ok that was pretty dumb on my side, I didn't define the loaded model like `d = disc.load(path)`. Doing this loads the indicated model, and the methods I mentioned above work with the exception of `cluster_avg`, which gives `TypeError: 'numpy.ndarray' object is not callable`. Running `disc.predict` doesn't seem to affect the `load` method.

No worries. Ok, yeah, I'll need to check into why that is for `cluster_avg`.

> Interestingly though, if I try to load a model I fitted on Google Colab, the loading doesn't always work. Most of the time it returns `ModuleNotFoundError: No module named 'crabnet.model'`, even though I download the disc.pkl file as-is from the Colab folder through Drive. I'll try to get some more information about this problem.

I ran into this issue before and couldn't figure it out at the time, so I opened a separate GitHub issue for it.
See #39
@ancarnevali The issue is probably a difference in mat_discover versions. When I bumped from version 1 to version 2, I replaced `crabnet.model` with `crabnet.crabnet_` and got rid of `train_crabnet.py`. This was part of a series of changes I've been wanting to make for a long time. If you install e.g. 1.3.1, I think you will be able to load the models you fitted before. `pickle` is easy, but maintaining `load`/`save` version compatibility when breaking changes occur can be tough (maybe there's an easier way I'm not aware of). Let me know if there's anything you need on this front.
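A toy sketch of the "module alias" trick sometimes used to unpickle files that reference a renamed module (analogous to `crabnet.model` → `crabnet.crabnet_`). The module names here are stand-ins, and this only helps when the class itself is otherwise unchanged; pinning the old version (e.g. `pip install mat_discover==1.3.1`, as suggested above) is the safer route:

```python
import pickle
import sys
import types

# Create a toy "old" module and pickle an instance from it.
old = types.ModuleType("old_location")

class Model:
    value = 42

Model.__module__ = "old_location"
Model.__qualname__ = "Model"
old.Model = Model
sys.modules["old_location"] = old

blob = pickle.dumps(Model())     # pickle records "old_location.Model"
del sys.modules["old_location"]  # simulate upgrading past the rename

# Register the class under its new home, then alias the old name to it.
new = types.ModuleType("new_location")
new.Model = Model
sys.modules["new_location"] = new
sys.modules["old_location"] = new  # the alias that makes old pickles load

obj = pickle.loads(blob)  # resolves again via the alias
print(obj.value)  # 42
```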
Btw, I checked back into `cluster_avg` and it looks like it's not a method, but an array that only gets assigned under certain conditions.

Condition:
`mat_discover/mat_discover/mat_discover_.py`, line 554 in `025ac8a`

`mat_discover/mat_discover/mat_discover_.py`, lines 634 to 639 in `025ac8a`
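Since `cluster_avg` is an array attribute that is only assigned on some code paths, a defensive read avoids both the `TypeError` (from calling it) and an `AttributeError` (when it was never set). The class here is illustrative only; the attribute name comes from the thread:

```python
import numpy as np

class DiscoverLike:
    """Illustrative stand-in for the fitted object in the thread."""
    pass

d = DiscoverLike()
d.cluster_avg = np.array([1.0, 2.0, 3.0])  # set only under certain conditions

cluster_avg = getattr(d, "cluster_avg", None)
if cluster_avg is not None:
    # Treat it as data: index or aggregate d.cluster_avg, never call it.
    print(cluster_avg.mean())
```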