judftteam / masci-tools
Post-processing toolkit for electronic structure calculations
Home Page: https://masci-tools.readthedocs.io
License: MIT License
'use_input_alat' should be properly defined, i.e. without <> around it
appeared in:
2020-02-03 15:47:55 [478 | REPORT]: [817|kkr_startpot_wc|start]: INFO: started VoroStart workflow version 0.10.6
2020-02-03 15:47:55 [479 | REPORT]: [817|kkr_startpot_wc|start]: INFO: use the following parameter:
use_mpi: True
Resources: {'num_machines': 1}
Walltime (s): 3600
queue name:
scheduler command:
description:
label:
dos_params: {'emax': 1, 'emin': -1, 'kmesh': [30, 30, 30], 'nepts': 61, 'tempr': 200}
Max. number of voronoi reruns: 8
factor cluster increase: 1.1
default cluster radius (in alat): 0.7212881311274263
min. number of atoms in screening cls: 60
min. dist in DOS contour to emin/emax: 1.0 eV
threshold where DOS is zero: 0.01 states/eV
minimal distance of highest core state from EMIN: 0.2 Ry
2020-02-03 15:47:57 [480 | REPORT]: [817|kkr_startpot_wc|run_voronoi]: INFO: input params: {'FCM': None, 'INS': None, 'EMAX': None, 'EMIN': None, 'GMAX': 60.0, 'ICST': None, 'IMIX': None, 'IRID': 600, 'LMAX': 3, 'NAEZ': None, 'NMIN': None, 'NPOL': None, 'NPT1': None, 'NPT2': None, 'NPT3': None, 'RMAX': 6.0, 'EFSET': None, 'FILES': None, 'IEMXD': None, 'IPAND': 100, 'KVREL': None, 'NATYP': None, 'NCHEB': None, 'NSPIN': 2, 'R_LOG': None, 'TEMPR': None, '': None, 'BRYMIX': None, 'HFIELD': None, 'ITDBRY': None, 'JIJRAD': None, 'KEXCOR': None, 'KSHAPE': None, 'N1SEMI': None, 'N2SEMI': None, 'N3SEMI': None, 'NSHELD': None, 'NSTEPS': None, 'QBOUND': None, 'RUNOPT': ['FINDSIZE', 'MPIenerg', 'WRITEALL'], 'STRMIX': None, 'TKSEMI': None, 'VCONST': None, '': None, '': None, '': None, 'BRAVAIS': None, 'CPAINFO': None, 'EMUSEMI': None, 'LINIPOL': None, 'NPAN_EQ': None, 'RCLUSTZ': 0.8, 'TESTOPT': None, 'XINIPOL': None, '': None, '': None, '': None, '': None, '': None, '': None, '': None, 'BZDIVIDE': None, 'EBOTSEMI': None, 'JIJRADXY': None, 'JIJSITEI': None, 'JIJSITEJ': None, 'NAT_LDAU': None, 'NLEFTHOS': None, 'NPAN_LOG': None, 'NPOLSEMI': None, 'NRIGHTHO': None, 'RCLUSTXY': None, 'ZPERIODL': None, 'ZPERIODR': None, '': None, '': None, '': None, '': None, '': None, '': None, '': None, 'ALATBASIS': 22.980959400836, 'CARTESIAN': None, 'FSEMICORE': None, 'INTERFACE': None, 'KREADLDAU': None, 'LAMBDA_XC': None, 'LDAU_PARA': None, '': None, '': None, '<use_input_alat>': True}
2020-02-03 15:47:57 [481 | REPORT]: [817|kkr_startpot_wc|run_voronoi]: INFO: setting KSHAPE to default value 2
2020-02-03 15:47:57 [482 | REPORT]: [817|kkr_startpot_wc|run_voronoi]: INFO: setting INS to default value 1
2020-02-03 15:47:57 [483 | REPORT]: [817|kkr_startpot_wc|run_voronoi]: INFO: setting RCLUSTZ to 0.8
2020-02-03 15:47:57 [484 | REPORT]: [817|kkr_startpot_wc|run_voronoi]: INFO: setting RMAX to 6.0 (needed for DOS check with KKRcode)
2020-02-03 15:47:57 [485 | REPORT]: [817|kkr_startpot_wc|run_voronoi]: INFO: setting GMAX to 60.0 (needed for DOS check with KKRcode)
2020-02-03 15:47:58 [487 | REPORT]: [817|kkr_startpot_wc|on_except]: Traceback (most recent call last):
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/process_states.py", line 220, in execute
result = self.run_fn(*self.args, **self.kwargs)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/aiida/engine/processes/workchains/workchain.py", line 181, in _do_step
finished, stepper_result = self._stepper.step()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/workchains.py", line 283, in step
finished, result = self._child_stepper.step()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/workchains.py", line 515, in step
finished, result = self._child_stepper.step()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/workchains.py", line 283, in step
finished, result = self._child_stepper.step()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/workchains.py", line 234, in step
return True, self._fn(self._workchain)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/src/aiida-kkr/aiida_kkr/workflows/voro_start.py", line 393, in run_voronoi
params = update_params_wf(self.ctx.last_params, updatenode)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/aiida/engine/processes/functions.py", line 191, in decorated_function
result, _ = run_get_node(*args, **kwargs)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/aiida/engine/processes/functions.py", line 163, in run_get_node
result = process.execute()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/aiida/engine/processes/functions.py", line 374, in execute
result = super(FunctionProcess, self).execute()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/processes.py", line 88, in func_wrapper
return func(self, *args, **kwargs)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/processes.py", line 1065, in execute
return self.future().result()
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/futures.py", line 34, in result
return super(Future, self).result(timeout)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result
raise_exc_info(self._exc_info)
File "", line 4, in raise_exc_info
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/plumpy/process_states.py", line 220, in execute
result = self.run_fn(*self.args, **self.kwargs)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/lib/python3.7/site-packages/aiida/engine/processes/functions.py", line 422, in run
result = self._func(*args, **kwargs)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/src/aiida-kkr/aiida_kkr/tools/common_workfunctions.py", line 60, in update_params_wf
nodedesc=nodedesc, **updatenode_dict)
File "/Users/kovacik/CODES/venv_aiida1.0.1_py3/src/aiida-kkr/aiida_kkr/tools/common_workfunctions.py", line 105, in update_params
raise InputValidationError('unvalid key "{}" in input parameter node'.format(key))
aiida.common.exceptions.InputValidationError: unvalid key "<use_input_alat>" in input parameter node
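Following the hint at the top of this issue, the fix on the user side is presumably just to pass the key without the angle brackets. A minimal sketch (the helper name is made up for illustration):

```python
def fix_use_input_alat(params):
    """Rename the invalid bracketed key to the accepted bare form."""
    fixed = dict(params)
    if '<use_input_alat>' in fixed:
        # '<use_input_alat>' triggers the InputValidationError above;
        # the bare name 'use_input_alat' is the properly defined key
        fixed['use_input_alat'] = fixed.pop('<use_input_alat>')
    return fixed
```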
Hello,
I'd like to suggest setting multiply_by_equiv_atoms=True
as the default for DOS plots, because the other setting is just confusing. In some systems this isn't too important, but if you have a lot of symmetry it can distort the plot a lot.
f, ax = plt.subplots(1,1, figsize=np.array([6,4])*1.2, constrained_layout=True)
with HDF5Reader("../../calc/YIG/PBE/kpts=0003/DOS/banddos.hdf") as h5reader:
data, attributes = h5reader.read(recipe=FleurDOS)
plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=1.5, axis=ax,
show=False, title="PBE", legend_options={'fontsize':12},
limits={'energy':[-8.5,8.5]}, legend=False, spin_arrows=False,
colors=["k", "C0", "C1", "C2", "C3", "C4"],
linestyle=["-", "-", "-", "-", "--", "-"]
)
handles, labels = ax.get_legend_handles_labels()
handles = handles[:-(len(handles) // 2)]
labels = labels[:-(len(labels) // 2)]
bandgap = 0.4386253446
ax.axvspan(0, bandgap, facecolor='r', alpha=0.1)
ax.text(bandgap, -61, r"$\leftarrow$0.44 eV ", fontsize=14)
labels = transform_legend(labels)
ax.set_xlim([-8.5,8.5])
ax.set_ylim([-65,65])
ax.legend(handles, labels, ncol=4, fontsize=12, bbox_to_anchor=(0.95,-0.18))
ax.text(-8.5, 50, r"$\uparrow$", fontsize=26)
ax.text(-8.5, -60, r"$\downarrow$", fontsize=26)
plt.savefig("YIG_PBE.pdf")
plt.show()
Notice that the oxygen is only dashed for spin up and that the x-axis somehow is also dashed.
The size of these files can be reduced a lot. Some ideas:
I missed the convergence results routines when converting the bokeh plots to the new framework. Since they use bokeh_line, they still use the old interface and do not work.
This would make it easier to reuse paths for e.g. bandstructures
Now that the bokeh version is no longer pinned, we can start using latex labels, especially in the fleur visualizations
At the moment the GreensFunction
class is very tied to the Fleur IO and the LS basis.
It would be useful to have the ability to transform into different decompositions, e.g. from the d Green's function to e2g/t2g blocks or from ls to j.
The idea would be to make the GreensFunction class more generic: instead of spins, there could be just the concept of blocks in the matrix (inspired by Green's functions in TRIQS), and either subclassing or completely separating it for Fleur.
At the moment there are two problems with bandstructure plots, especially with a large number of bands:
(h5py
has some way of doing that) reading the datasets in the HDF5Reader

We should investigate whether the bokeh version limit of 1.4.0
can be lifted. The main reason for it was version conflicts in the tornado
requirements between different libraries, which at least aiida-core
no longer has in its current releases, since it switched to asyncio
Following the documentation I tried to do this:
#Read in data
with HDF5Reader("PBE/kpts=0003/DOS/banddos.hdf") as h5reader:
data, attributes = h5reader.read(recipe=FleurDOS)
#Plot the data
#Notice that you get the axis object of this plot is returned
#if you want to make any special additions
ax = plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=1.5, limits={'x': (-5,5)})
The linewidth one works, but the limits one returns:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-11-09c6c8132087> in <module>
6 #Notice that you get the axis object of this plot is returned
7 #if you want to make any special additions
----> 8 ax = plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=1.5, limits={'x': (-5,5)})
~/.local/lib/python3.8/site-packages/masci_tools/vis/fleur.py in plot_fleur_dos(dosdata, attributes, spinpol, bokeh_plot, multiply_by_equiv_atoms, plot_keys, show_total, show_interstitial, show_sym, show_atoms, show_lresolved, key_mask, **kwargs)
364 legend_labels, keys = np.array(legend_labels)[key_mask].tolist(), np.array(keys)[key_mask].tolist()
365
--> 366 kwargs = _process_dos_kwargs(keys, **kwargs)
367
368 if bokeh_plot:
~/.local/lib/python3.8/site-packages/masci_tools/vis/fleur.py in _process_dos_kwargs(ordered_keys, **kwargs)
404 new_dict[ordered_keys.index(plot_label)] = new_dict.pop(plot_label)
405 else:
--> 406 raise ValueError(f'The label {plot_label} is not a valid label for the current plot')
407 kwargs[key] = new_dict
408 return kwargs
ValueError: The label x is not a valid label for the current plot
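Since the other examples in this thread address the energy axis through the 'energy' key, a small guard like the following makes the mistake obvious before plotting (a hypothetical helper, not part of masci-tools; the set of valid keys is an assumption based on this thread):

```python
# Assumed valid keys for the DOS plot, based on working examples in this
# thread; 'dos' for the other axis is a guess
VALID_LIMIT_KEYS = {'energy', 'dos'}

def check_limit_keys(limits):
    """Raise early with a readable message instead of the deep ValueError."""
    unknown = set(limits) - VALID_LIMIT_KEYS
    if unknown:
        raise ValueError(f'Unknown limits keys: {sorted(unknown)}; '
                         f'valid keys: {sorted(VALID_LIMIT_KEYS)}')
    return limits
```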
The plotdos routine:
https://masci-tools.readthedocs.io/en/latest/module_guide/code.html?highlight=plot_dos#masci_tools.vis.plot_methods.plot_dos
has a saveas
argument, which I call with saveas="bla.pdf"
. This results in a file called bla.pdf.png
. This is a bit unfortunate, since often you want a pdf, rather than a png (it's a vector graphic).
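A sketch of how the saveas handling could respect an already-supplied extension instead of unconditionally appending '.png' (a hypothetical helper, not the current plot_dos behaviour):

```python
import os

def resolve_saveas(saveas, default_ext='png'):
    # If the caller already supplied an extension (e.g. 'bla.pdf'),
    # keep it; only append the default when none is present
    root, ext = os.path.splitext(saveas)
    if ext:
        return saveas
    return f'{saveas}.{default_ext}'
```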
validate_nmmpmat
thinks a completely fine mmpmat is broken (it thinks a negative element not on the diagonal is actually on the diagonal).
For reference:
0.9907556309755 -0.0000000000000 0.0002048894567 -0.0000361273993 -0.0010011912548 0.0003644034368 -0.0003398363314
0.0001962043484 0.0004184671113 -0.0003511347013 0.0000576882419 -0.0000687500194 -0.0003113085253 0.0005392008958
0.0002048894567 0.0000361273993 0.9891233299310 -0.0000000000000 -0.0003179049980 0.0000560553310 -0.0006909750415
0.0002514934955 -0.0003935419412 0.0002272112948 0.0000824154988 -0.0000691551997 0.0000576926863 -0.0000687553967
-0.0010011912548 -0.0003644034368 -0.0003179049980 -0.0000560553310 0.9916535865630 0.0000000000000 -0.0004028194591
0.0000710280120 -0.0011476749278 0.0004177194512 -0.0003935541763 0.0002272183286 0.0004184660833 -0.0003511338264
-0.0003398363314 -0.0001962043484 -0.0006909750415 -0.0002514934955 -0.0004028194591 -0.0000710280120 0.9898854083391
0.0000000000000 -0.0004028111370 0.0000710265332 -0.0006909739179 0.0002514930826 -0.0003398349087 0.0001962035728
0.0004184671113 0.0003511347013 -0.0003935419412 -0.0002272112948 -0.0011476749278 -0.0004177194512 -0.0004028111370
-0.0000710265332 0.9916535874154 -0.0000000000000 -0.0003179086904 0.0000560559875 -0.0010011859333 0.0003644014831
0.0000576882419 0.0000687500194 0.0000824154988 0.0000691551997 -0.0003935541763 -0.0002272183286 -0.0006909739179
-0.0002514930826 -0.0003179086904 -0.0000560559875 0.9891233311464 -0.0000000000000 0.0002049028181 -0.0000361297769
-0.0003113085253 -0.0005392008958 0.0000576926863 0.0000687553967 0.0004184660833 0.0003511338264 -0.0003398349087
-0.0001962035728 -0.0010011859333 -0.0003644014831 0.0002049028181 0.0000361297769 0.9907556255140 0.0000000000000
0.3379221511060 -0.0000000000000 0.3688647472021 -0.0650407342032 0.1096535510850 -0.0399106080703 -0.1192407294700
0.0688435699516 -0.0906753568609 0.0760854554711 0.0486221614704 -0.0579455174541 0.0769609015452 -0.1332998236474
0.3688647472021 0.0650407342032 0.4929169673635 0.0000000000000 0.2859576634634 -0.0504219769857 -0.0007231486045
0.0002632978522 -0.0832062106398 0.0480390526968 0.0006171368347 -0.0005177372601 0.0486216945239 -0.0579449420778
0.1096535510850 0.0399106080703 0.2859576634634 0.0504219769857 0.4019891403577 -0.0000000000000 0.3141404071897
-0.0553913675733 0.0873273533334 -0.0317846408193 -0.0832066141528 0.0480392931632 -0.0906756792209 0.0760857341635
-0.1192407294700 -0.0688435699516 -0.0007231486045 -0.0002632978522 0.3141404071897 0.0553913675733 0.4919142779375
0.0000000000000 0.3141393535378 -0.0553911792004 -0.0007231618951 0.0002633029195 -0.1192400231383 0.0688431522061
-0.0906753568609 -0.0760854554711 -0.0832062106398 -0.0480390526968 0.0873273533334 0.0317846408193 0.3141393535378
0.0553911792004 0.4019887148110 0.0000000000000 0.2859586259080 -0.0504221488659 0.1096545477629 -0.0399109778759
0.0486221614704 0.0579455174541 0.0006171368347 0.0005177372601 -0.0832066141528 -0.0480392931632 -0.0007231618951
-0.0002633029195 0.2859586259080 0.0504221488659 0.4929170453269 0.0000000000000 0.3688636472000 -0.0650405377722
0.0769609015452 0.1332998236474 0.0486216945239 0.0579449420778 -0.0906756792209 -0.0760857341635 -0.1192400231383
-0.0688431522061 0.1096545477629 0.0399109778759 0.3688636472000 0.0650405377722 0.3379197726207 0.0000000000000
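For illustration, the check that appears to go wrong can be reduced to this: only the true diagonal entries (the occupations) need to be non-negative, while negative off-diagonal elements like those in the matrix above are perfectly fine. A minimal sketch, not the actual validate_nmmpmat code:

```python
def diagonal_occupations_ok(block):
    """Check a square density-matrix block given as a list of rows.

    Only the actual diagonal entries must be non-negative; a correct
    check must index the block as [i][i], not confuse flat positions
    with (row, column) pairs.
    """
    return all(block[i][i] >= 0.0 for i in range(len(block)))
```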
Up until now the output of the outxml_parser tries to mirror the current status of the hardcoded parser in aiida-fleur as closely as possible. Here we collect things that should be added/modified in the output. Feel free to add to this list.
charge_density: maybe not the most telling name (density_convergence would be an alternative, for example).
In the atomGroup section of the inp.xml all of the tags from the species can occur, overriding the tags set on the species (version constraint?). This should be included in get_parameter_data. Since this is a feature that has no large usage at the moment, this is low priority.
I am trying to plot a spin-polarized DOS and I have two complaints about the little arrows that indicate which spin is up and which is down. First, the arrows don't scale with the plot size, so they can be too large in small plots or vice versa:
My main complaint, however, is that the spin_arrows=False
option doesn't work:
f, ax = plt.subplots(1,1, figsize=np.array([6,4])*1.2, constrained_layout=True)
with HDF5Reader("../../calc/YIG/PBE/kpts=0003/DOS/banddos.hdf") as h5reader:
data, attributes = h5reader.read(recipe=FleurDOS)
plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=1.5, axis=ax,
show=False, title="PBE", legend_options={'fontsize':12},
limits={'energy':[-8.5,8.5]}, legend=False, spin_arrows=False)
plt.savefig("PBE.pdf")
plt.show()
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
/tmp/ipykernel_24650/841951966.py in <module>
4 data, attributes = h5reader.read(recipe=FleurDOS)
5
----> 6 plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=1.5, axis=ax,
7 show=False, title="PBE", legend_options={'fontsize':12},
8 limits={'energy':[-8.5,8.5]}, legend=False, spin_arrows=False)
~/.local/lib/python3.8/site-packages/masci_tools/vis/fleur.py in plot_fleur_dos(dosdata, attributes, spinpol, bokeh_plot, multiply_by_equiv_atoms, plot_keys, show_total, show_interstitial, show_sym, show_atoms, show_lresolved, key_mask, **kwargs)
381 dosdata_up = [dosdata[key].to_numpy() for key in keys if '_up' in key]
382 dosdata_dn = [dosdata[key].to_numpy() for key in keys if '_down' in key]
--> 383 fig = plot_spinpol_dos(dosdata['energy_grid'], dosdata_up, dosdata_dn, plot_label=legend_labels, **kwargs)
384 else:
385 dosdata_up = [dosdata[key].to_numpy() for key in keys if '_up' in key]
~/.local/lib/python3.8/site-packages/masci_tools/vis/parameters.py in ensure_consistency(*args, **kwargs)
77
78 try:
---> 79 res = func(*args, **kwargs)
80 except Exception:
81 plotter_object.remove_added_parameters()
~/.local/lib/python3.8/site-packages/masci_tools/vis/plot_methods.py in plot_spinpol_dos(energy_grid, spin_up_data, spin_dn_data, saveas, energy_label, dos_label, title, xyswitch, energy_grid_dn, e_fermi, spin_dn_negative, **kwargs)
1767
1768 with NestedPlotParameters(plot_params):
-> 1769 ax = multiple_scatterplots(x,
1770 y,
1771 xlabel=xlabel,
~/.local/lib/python3.8/site-packages/masci_tools/vis/parameters.py in ensure_consistency(*args, **kwargs)
77
78 try:
---> 79 res = func(*args, **kwargs)
80 except Exception:
81 plotter_object.remove_added_parameters()
~/.local/lib/python3.8/site-packages/masci_tools/vis/plot_methods.py in multiple_scatterplots(xdata, ydata, xlabel, ylabel, title, saveas, axis, xerr, yerr, area_curve, **kwargs)
353 **kwargs)
354 else:
--> 355 result = ax.errorbar(x, y, yerr=yerrt, xerr=xerrt, **plot_kw, **kwargs)
356 colors.append(result.lines[0].get_color())
357
~/.local/lib/python3.8/site-packages/matplotlib/__init__.py in inner(ax, data, *args, **kwargs)
1359 def inner(ax, *args, data=None, **kwargs):
1360 if data is None:
-> 1361 return func(ax, *map(sanitize_sequence, args), **kwargs)
1362
1363 bound = new_sig.bind(ax, *args, **kwargs)
~/.local/lib/python3.8/site-packages/matplotlib/axes/_axes.py in errorbar(self, x, y, yerr, xerr, fmt, ecolor, elinewidth, capsize, barsabove, lolims, uplims, xlolims, xuplims, errorevery, capthick, **kwargs)
3339 # that would call self._process_unit_info again, and do other indirect
3340 # data processing.
-> 3341 (data_line, base_style), = self._get_lines._plot_args(
3342 (x, y) if fmt == '' else (x, y, fmt), kwargs, return_kwargs=True)
3343
~/.local/lib/python3.8/site-packages/matplotlib/axes/_base.py in _plot_args(self, tup, kwargs, return_kwargs)
535
536 if return_kwargs:
--> 537 return list(result)
538 else:
539 return [l[0] for l in result]
~/.local/lib/python3.8/site-packages/matplotlib/axes/_base.py in <genexpr>(.0)
530 labels = [label] * n_datasets
531
--> 532 result = (make_artist(x[:, j % ncx], y[:, j % ncy], kw,
533 {**kwargs, 'label': label})
534 for j, label in enumerate(labels))
~/.local/lib/python3.8/site-packages/matplotlib/axes/_base.py in _makeline(self, x, y, kw, kwargs)
352 default_dict = self._getdefaults(set(), kw)
353 self._setdefaults(default_dict, kw)
--> 354 seg = mlines.Line2D(x, y, **kw)
355 return seg, kw
356
~/.local/lib/python3.8/site-packages/matplotlib/lines.py in __init__(self, xdata, ydata, linewidth, linestyle, color, marker, markersize, markeredgewidth, markeredgecolor, markerfacecolor, markerfacecoloralt, fillstyle, antialiased, dash_capstyle, solid_capstyle, dash_joinstyle, solid_joinstyle, pickradius, drawstyle, markevery, **kwargs)
395 # update kwargs before updating data to give the caller a
396 # chance to init axes (and hence unit support)
--> 397 self.update(kwargs)
398 self.pickradius = pickradius
399 self.ind_offset = 0
~/.local/lib/python3.8/site-packages/matplotlib/artist.py in update(self, props)
1060 func = getattr(self, f"set_{k}", None)
1061 if not callable(func):
-> 1062 raise AttributeError(f"{type(self).__name__!r} object "
1063 f"has no property {k!r}")
1064 ret.append(func(v))
AttributeError: 'Line2D' object has no property 'spin_arrows'
Somehow it doesn't get caught and just drops down to the lowest level.
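A sketch of the expected behaviour: plot-level options such as spin_arrows should be popped from kwargs before the rest is forwarded to matplotlib, otherwise they reach Line2D and raise the AttributeError above (the function name here is illustrative, not the actual masci-tools code):

```python
def plot_spinpol_dos_sketch(**kwargs):
    # Consume the plot-level option before anything is forwarded on;
    # matplotlib only sees what is left in kwargs afterwards
    spin_arrows = kwargs.pop('spin_arrows', True)
    # ... the arrows would only be drawn here if spin_arrows is True ...
    return spin_arrows, kwargs  # kwargs is now safe to forward
```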
Here issues are collected that occurred during the Fleur workshop related to masci-tools.
A function that is used in the KKR parsers to search output files slows down for some reason if the output file is larger. This can be seen in the attached screenshot where the search alone is completed in 2ms, the reading of the outputfile alone also takes 2ms but when both steps are combined the parser takes 300ms.
These things I plan to work on for the plotting methods, but probably won't get to until after the Fleur workshop:
data argument (https://stackoverflow.com/questions/39919170/matplotlib-scatter-plot-how-to-use-the-data-argument). We can use that to unify the interface for passing data and provide a unified way to still pass direct lists to plot (look at bokeh_line for example)
color={'MT:1': 'red'} without needing to know the exact order of the plots
plot_methods and bokeh_plots
periodic_table_plot by introducing the BokehPlotter. Here care must be taken with ambiguities.
At the moment, to use the colors from the Jülich corporate design in plots, one must look up the hex values and use them manually.
It would be nice to have something like use_jülich_colors=True
to use colors cycling through the corporate colors.
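A minimal sketch of such a cycling helper; the hex values below are placeholders for illustration, not the verified corporate palette:

```python
from itertools import cycle, islice

# Placeholder hex values; the real Jülich palette would have to be
# looked up from the corporate design guidelines
JUELICH_COLORS = ['#023d6b', '#adbde3', '#eb5f73', '#b9d25f', '#fab45a']

def juelich_color_cycle(n):
    """Return n colors, cycling through the palette as often as needed."""
    return list(islice(cycle(JUELICH_COLORS), n))
```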
Additionally, I painfully experienced the kinds of issues it can cause when fonts are not completely embedded in PDF files produced by matplotlib. This can be solved by setting mpl.rcParams['pdf.fonttype'] = 42. I don't know whether this should be the default, since it would make plots larger due to the embedded fonts.
I think some parts of this repository could benefit from being exposed through a command line interface. The list below is not complete:
The current way of providing data for plots in the plot_methods is through arrays or lists. The dimension checking is quite fragile and also varies from method to method (on both the develop
and plot_methods_refactor
branches)
We should have:
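As a sketch of what a unified dimension check might look like (a hypothetical helper, not the actual API):

```python
def normalize_xy(x, y):
    """Coerce x/y into equal-length lists of equally sized datasets.

    Accepts a single x grid shared by several y datasets, or one x grid
    per y dataset; anything inconsistent raises one uniform error.
    """
    if not isinstance(y[0], (list, tuple)):
        y = [list(y)]
    else:
        y = [list(col) for col in y]
    if not isinstance(x[0], (list, tuple)):
        x = [list(x)] * len(y)  # share the single grid across datasets
    else:
        x = [list(col) for col in x]
    if len(x) != len(y) or any(len(a) != len(b) for a, b in zip(x, y)):
        raise ValueError('x and y dimensions are inconsistent')
    return x, y
```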
The following inpgen input fails in read_inpgen_file
PrAlGe ! latest template
&input film=f cartesian=f cal_symm=f /
-0.291202873 0.291202873 1.000000000 ! a1
0.291202873 -0.291202873 1.000000000 ! a2
0.291202873 0.291202873 -1.000000000 ! a3 and dvac
13.970477352011365855224 ! lattice constant in bohr. 1 bohr= 0.529177249 ai
1.0 1.0 1.0 ! scale
6
13 0.8326614023758784 0.8326614023758784 0.0000000000000000
13 0.5826614023758784 0.0826614023758784 0.5000000000000000
32 0.9991821765633356 0.9991821765633356 0.0000000000000000
32 0.7491821765633356 0.2491821765633357 0.5000000000000000
59 0.4166564210607859 0.4166564210607859 -0.0000000000000000
59 0.1666564210607859 0.6666564210607858 0.5000000000000000
&soc 0.0 0.0 /
&atom element="Al" id=13 bmu=0.0 /
&atom element="Ge" id=32 bmu=0.0 /
&atom element="Pr" id=59 lo="5s 5p" bmu=4.0 /
&kpt div1=12 div2=12 div3=12 /
&end /
With
/opt/masci-tools/masci_tools/io/fleur_inpgen.py in <listcomp>(.0)
569 raise ValueError('Too few lines found for lattice+atom information')
570 cell_information, atom_information = lattice_information[:5], lattice_information[5:]
--> 571 cell = np.array([[float(val) for val in value.split()] for value in cell_information[:3]])
572
573 global_scaling = float(cell_information[3])
ValueError: could not convert string to float: '!'
Probably, as a first step, the comments need to be stripped from the input.
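A minimal sketch of such a comment-stripping step, assuming '!' always starts a comment in inpgen input:

```python
def strip_comments(lines, comment_char='!'):
    """Drop trailing comments and skip lines that become empty.

    Run before any float conversion, so tokens like '! a1' never reach
    float() as in the traceback above.
    """
    cleaned = []
    for line in lines:
        content = line.split(comment_char, 1)[0].strip()
        if content:
            cleaned.append(content)
    return cleaned
```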
For example the LDA+U density matrix is written as a matrix of complex numbers, which makes the output very concise but requires special handling when reading it.
The type is StringVecType, so a special alias needs to be introduced once again, e.g. FortranComplex and FortranComplexVecType.
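For illustration, such a FortranComplex conversion might boil down to something like this (a sketch, assuming the Fortran '(re,im)' text form):

```python
def parse_fortran_complex(token):
    """Convert a Fortran-style complex literal like '(1.0,-2.5)'."""
    # Strip the surrounding parentheses, then split real and imaginary parts
    re_part, im_part = token.strip().lstrip('(').rstrip(')').split(',')
    return complex(float(re_part), float(im_part))
```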
scf namelist allowed
Open question:
Should there be some way of forbidding new features with a given inpgen version argument?
When trying to run a CPA calculation using aiida-kkr, the system fails to set up RMTCORE:
raise TypeError('Error: array input not consistent for key {}'.format(key))
TypeError: Error: array input not consistent for key <RMTCORE>
My guess is that there is a size inconsistency somewhere, here, I'm unsure if RMTCORE
should be of natyp
size or naez
size.
listargs = dict([['<RBASIS>', naez], ['<RBLEFT>', nlbasis], ['<RBRIGHT>', nrbasis], ['<SHAPE>', natyp],
['<ZATOM>', natyp], ['<SOCSCL>', natyp], ['<SITE>', natyp], ['<CPA-CONC>', natyp],
['<KAOEZL>', nlbasis], ['<KAOEZR>', nrbasis], ['XINIPOL', natyp], ['<RMTREF>', natyp],
['<RMTREFL>', nlbasis], ['<RMTREFR>', nrbasis], ['<FPRADIUS>', natyp], ['BZDIVIDE', 3],
['<RBLEFT>', nrbasis], ['ZPERIODL', 3], ['<RBRIGHT>', nrbasis], ['ZPERIODR', 3],
['LDAU_PARA', 5], ['CPAINFO', 2], ['<DELTAE>', 2], ['FILES', 2], ['DECIFILES', 2],
['<RMTCORE>', natyp], ['<AT_SCALE_BDG>', natyp]])
# deal with special stuff for voronoi:
if self.__params_type == 'voronoi':
listargs['<RMTCORE>'] = natyp
self.update_to_voronoi()
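The error message above suggests a length check roughly equivalent to this sketch (not the actual aiida-kkr code):

```python
def check_array_consistency(params, listargs):
    """Verify every array-valued input has the length the table prescribes."""
    for key, expected in listargs.items():
        value = params.get(key)
        if value is not None and len(value) != expected:
            # mirrors the TypeError seen in the issue above
            raise TypeError('Error: array input not consistent for key {}'.format(key))
```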
Some parts of the schema dictionary could be made case insensitive to make it easier to find attributes or tags.
For this the following needs to be done:
This is only relevant for the entries attrib_types, unique_attribs, unique_path_attribs, other_attribs and tag_paths.
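A minimal sketch of the lookup side, where keys are normalized to lowercase on every access (just an illustration, not the proposed implementation):

```python
class CaseInsensitiveDict(dict):
    """Dict whose keys are folded to lowercase on set, get and contains."""

    def __init__(self, data=None):
        super().__init__()
        for key, value in (data or {}).items():
            self[key] = value  # goes through __setitem__, so keys are folded

    def __setitem__(self, key, value):
        super().__setitem__(key.lower(), value)

    def __getitem__(self, key):
        return super().__getitem__(key.lower())

    def __contains__(self, key):
        return super().__contains__(key.lower())
```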
This inp.xml has 4 atom species (2 Fe, 1 O, 1 Y):
inp.zip
it results in this banddos:
banddos.zip
Then I plot them like this:
with HDF5Reader('banddos.hdf') as h5reader:
data, attributes = h5reader.read(recipe=FleurDOS)
#Plot the data
#Notice that you get the axis object of this plot is returned
#if you want to make any special additions
ax = plot_fleur_dos(data, attributes, show_atoms='all')
example with the test file in tests/files/hdf5_reader/banddos_spinpol_dos.hdf
from masci_tools.io.parsers.hdf5 import HDF5Reader
from masci_tools.io.parsers.hdf5.recipes import FleurDOS
from masci_tools.vis.fleur import plot_fleur_dos
with HDF5Reader(file) as h5reader:
data, attributes = h5reader.read(recipe=FleurDOS)
plot_fleur_dos(data, attributes, show=False, area_plot={'MT:1_up': True,'MT:1_down': True}, area_alpha=0.3)
Gives
It can be seen that the color from the area plot is not from the selected palette and it also messes up the colors of the plots after it
One thing that the schema dictionaries cannot handle, period, is the renaming of tags and attributes. It might be nice if there were an aliasing mechanism, where these renamings are collected in one central place and the schema dictionaries could understand both the old and new names and return the correct paths. It should be possible to switch this off.
However, one downside of this would be that code using aliases could potentially be more confusing.
I have an orbcomp DOS with keys like ORB:12,ind:3_up
and I try to plot it like:
with HDF5Reader("banddos.hdf") as h5reader:
data, attributes = h5reader.read(recipe=FleurORBCOMP)
for indx_list in [[1], [2,3,4], [5,6,7,8,9], [10,11,12,13,14,15,16]]:
f, ax = plt.subplots(1,1,figsize=(8,6))
for indx in indx_list:
print(indx)
plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=2, axis=ax,
show=False, title="PBE0", legend_options={'fontsize':14}, weights=f"ORB:13,ind:{indx}_up",
limits={'x':[1.7,1.9], 'y':[-0.5,0]}, legend=False, spin_arrows=False)
and this triggers the following error:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
/tmp/ipykernel_35667/1664073625.py in <module>
3 for indx in indx_list:
4 print(f"ORB:13,ind:{indx}_up")
----> 5 plot_fleur_dos(data, attributes, multiply_by_equiv_atoms=True, linewidth=2, axis=ax,
6 show=False, title="PBE0", legend_options={'fontsize':14}, weights=f"ORB:13,ind:{indx}_up",
7 limits={'x':[1.7,1.9], 'y':[-0.5,0]}, legend=False, spin_arrows=False)
~/masci-tools/masci_tools/vis/fleur.py in plot_fleur_dos(dosdata, attributes, spinpol, multiply_by_equiv_atoms, plot_keys, show_total, show_interstitial, show_sym, show_atoms, show_lresolved, key_mask, backend, **kwargs)
358
359 spinpol = attributes['spins'] == 2 and spinpol and any('_down' in key for key in dosdata.keys())
--> 360 legend_labels, keys = _generate_dos_labels(dosdata, attributes, spinpol)
361
362 if key_mask is None:
~/masci-tools/masci_tools/vis/fleur.py in _generate_dos_labels(dosdata, attributes, spinpol)
482 types_elements.append(attributes['atoms_elements'][ind])
483
--> 484 for key in sorted(dosdata.keys(), key=_dos_order):
485 if key == 'energy_grid':
486 continue
~/masci-tools/masci_tools/vis/fleur.py in _dos_order(key)
448 return (spin, general.index(key))
449 elif ':' in key:
--> 450 before, after = key.split(':')
451
452 tail = after.lstrip('0123456789')
ValueError: too many values to unpack (expected 2)
I guess the second :
is causing a headache. @janssenhenning I will send you a link to my banddos.hdf via Mattermost. I don't want to put it on GitHub.
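For reference, limiting the split to the first ':' would keep the remainder intact and fix the unpacking error; a minimal sketch:

```python
def split_dos_key(key):
    """Split a DOS key into the part before and after the first ':'.

    maxsplit=1 keeps everything after the first ':' intact, so keys
    with a second ':' like 'ORB:13,ind:3_up' no longer break unpacking.
    """
    before, after = key.split(':', 1)
    return before, after
```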
We need to preserve the species information in get_structuredata
and get_parameterdata
to make roundtrips via the inpgen possible, since we always need the id.
Therefore we need to normalize the species name to the format 'name-id', since the input parameters and StructureData need to keep this information independently
There are currently two different approaches used in fleur and get_parameterdata:
fleur/inpgen uses id = index of occurrence of the species, if not specified (though I don't know if this holds under all circumstances)
get_parameterdata uses id = index of occurrence of species with the given element
The second approach is more natural in my opinion, but it means renaming more of the species coming directly from fleur.
Another approach is to not change the species name if it already conforms to the valid format, and only change the ones that do not, like the default names Iron (Fe). There we have to be careful not to double the species names.
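A sketch of the normalization described above; the 'name-id' pattern check is an assumption about what counts as the valid format:

```python
import re

def normalize_species_name(name, species_id):
    """Normalize a species name to the 'name-id' format.

    Names already in that form are left untouched so the id suffix is
    never doubled; everything else gets the id appended.
    """
    if re.fullmatch(r'.+-\d+', name):
        return name
    return f'{name}-{species_id}'
```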
Hi! I added the branch support/aiida-1.X. It is based on release 0.13.0. It should remain fixed at that version and not be updated.
This branch is aimed at those who need a Python environment with aiida-core[atomic-tools] 1.x (1.6.9 = last before 2.0) and masci-tools[cmdline-extras] installed.
masci-tools[cmdline-extras] 0.14.0 added the requirement pymatgen-io-fleur~=0.4,>=0.4.1. This, in turn, requires pymatgen==2022.7.8 resp. pymatgen>=2022.7.8, while aiida-core[atomic-tools] 1.6.9 has the constraint pymatgen!=2019.9.7, <=2022.02.03 and >=2019.7.2.
This dependency conflict can be fixed by downgrading to masci-tools[cmdline-extras] 0.13.0, i.e. this branch pinned at 0.13.0.
(Here is a non-public reference for an AiiDA JupyterHub installation where this issue occurred. Here, the installation of aiida-core[atomic-tools] 1.6.9 and masci-tools[cmdline-extras] 0.13.0 succeeded. Here, the installation of aiida-core[atomic-tools] 1.6.9 and masci-tools[cmdline-extras] 0.14.0 failed due to the dependency conflict described above.)
Here some ideas for useful xml modifying functions beyond the ones available are collected.
This issue sketches a design to enable easier construction of complex xpath expressions from simple ones without needing to fall back to knowing the full xpath.
The current workflow for this is
#Get the simple xpath from the schema dictionary
simple = schema_dict.tag_xpath(tag_name)
#Build your complex xpath manually with string manipulation
...
#Evaluating the complex xpath
This is used in the xml_setters, in set_species for example, to select specific species.
This building of the complex xpath could be designed in a more concise way. This is just a sketch of how the usage of this functionality could look. What I have in mind is only adding so-called predicates, not wildcard expressions or relative expressions (https://www.w3schools.com/xml/xpath_syntax.asp):
lo_path = schema_dict.tag_xpath('lo', contains='species')
builder = XPathBuilder(lo_path, filters={
'species': {'index': {'<=': 3 }}, #Get the first three species nodes
'lo': {'type': {'==': 'SCLO'}}
})
xpath = builder.build() #/fleurInput/atomSpecies/species[position()<=3]/lo[@type='SCLO']
The syntax of the filters dictionary is borrowed from the AiiDA QueryBuilder. I don't know if this is the best approach, but it seems like a robust starting point to me.
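A minimal sketch of how such a builder could translate the filters dictionary into predicates. The class and method names follow the sketch above; the implementation itself is an assumption, not an existing API:

```python
class XPathBuilder:
    """Sketch: attach predicates from a filters dict to the tags of an xpath."""

    OPERATORS = {'==': '=', '<': '<', '<=': '<=', '>': '>', '>=': '>='}

    def __init__(self, xpath, filters=None):
        self.xpath = xpath
        self.filters = filters or {}

    def build(self):
        parts = self.xpath.split('/')
        for index, tag in enumerate(parts):
            predicates = []
            for name, conditions in self.filters.get(tag, {}).items():
                for operator, value in conditions.items():
                    xpath_operator = self.OPERATORS[operator]
                    if name == 'index':
                        # 'index' is translated to the position() function
                        predicates.append(f'position(){xpath_operator}{value}')
                    else:
                        # everything else is treated as an attribute condition
                        formatted = f"'{value}'" if isinstance(value, str) else value
                        predicates.append(f'@{name}{xpath_operator}{formatted}')
            parts[index] = tag + ''.join(f'[{p}]' for p in predicates)
        return '/'.join(parts)

builder = XPathBuilder('/fleurInput/atomSpecies/species/lo',
                       filters={'species': {'index': {'<=': 3}},
                                'lo': {'type': {'==': 'SCLO'}}})
# -> /fleurInput/atomSpecies/species[position()<=3]/lo[@type='SCLO']
```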
Ultimately this allows adding an argument filters (or where) to specify these filters for the xml_setters/xml_getters.
More things that could be added on top of this:
Extending XPathBuilder with a SchemaXPathBuilder with additional validation and functionality from the schema dictionaries.
The recommended safe way to do these variable expressions is /fleurInput/atomSpecies/species[position()<=$index]/lo[@type=$type], passing the actual values as variables to the XPath engine. This could also be supported by this builder mechanism.

I am trying to plot a bandstructure with lines instead of dots. I tried to use the option line_plot=True
documented here:
I did it in this Jupyter notebook, but I'll try to copy it here as well:
line_plot.zip
from masci_tools.io.parsers.hdf5 import HDF5Reader
from masci_tools.io.parsers.hdf5.recipes import FleurBands
from masci_tools.vis.fleur import plot_fleur_bands
#Read in data
with HDF5Reader('banddos.hdf') as h5reader:
data, attributes = h5reader.read(recipe=FleurBands)
#Plot the data
#Note that the axis object of this plot is returned
#if you want to make any special additions
ax = plot_fleur_bands(data, attributes, limits={'y': (-3,5)}, line_plot=True)
results in
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-25-afd0580a8c3a> in <module>
12 #Notice that you get the axis object of this plot is returned
13 #if you want to make any special additions
---> 14 ax = plot_fleur_bands(data, attributes, limits={'y': (-3,5)}, line_plot=True)
~/.local/lib/python3.8/site-packages/masci_tools/vis/fleur.py in plot_fleur_bands(bandsdata, bandsattributes, spinpol, only_spin, bokeh_plot, weight, **kwargs)
276 else:
277 if spinpol:
--> 278 fig = plot_spinpol_bands(bandsdata['kpath'],
279 bandsdata['eigenvalues_up'],
280 bandsdata['eigenvalues_down'],
~/.local/lib/python3.8/site-packages/masci_tools/vis/parameters.py in ensure_consistency(*args, **kwargs)
77
78 try:
---> 79 res = func(*args, **kwargs)
80 except Exception:
81 plotter_object.remove_added_parameters()
~/.local/lib/python3.8/site-packages/masci_tools/vis/plot_methods.py in plot_spinpol_bands(kpath, bands_up, bands_dn, size_data, show_spin_pol, special_kpoints, e_fermi, xlabel, ylabel, title, saveas, markersize_min, markersize_scaling, scale_color, **kwargs)
1965 plot_params.set_defaults(default_type='function', sub_colormap=(0.15, 1.0))
1966
-> 1967 ax = multi_scatter_plot([kpath, kpath], [bands_up, bands_dn],
1968 size_data=size_data,
1969 xlabel=xlabel,
~/.local/lib/python3.8/site-packages/masci_tools/vis/parameters.py in ensure_consistency(*args, **kwargs)
77
78 try:
---> 79 res = func(*args, **kwargs)
80 except Exception:
81 plotter_object.remove_added_parameters()
~/.local/lib/python3.8/site-packages/masci_tools/vis/plot_methods.py in multi_scatter_plot(xdata, ydata, size_data, color_data, xlabel, ylabel, title, saveas, axis, **kwargs)
476 plot_kw.pop('color')
477
--> 478 res = ax.scatter(x, y=y, s=size, c=color, **plot_kw, **kwargs)
479 if plot_kw.get('label', None) is not None and color is not None:
480 if isinstance(color, (list, np.ndarray, pd.Series)):
/usr/local/anaconda3/lib/python3.8/site-packages/matplotlib/__init__.py in inner(ax, data, *args, **kwargs)
1359 def inner(ax, *args, data=None, **kwargs):
1360 if data is None:
-> 1361 return func(ax, *map(sanitize_sequence, args), **kwargs)
1362
1363 bound = new_sig.bind(ax, *args, **kwargs)
/usr/local/anaconda3/lib/python3.8/site-packages/matplotlib/axes/_axes.py in scatter(self, x, y, s, c, marker, cmap, norm, vmin, vmax, alpha, linewidths, edgecolors, plotnonfinite, **kwargs)
4595 )
4596 collection.set_transform(mtransforms.IdentityTransform())
-> 4597 collection.update(kwargs)
4598
4599 if colors is None:
/usr/local/anaconda3/lib/python3.8/site-packages/matplotlib/artist.py in update(self, props)
1060 func = getattr(self, f"set_{k}", None)
1061 if not callable(func):
-> 1062 raise AttributeError(f"{type(self).__name__!r} object "
1063 f"has no property {k!r}")
1064 ret.append(func(v))
AttributeError: 'PathCollection' object has no property 'line_plot'
At the moment, XML attributes are restricted to a single value from the point of view of the masci-tools functions. Unifying the format for attribute and text types would allow multi-value cases (note that there are already special cases for the forcetheorem attributes). But since the usage of multi-value attributes is very limited, this is not a very pressing issue.
I think we should create a new release very soon, since we could then move forward with the fleur interfaces to ase and pymatgen. Below I post a few questions/thoughts on this:
Should this release be 0.5.0, since it adds new features with the inpgen IO functions?
Is there something that should still be added @PhilippRue @broeder-j? For example, I see a branch with small changes, combime_imp_masci_tool, which would need to be rebased/merged.
@Irratzo The constants versioning should probably also be included, right? Could you open a PR for that?
The PR #54 should be merged after the release for Jens' HTC paper, I think, since it has a large impact on the plotting functions. And though I'm confident it breaks/changes no existing plots, I would feel even more confident if we had this functionality on the develop branch for a while so that more people test it.
If the spin index is missing, it should be applied to both spins, i.e. linestyle={'MT:1': '--'} should set the linestyle for the spin-up and, if available, spin-down components.
Caveat: the pattern matching will sum up MT:1 and MT:10 weights for DOS plots.
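The summing behavior comes from the fact that a simple glob pattern for 'MT:1' also matches 'MT:10'. A minimal illustration (the key names are illustrative):

```python
from fnmatch import fnmatch

# Hypothetical weight keys as they might appear for a DOS plot
weight_keys = ['MT:1', 'MT:10', 'MT:2']
pattern = 'MT:1*'

# Both 'MT:1' and 'MT:10' match the pattern,
# so their weights would be summed together
matched = [key for key in weight_keys if fnmatch(key, pattern)]
```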
The package size of masci-tools is quite large (5.8 MB); the wheel is still OK. We should think about separating the tests and data files from the source code.
At the moment, updating the pytest-mpl baseline images is a real pain, since we have multiple directories of baseline images with similar names, and pytest-mpl can only generate baseline images into one flat folder, where images with the same name simply overwrite each other.
The easiest solution would be to move the test functions outside of classes and put the plot function name in the test function name. Then we can define one baseline directory and use the standard @pytest.mark.mpl_image_compare decorator without options to generate the images.
It should be possible to have predefined tasks in default_parser_tasks.py which are only performed if the user specifically asks for them.
A first example could be parsing the core-level states from the out.xml.
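A minimal sketch of how such on-request tasks could be selected. The '_optional' flag and the task layout are assumptions, not the actual format of default_parser_tasks.py:

```python
# Hypothetical task definitions; only the '_optional' flag matters here
default_parser_tasks = {
    'general_out_info': {'_optional': False},
    'corelevels': {'_optional': True},  # only parsed on explicit request
}

def tasks_to_perform(tasks, requested=()):
    """Select all non-optional tasks plus explicitly requested optional ones."""
    return [name for name, spec in tasks.items()
            if not spec.get('_optional', False) or name in requested]
```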
This is a nice-to-have feature to round off the flexibility of the fleur-schema system.
For example, on iffaiida, in order to add a fleur schema you currently have to download it manually into the container. It would be nice if add-fleur-schema could download the schema files from the fleur git (from the develop branch by default, but a specific branch should also be possible).
At the moment, most functions inside the common_xml_util and schema_dict_util modules get an argument parser_info_out or conversion_warnings, where errors are collected. This is nice for the Fleur parsers: it means that not many things can cause the parser to crash and produce no output, and in most cases we still return the unprocessed string from the XML file. However, for using these functions outside the parsers this is slightly annoying, because you can never be certain whether they succeeded.
Solutions:
Let the conversion functions (e.g. calculate_expression, convert_from_fortran_bool) raise exceptions when they are called without a parser_info_out argument.
An additional solution would be to replace parser_info_out with a logger object and provide a custom Handler that produces the dictionary for the parser output. This would have the additional benefit of giving people the possibility of providing their own logger to these functions if they are reused in a different context.
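A minimal sketch of the logger-based variant: a custom logging.Handler that collects messages into a dictionary shaped like a parser report. The logger name and report keys are illustrative assumptions:

```python
import logging

class DictHandler(logging.Handler):
    """Collect log records into a dict resembling the parser output."""

    def __init__(self):
        super().__init__()
        self.report = {'parser_warnings': [], 'parser_errors': []}

    def emit(self, record):
        message = self.format(record)
        if record.levelno >= logging.ERROR:
            self.report['parser_errors'].append(message)
        elif record.levelno >= logging.WARNING:
            self.report['parser_warnings'].append(message)

# The XML utility functions could then accept any logger
# instead of a parser_info_out dict
logger = logging.getLogger('masci_tools.example_parser')  # name is illustrative
logger.propagate = False  # keep the messages out of the root logger
handler = DictHandler()
logger.addHandler(handler)

logger.warning('Failed to convert value; returning the unprocessed string')
```

Users who reuse the functions elsewhere could simply pass their own logger and attach whatever handlers they like.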
One problem I noticed during development of the parsers is that it is very easy to mutate the schema dictionaries, since they are just lists and dicts. So forgetting to copy some entry before doing something with it can have unintended consequences. This is not ideal when there could be a variety of tools relying on the schema dict in the future.
I have some ideas to implement a subclass of UserDict (the same thing I did in #15 to get case-insensitive key lookup), which can prevent modifications to itself and will immediately raise an error if someone tries.
This is a reminder to myself that I should do something about this at some point.
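A minimal sketch of such a locking UserDict subclass; the class name and locking strategy are illustrative, not the masci-tools implementation:

```python
from collections import UserDict

class LockableDict(UserDict):
    """Dict that refuses modification once locked."""

    def __init__(self, *args, **kwargs):
        self._locked = False  # stay mutable while the data is filled in
        super().__init__(*args, **kwargs)

    def freeze(self):
        self._locked = True

    def __setitem__(self, key, value):
        if self._locked:
            raise RuntimeError(f'Modification of key {key!r} not allowed: dict is locked')
        super().__setitem__(key, value)

    def __delitem__(self, key):
        if self._locked:
            raise RuntimeError(f'Deletion of key {key!r} not allowed: dict is locked')
        super().__delitem__(key)

# Usage: fill the dict during construction, then lock it before handing it out
schema_dict = LockableDict({'tag_paths': {'lo': '/fleurInput/atomSpecies/species/lo'}})
schema_dict.freeze()
# schema_dict['tag_paths'] = {}  # would now raise RuntimeError
```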
The addition of schema files for new versions is very easy now, and for the inpxml_parser and the XML modification methods/getters this works seamlessly.
However, the outxml_parser has an additional component in the parse task definitions: the version has to be entered explicitly into the set __working_out_versions__ at the top of the file in order to parse any file from that version.
An additional fallback, with a warning saying that the version was not explicitly stated as working and that errors can occur, would be nice. This would completely allow adding a schema from the outside without having to change any code and having everything work (unless, of course, someone broke functionality with changes in the schema).
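A minimal sketch of that fallback; the version strings are illustrative, and __working_out_versions__ is the set mentioned above:

```python
import warnings

__working_out_versions__ = {'0.33', '0.34'}  # illustrative contents

def check_output_version(version):
    """Warn instead of refusing when a version is not explicitly marked as working."""
    if version not in __working_out_versions__:
        warnings.warn(f'Output version {version} is not explicitly stated as working; '
                      'the parser will proceed, but errors can occur')

check_output_version('0.35')  # emits a UserWarning but does not abort
```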
The schema dictionaries for the fleur input and output schemas can only work off the information provided in the schema files. This means that some ambiguity discovered after the fact (i.e. after the schema is frozen for the Fleur release) is difficult to correct in the schema dictionaries and has to be handled with special cases. It would be useful to add a way to mutate the schema dictionary after creation, but before it is locked and stored in the cache, with custom functions to correct edge cases.
Of course this has to be done with a lot of care, but cases like changing the type of an attribute from string to float_expression should be no problem where this is necessary.
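A sketch of such a correction hook, applied after construction but before the schema dict is locked and cached. The 'attrib_types' layout shown here is a simplification, not the real schema dict structure:

```python
def patch_attrib_type(schema_dict, attrib_name, new_types):
    """Override the detected type of an attribute, e.g. string -> float_expression."""
    patched = dict(schema_dict)  # work on a shallow copy, leave the original intact
    attrib_types = dict(patched.get('attrib_types', {}))
    attrib_types[attrib_name] = list(new_types)
    patched['attrib_types'] = attrib_types
    return patched

# Hypothetical edge case: an attribute the schema declared as a plain string
raw = {'attrib_types': {'radius': ['string']}}
corrected = patch_attrib_type(raw, 'radius', ['float_expression'])
```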