
picaso's People

Contributors

allcontributors[bot], ayanaman271, caoimherooney11, dependabot-preview[bot], dependabot[bot], dreamjade, james-mang, kappibw, logan-pearce, martiancolonist, mcnixon96, nainas2024, natashabatalha, ninarobbins, rangertreaty33, sagnickm, ziva18t


picaso's Issues

Suggested modification to jdi.star when using a user star file

picaso/picaso/justdoit.py

Lines 658 to 674 in 0737df3

if (not isinstance(temp, type(None))):
    sp = psyn.Icat(database, temp, metal, logg)
    sp.convert("um")
    sp.convert('flam')
    wno_star = 1e4/sp.wave[::-1] #convert to wave number and flip
    flux_star = sp.flux[::-1]*1e8 #flip here and convert to ergs/cm3/s to get correct order
#but you can also upload a stellar spec of your own
elif (not isinstance(filename, type(None))):
    star = np.genfromtxt(filename, dtype=(float, float), names='w, f')
    flux = star['f']
    wave = star['w']
    #sort if not in ascending order
    sort = np.array([wave, flux]).T
    sort = sort[sort[:,0].argsort()]
    wave = sort[:,0]
    flux = sort[:,1]

In jdi.star() the expectation is that most people will use data from the database (fair), but this should be the first check rather than radius and semi-major axis, and the instructions should say NOT to give it a temperature when inputting your own file. It is probably smoother to just switch to checking for a user file first and then loading from the database.
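A sketch of the suggested reordering, with an illustrative function (the name and signature are made up, not picaso's actual jdi.star): check for a user file first, refuse a simultaneous temperature, and only then fall back to the grid:

```python
import numpy as np

def load_stellar_spectrum(temp=None, metal=None, logg=None,
                          filename=None, database='phoenix'):
    """Illustrative reordering of the checks in jdi.star (sketch only)."""
    if filename is not None:
        if temp is not None:
            raise ValueError("Do NOT give a temperature when "
                             "supplying your own stellar spectrum file")
        star = np.genfromtxt(filename, dtype=(float, float), names='w, f')
        order = np.argsort(star['w'])       # sort if not in ascending order
        return star['w'][order], star['f'][order]
    elif temp is not None:
        # fall back to the grid database (the pysynphot Icat path quoted above)
        raise NotImplementedError("grid lookup elided in this sketch")
    else:
        raise ValueError("provide either a filename or grid parameters")
```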

Add default 661 wavelength to wavelength.py

            #if it's a user specified pressure and wavenumber
            if (('pressure' in cols) & ('wavenumber' in cols)):
                df = df.sort_values(['pressure', 'wavenumber']).reset_index(drop=True)
                self.inputs['clouds']['wavenumber'] = df['wavenumber'].unique()
                nwave = len(self.inputs['clouds']['wavenumber'])
                nlayer = len(df['pressure'].unique())
                assert df.shape[0] == (self.nlevel-1)*nwave, "There are {0} rows in the df, which does not equal {1} layers previously specified x {2} wave pts".format(df.shape[0], self.nlevel-1, nwave) 
            
            #if its eddysed, make sure there are 196 wave points 
            else: 
                #assert df.shape[0] == (self.nlevel-1)*196, "There are {0} rows in the df, which does not equal {1} layers x 196 eddysed wave pts".format(df.shape[0], self.nlevel-1) 
                if df.shape[0] == (self.nlevel-1)*196 :
                    self.inputs['clouds']['wavenumber'] = get_cld_input_grid('wave_EGP.dat')
                elif df.shape[0] == (self.nlevel-1)*661:
                    self.inputs['clouds']['wavenumber'] = get_cld_input_grid('wave_EGP.dat',grid661=True)

and to wavelength.py

def get_cld_input_grid(filename_or_grid='wave_EGP.dat',grid661=False):
	"""
	The albedo code relies on the cloud code input, which is traditionally on a 196 wavelength grid. 
	This method is to retrieve that grid. This file should be kept in the package reference data. Alternatively, 
	it might be useful to add it directly to the input.cld file. 

	Parameters
	----------
	filename_or_grid : str or ndarray
		filename of the input grid OR a numpy array of wavenumber corresponding to cloud 
		input

	Returns 
	-------
	array 
		array of wave numbers in increasing order 
	"""
	if grid661 == True:
		grid,dwni_new = np.loadtxt(os.path.join(__refdata__, 'climate_INPUTS/wvno_661'),usecols=[0,1],unpack=True)
		return grid
	if filename_or_grid == 'wave_EGP.dat':
		grid = pd.read_csv(os.path.join(__refdata__, 'opacities',filename_or_grid), delim_whitespace=True)
		grid = grid.sort_values('wavenumber')['wavenumber'].values
	elif isinstance(filename_or_grid, np.ndarray):
		grid = np.sort(filename_or_grid)
	elif (isinstance(filename_or_grid, str) & (filename_or_grid != 'wave_EGP.dat') & 
		os.path.exists(filename_or_grid)):	
		grid = pd.read_csv(os.path.join(filename_or_grid), delim_whitespace=True)
		if 'wavenumber' in grid.keys():
			grid = grid.sort_values('wavenumber')['wavenumber'].values
		else: 
			raise Exception('Please make sure there is a column named "wavenumber" in your cloud wavegrid file')
	else:
		raise Exception("Please enter valid cloud wavegrid filepath, or numpy array. Or use default in reference file.")

	return grid

pi

def brightness_temperature(out_dict,plot=True, R = None):

The picaso flux here needs to be divided by pi before doing the brightness temperature calculation.
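For reference, the corrected calculation inverts the Planck function on the mean intensity F/pi rather than the raw flux; a minimal sketch in cgs units (not the actual justplotit.py code):

```python
import numpy as np

# physical constants in cgs
h = 6.62607015e-27    # Planck constant [erg s]
c = 2.99792458e10     # speed of light [cm/s]
k_b = 1.380649e-16    # Boltzmann constant [erg/K]

def brightness_temp(wno, flux):
    """Invert the Planck function for brightness temperature.

    wno  : wavenumber [cm^-1]
    flux : flux density [erg/cm^2/s/cm^-1]; divided by pi below to
           get the mean intensity before the inversion.
    """
    intensity = flux / np.pi   # the missing division by pi
    return (h * c * wno / k_b) / np.log1p(2.0 * h * c**2 * wno**3 / intensity)
```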

opacities database in tutorial notebook 9

In tutorial 9_SwappingOpacities, the notebook expects the opacity table to be named 'opacity.db', but the file I downloaded from Sonora is called 'opacities.db'.

Everything works if I change the notebook to expect 'opacities.db', and I didn't have a problem with the other tutorial notebooks. But I do wonder if there are other places that expect a different filename.

Error running channon_grid_high()

Not sure if I have the wrong file or something, but when I ran channon_grid_high(), I got the following error:

  File "/home/jwang/pylibs/psisim/psisim/spectrum.py", line 265, in generate_picaso_inputs
    params.channon_grid_high()
  File "/home/jwang/pylibs/picaso/picaso/justdoit.py", line 1233, in channon_grid_high
    df.loc[i,mols] = channon.loc[channon['pt_id'] == pair_for_layer,mols].values
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/indexing.py", line 716, in __setitem__
    iloc._setitem_with_indexer(indexer, value, self.name)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/indexing.py", line 1688, in _setitem_with_indexer
    self._setitem_with_indexer_split_path(indexer, value, name)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/indexing.py", line 1727, in _setitem_with_indexer_split_path
    self._setitem_with_indexer_2d_value(indexer, value)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/indexing.py", line 1797, in _setitem_with_indexer_2d_value
    self._setitem_single_column(loc, value[:, i].tolist(), pi)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/indexing.py", line 1885, in _setitem_single_column
    ser._mgr = ser._mgr.setitem((pi,), value)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/internals/managers.py", line 337, in setitem
    return self.apply("setitem", indexer=indexer, value=value)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/internals/managers.py", line 304, in apply
    applied = getattr(b, f)(**kwargs)
  File "/home/jwang/miniconda3/envs/psisim/lib/python3.10/site-packages/pandas/core/internals/blocks.py", line 937, in setitem
    values[indexer] = value
ValueError: setting an array element with a sequence.

I debugged this a bit: channon.loc[channon['pt_id'] == pair_for_layer, mols].values has shape [1, N], whereas df.loc[i, mols] expects an array of size N. It works for me if I modify line 1233 of justdoit.py to add a [0]:

df.loc[i,mols] = channon.loc[channon['pt_id'] == pair_for_layer,mols].values[0]

I just pulled the up-to-date version of the repo and installed PICASO. Perhaps it could be a versioning issue of a dependency? I am on pandas 1.4.1.
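A minimal reproduction of the shape mismatch with toy data (the column names and values are made up, only the indexing pattern matches the report):

```python
import numpy as np
import pandas as pd

# toy stand-in for the channon grid
channon = pd.DataFrame({'pt_id': [0, 1], 'H2O': [0.1, 0.2], 'CH4': [0.3, 0.4]})
mols = ['H2O', 'CH4']

# boolean-mask .loc keeps a 2D frame, so .values has shape (1, N)...
vals = channon.loc[channon['pt_id'] == 1, mols].values
assert vals.shape == (1, 2)

# ...while assigning to a single row wants a 1D array of length N,
# hence the proposed fix of taking the first row with [0]:
df = pd.DataFrame(np.zeros((2, 2)), columns=mols)
df.loc[0, mols] = vals[0]
```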

Climate models that aren't converging

Here is a place to keep track of climate models that do not converge. Please add:

  • code snippet with model setup (make sure to show what CK file you are using)
  • starting P-T guess
  • any other info we need to reproduce the run

Thermal Emission tutorial crashes on Fp/Fs

When running the Thermal Emission tutorial (https://natashabatalha.github.io/picaso/notebooks/5_AddingThermalFlux.html), everything works fine until the step 'Standard Relative Flux Fp/Fs.' The command
jpi.show(jpi.spectrum(wno,fpfs*1e6,plot_width=500,y_axis_type='log'))
then crashes with:

In [43]: jpi.show(jpi.spectrum(wno,fpfs*1e6,plot_width=500,y_axis_type='log'))
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Input In [43], in <cell line: 1>()
----> 1 jpi.show(jpi.spectrum(wno,fpfs*1e6,plot_width=500,y_axis_type='log'))

File /home/exolab/anaconda3/lib/python3.9/site-packages/picaso-3.0-py3.9.egg/picaso/justplotit.py:352, in spectrum(xarray, yarray, legend, wno_to_micron, palette, muted_alpha, **kwargs)
    350 else: 
    351     if isinstance(legend,type(None)):
--> 352         fig.line(conv(xarray), yarray,  color=palette[i], line_width=3)
    353     else:
    354         f = fig.line(conv(xarray), yarray, color=palette[i], line_width=3,
    355                         muted_color=palette[np.mod(i, len(palette))], muted_alpha=muted_alpha)

IndexError: tuple index out of range

Debugging shows that valid palette options were

ipdb> palette
('#0072B2', '#E69F00', '#F0E442', '#009E73', '#56B4E9', '#D55E00', '#CC79A7', '#000000')

But it's trying to index 'palette' with i=8 (i.e., the ninth entry), and there are only 8 values.

Many thanks! -Ian
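The muted_color branch quoted in the traceback already wraps its index with np.mod; a minimal sketch of applying the same wrap to the failing line (illustrative helper, not the actual justplotit.py code):

```python
import numpy as np

palette = ('#0072B2', '#E69F00', '#F0E442', '#009E73',
           '#56B4E9', '#D55E00', '#CC79A7', '#000000')

def cyclic_color(i, palette=palette):
    # wrap the index so any number of spectra reuses the palette
    # instead of running off the end at i = len(palette)
    return palette[np.mod(i, len(palette))]
```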

opacities.db vs. opacity.db

In this new CHIMERA-less version of the tutorial, the set-up at the top does not explicitly say where to put the opacity.db file. It should go in ../picaso_reference/opacities. Also, I think the file needs to be named "opacity.db", not "opacities.db"; the latter is the name of the file downloaded from Zenodo (https://zenodo.org/record/3759675#.Y7XMby-B2EB). Without renaming the file, I get the following error:


---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
<ipython-input> in <module>
      1 #load opacities
----> 2 opas = pj.opannection(wave_range=[1,12]) #defined in code 1
      3
      4 #load planet in the same way as before
      5 gj436_trans = pj.load_planet(choose_from.loc[choose_from['pl_name']==planet_name],

~/.conda/envs/propexo/lib/python3.7/site-packages/picaso/justdoit.py in opannection(wave_range, filename_db, raman_db, resample)
    364     filename_db = os.path.join(refdata, 'opacities', inputs['opacities']['files']['opacity'])
    365     if not os.path.isfile(filename_db):
--> 366         raise Exception('The opacity file does not exist: ' + filename_db + ' The default is to a file opacities.db in reference/opacity/. If you have an older version of PICASO your file might be called opacity.db. Consider just adding the correct path to filename_db=')
    367 elif not isinstance(filename_db, type(None)):
    368     if not os.path.isfile(filename_db):

Exception: The opacity file does not exist: /Users/jteske/Documents/staff/research/observing_proposals/propexo/picaso_reference/opacities/opacity.db The default is to a file opacities.db in reference/opacity/. If you have an older version of PICASO your file might be called opacity.db. Consider just adding the correct path to filename_db=

Potential issue in pre-mixed ck tables for Teq>=1700

For planets with equilibrium temperatures greater than about 1600-1700, we see temperature inversions at P<1e-4. @sagnickm has verified the issue. The problem goes away when using on-the-fly mixing, which hasn't been fully released yet (still in dev), so there isn't a direct solution just yet.

Temporary fix: Clip pressure grid at 1e-4. <- see update below, this does not work

Permanent fix:

  • use on the fly (release expected in ~1 month)
  • remake premixed files such that the problem goes away (release expected in ~1 month)

include warning on different environments in installation?

Hi @natashabatalha !

The installation instructions are super clear (importantly, since there's a lot to do). The one thing that wasn't the same for me, and might be different for other folks too, is that I'm not using bash. When I got a new Mac, it switched me over to zsh, so instead of editing ~/.bash_profile, I needed to include your export statements in my ~/.zshrc file. It might be worth giving people a heads up that it won't be .bash_profile for everyone.

It's a tiny thing, but your tutorial said to submit small things! Thanks!
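Such a heads-up could show the zsh variant explicitly; for example (the paths below are placeholders, not the official install instructions):

```shell
# zsh (the macOS default since Catalina): put the export statements
# in ~/.zshrc rather than ~/.bash_profile, then `source ~/.zshrc`
# or open a new terminal. Substitute your own install locations.
export picaso_refdata="$HOME/picaso/reference/"
export PYSYN_CDBS="$HOME/grp/redcat/trds"
```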

Issue specifying custom atmosphere

Hi. I'm just beginning to play around with Picaso. I wanted to compute a reflected light spectrum of Earth. So, following the "Getting Started : Basic Inputs and Outputs" in the docs, I put together a file for P, T and mixing ratios for Earth. I've attached it here: earth.pt.zip

I then run

opacity = jdi.opannection(wave_range=[0.3,2]) #lets just use all defaults
start_case = jdi.inputs()

#phase angle
start_case.phase_angle(0) #radians

#define gravity
start_case.gravity(gravity=9.81, gravity_unit=u.Unit('m/(s**2)')) #any astropy units available

#define star
start_case.star(opacity, 5000,0,4.0) #opacity db, temp, metallicity, logg

start_case.atmosphere(filename='earth.pt', delim_whitespace=True)

df = start_case.spectrum(opacity)

I get the following error

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Input In [8], in <cell line: 1>()
----> 1 df = start_case.spectrum(opacity)

File ~/miniconda3/envs/picaso/lib/python3.9/site-packages/picaso/justdoit.py:1980, in inputs.spectrum(self, opacityclass, calculation, dimension, full_output, plot_opacity, as_dict)
   1976     self.inputs['surface_reflect'] = 0 
   1977     self.inputs['hard_surface'] = 0 
-> 1980 return picaso(self, opacityclass,dimension=dimension,calculation=calculation,
   1981     full_output=full_output, plot_opacity=plot_opacity, as_dict=as_dict)

File ~/miniconda3/envs/picaso/lib/python3.9/site-packages/picaso/justdoit.py:139, in picaso(bundle, opacityclass, dimension, calculation, full_output, plot_opacity, as_dict)
    137 atm.get_mmw()
    138 atm.get_density()
--> 139 atm.get_altitude(p_reference = p_reference)#will calculate altitude if r and m are given (opposed to just g)
    140 atm.get_column_density()
    142 #gets both continuum and needed rayleigh cross sections 
    143 #relies on continuum molecules are added into the opacity 
    144 #database. Rayleigh molecules are all in `rayleigh.py` 

File ~/miniconda3/envs/picaso/lib/python3.9/site-packages/picaso/atmsetup.py:371, in ATMSETUP.get_altitude(self, p_reference, constant_gravity)
    368         gravity[i] = self.c.G * self.planet.mass / ( z[i] )**2  
    370     scale_h = self.c.k_b * tlevel[i] / (mmw[i] * gravity[i])
--> 371     dz[i] = scale_h*(np.log(plevel[i+1]/ plevel[i]))
    372     z[i-1] = z[i] + dz[i]
    374 self.level['z'] = z

IndexError: index 200 is out of bounds for axis 0 with size 200

Wondering if you could help spot the issue with my input file.

Issue with thermal contribution function and pcolormesh

Trying to make a thermal contribution function plot, I get the following error when using the command fig, ax, CF = jpi.thermal_contribution(spect_em['full_output'], norm=jpi.colors.LogNorm(vmin=1e7, vmax=1e11)):

---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Input In [12], in <cell line: 1>()
----> 1 fig, ax, CF = jpi.thermal_contribution(spect_em['full_output'], norm=jpi.colors.LogNorm(vmin=1e7, vmax=1e11))

File ~/picaso/picaso/justplotit.py:1605, in thermal_contribution(full_output, tau_max, **kwargs)
1600 fig, ax = plt.subplots()
1601 #if not isinstance( clim , type(None)):
1602 # CF_clipped = np.clip(CF, clim[0],clim[1])
1603 #else:
1604 # CF_clipped = CF+0
-> 1605 smap = ax.pcolormesh(1e4/full_output['wavenumber'], full_output['layer']['pressure'], CF, **kwargs)
1606 ax.set_ylim(np.max(full_output['layer']['pressure']), np.min(full_output['layer']['pressure']))
1607 ax.set_yscale('log')

File ~/opt/anaconda3/envs/picaso38/lib/python3.8/site-packages/matplotlib-3.5.2-py3.8-macosx-10.9-x86_64.egg/matplotlib/__init__.py:1412, in _preprocess_data.<locals>.inner(ax, data, *args, **kwargs)
1409 @functools.wraps(func)
1410 def inner(ax, *args, data=None, **kwargs):
1411 if data is None:
-> 1412 return func(ax, *map(sanitize_sequence, args), **kwargs)
1414 bound = new_sig.bind(ax, *args, **kwargs)
1415 auto_label = (bound.arguments.get(label_namer)
1416 or bound.kwargs.get(label_namer))

File ~/opt/anaconda3/envs/picaso38/lib/python3.8/site-packages/matplotlib-3.5.2-py3.8-macosx-10.9-x86_64.egg/matplotlib/axes/_axes.py:6058, in Axes.pcolormesh(self, alpha, norm, cmap, vmin, vmax, shading, antialiased, *args, **kwargs)
6055 shading = shading.lower()
6056 kwargs.setdefault('edgecolors', 'none')
-> 6058 X, Y, C, shading = self._pcolorargs('pcolormesh', *args,
6059 shading=shading, kwargs=kwargs)
6060 coords = np.stack([X, Y], axis=-1)
6061 # convert to one dimensional array

File ~/opt/anaconda3/envs/picaso38/lib/python3.8/site-packages/matplotlib-3.5.2-py3.8-macosx-10.9-x86_64.egg/matplotlib/axes/_axes.py:5568, in Axes._pcolorargs(self, funcname, shading, *args, **kwargs)
5566 if shading == 'flat':
5567 if (Nx, Ny) != (ncols + 1, nrows + 1):
-> 5568 raise TypeError('Dimensions of C %s are incompatible with'
5569 ' X (%d) and/or Y (%d); see help(%s)' % (
5570 C.shape, Nx, Ny, funcname))
5571 else: # ['nearest', 'gouraud']:
5572 if (Nx, Ny) != (ncols, nrows):

TypeError: Dimensions of C (43, 29957) are incompatible with X (29957) and/or Y (44); see help(pcolormesh)

This occurs within my own code but also when running the 5_AddingThermalFlux tutorial.
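The TypeError reflects matplotlib's dimension rule for pcolormesh: with shading='flat' (the default here), C of shape (nrows, ncols) needs X of length ncols+1 and Y of length nrows+1 (cell edges), while shading='nearest' wants exactly ncols and nrows (cell centers). Here C is (43, 29957) but X has 29957 values and Y has 44, so the inputs mix the two conventions. A toy mirror of the rule (sketch only, not matplotlib's code):

```python
def pcolormesh_shapes_ok(len_x, len_y, c_shape, shading='flat'):
    """Mirror matplotlib's pcolormesh dimension check (sketch).

    len_x, len_y : lengths of the X and Y coordinate arrays
    c_shape      : (nrows, ncols) of the color array C
    """
    nrows, ncols = c_shape
    if shading == 'flat':
        # flat shading wants cell EDGES: one more than each C dimension
        return (len_x, len_y) == (ncols + 1, nrows + 1)
    # 'nearest' / 'gouraud' want cell CENTERS: exactly the C dimensions
    return (len_x, len_y) == (ncols, nrows)
```

So passing 43 layer pressures with shading='nearest', or 29958 wavenumber edges and 44 level pressures with 'flat', would satisfy the check; the reported combination satisfies neither.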

To jdi.input.star add check to ensure stellar spectrum grid is more dense than opacity grid

To ensure that energy is being computed correctly during a climate calculation, there must be at least one stellar spectrum grid point within each opacity grid bin, else you will have bins with zero energy within them. If a user is inputting a stellar spectrum from a file rather than grabbing a spectrum from the phoenix database, the stellar spectrum may be less densely sampled than the opacity grid, giving regions of zero energy during the integration step. Add a check to this step to ensure that the stellar spectrum is more densely sampled than the opacity grid.

https://github.com/natashabatalha/picaso/blob/dc1b3eab2605575239cf1e643ead28e6b211eb79/picaso/justdoit.py#L1685C13-L1694C39
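A minimal sketch of such a check, assuming the opacity grid is available as wavenumber bin edges (the function name and signature are illustrative):

```python
import numpy as np

def check_stellar_sampling(wno_star, wno_opacity_edges):
    """Require at least one stellar grid point inside every opacity
    wavenumber bin, per this issue's suggestion.

    wno_star          : 1D array of stellar-spectrum wavenumbers [cm^-1]
    wno_opacity_edges : 1D ascending array of opacity bin edges [cm^-1]
    """
    counts, _ = np.histogram(wno_star, bins=wno_opacity_edges)
    empty = np.flatnonzero(counts == 0)
    if empty.size > 0:
        raise ValueError(
            f"{empty.size} opacity bins contain no stellar grid points; "
            "supply a more densely sampled stellar spectrum")
    return True
```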

setup.py install dependency issues

When installing PICASO 3.0 from git using the setup.py script, the install failed on pysynphot (and first on a missing numpy), which I had to fetch manually.

Initial run of setup.py install, in a totally fresh conda environment (python 3.8.13), with just git clone https://github.com/natashabatalha/picaso.git:

Reading https://pypi.org/simple/pysynphot/
Downloading https://files.pythonhosted.org/packages/53/7e/44eb1e24af0c81613cc591f31fbb614001d696ff889a032871d1c0f4d1df/pysynphot-2.0.0.tar.gz#sha256=45c29f69248ec8a641c38625d11409dd2411ea1d6faffd8c3b44da354c4d22e7
Best match: pysynphot 2.0.0
Processing pysynphot-2.0.0.tar.gz
Writing /var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-o3ewd7ts/pysynphot-2.0.0/setup.cfg
Running pysynphot-2.0.0/setup.py -q bdist_egg --dist-dir /var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-o3ewd7ts/pysynphot-2.0.0/egg-dist-tmp-v8tg9t3e
Traceback (most recent call last):
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 156, in save_modules
    yield saved
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 198, in setup_context
    yield
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 259, in run_setup
    _execfile(setup_script, ns)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 46, in _execfile
    exec(code, globals, locals)
  File "/var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-o3ewd7ts/pysynphot-2.0.0/setup.py", line 3, in <module>
    # This sample setup.py can be used as a template for any project using d2to1.
ModuleNotFoundError: No module named 'numpy'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "setup.py", line 42, in <module>
    setup(
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/__init__.py", line 87, in setup
    return distutils.core.setup(**attrs)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/_distutils/core.py", line 185, in setup
    return run_commands(dist)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
    dist.run_commands()
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/_distutils/dist.py", line 968, in run_commands
    self.run_command(cmd)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/dist.py", line 1217, in run_command
    super().run_command(command)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/_distutils/dist.py", line 987, in run_command
    cmd_obj.run()
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/install.py", line 74, in run
    self.do_egg_install()
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/install.py", line 131, in do_egg_install
    cmd.run(show_deprecation=False)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 420, in run
    self.easy_install(spec, not self.no_deps)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 662, in easy_install
    return self.install_item(None, spec, tmpdir, deps, True)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 709, in install_item
    self.process_distribution(spec, dist, deps)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 754, in process_distribution
    distros = WorkingSet([]).resolve(
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/pkg_resources/__init__.py", line 789, in resolve
    dist = best[req.key] = env.best_match(
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1075, in best_match
    return self.obtain(req, installer)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1087, in obtain
    return installer(requirement)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 681, in easy_install
    return self.install_item(spec, dist.location, tmpdir, deps)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 707, in install_item
    dists = self.install_eggs(spec, download, tmpdir)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 900, in install_eggs
    return self.build_and_install(setup_script, setup_base)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 1174, in build_and_install
    self.run_setup(setup_script, setup_base, args)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 1158, in run_setup
    run_setup(setup_script, args)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 262, in run_setup
    raise
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 198, in setup_context
    yield
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 169, in save_modules
    saved_exc.resume()
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 143, in resume
    raise exc.with_traceback(self._tb)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 156, in save_modules
    yield saved
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 198, in setup_context
    yield
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 259, in run_setup
    _execfile(setup_script, ns)
  File "/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/sandbox.py", line 46, in _execfile
    exec(code, globals, locals)
  File "/var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-o3ewd7ts/pysynphot-2.0.0/setup.py", line 3, in <module>
    # This sample setup.py can be used as a template for any project using d2to1.
ModuleNotFoundError: No module named 'numpy' 

Ran conda install numpy, then ran setup.py install again:

Reading https://pypi.org/simple/pysynphot/
Downloading https://files.pythonhosted.org/packages/53/7e/44eb1e24af0c81613cc591f31fbb614001d696ff889a032871d1c0f4d1df/pysynphot-2.0.0.tar.gz#sha256=45c29f69248ec8a641c38625d11409dd2411ea1d6faffd8c3b44da354c4d22e7
Best match: pysynphot 2.0.0
Processing pysynphot-2.0.0.tar.gz
Writing /var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-gb58f5o8/pysynphot-2.0.0/setup.cfg
Running pysynphot-2.0.0/setup.py -q bdist_egg --dist-dir /var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-gb58f5o8/pysynphot-2.0.0/egg-dist-tmp-qodtr664
/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
  warnings.warn(
listing git files failed - pretending there aren't any
listing git files failed - pretending there aren't any
no previously-included directories found matching 'build'
no previously-included directories found matching 'doc/build'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files matching '*.o' found anywhere in distribution
/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
In file included from pysynphot/src/pysynphot_utils.c:2:
In file included from /Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/numpy/core/include/numpy/arrayobject.h:5:
In file included from /Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:
In file included from /Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/numpy/core/include/numpy/ndarraytypes.h:1948:
/Users/kappi/miniconda3/envs/git_picaso/lib/python3.8/site-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:17:2: warning: "Using deprecated NumPy API, disable it with "          "#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-W#warnings]
#warning "Using deprecated NumPy API, disable it with " \
 ^
pysynphot/src/pysynphot_utils.c:40:27: warning: variable 'out_arr_len' is uninitialized when used here [-Wuninitialized]
  out_dim[0] = (npy_intp) out_arr_len;
                          ^~~~~~~~~~~
pysynphot/src/pysynphot_utils.c:6:24: note: initialize the variable 'out_arr_len' to silence this warning
  const int out_arr_len;
                       ^
                        = 0
2 warnings generated.
No eggs found in /var/folders/ft/tj4pw82n3w31kn4wlkk447n80000gn/T/easy_install-gb58f5o8/pysynphot-2.0.0/egg-dist-tmp-qodtr664 (setup script problem?)
error: The 'pysynphot' distribution was not found and is required by picaso

Solution : conda install -c conda-forge pysynphot

Issue with notebook 9d

For the command out3d = case_3d.spectrum(opacity,calculation='thermal',dimension='3d',full_output=True) I get:

/Users/tkataria/Research/picaso/picaso/atmsetup.py:132: UserWarning: Ignoring graphite in input file, not a recognized molecule
  warnings.warn("Ignoring %s in input file, not a recognized molecule" % i, UserWarning)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [4], in <cell line: 1>()
----> 1 out3d = case_3d.spectrum(opacity,calculation='thermal',dimension='3d',full_output=True)

File ~/Research/picaso/picaso/justdoit.py:2777, in inputs.spectrum(self, opacityclass, calculation, dimension, full_output, plot_opacity, as_dict)
   2773     self.inputs['surface_reflect'] = 0 
   2774     self.inputs['hard_surface'] = 0 
-> 2777 return picaso(self, opacityclass,dimension=dimension,calculation=calculation,
   2778     full_output=full_output, plot_opacity=plot_opacity, as_dict=as_dict)

File ~/Research/picaso/picaso/justdoit.py:144, in picaso(bundle, opacityclass, dimension, calculation, full_output, plot_opacity, as_dict)
    142 atm.get_mmw()
    143 atm.get_density()
--> 144 atm.get_altitude(p_reference = p_reference)#will calculate altitude if r and m are given (opposed to just g)
    145 atm.get_column_density()
    147 #gets both continuum and needed rayleigh cross sections 
    148 #relies on continuum molecules are added into the opacity 
    149 #database. Rayleigh molecules are all in `rayleigh.py` 

File ~/Research/picaso/picaso/atmsetup.py:355, in ATMSETUP.get_altitude(self, p_reference, constant_gravity)
    352 tlevel = self.level['temperature']
    353 plevel = self.level['pressure']
--> 355 if p_reference >= max(plevel):
    356     p_reference = plevel[0]
    358 z = np.zeros(np.shape(tlevel)) + self.planet.radius

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
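The crash comes from comparing an array-valued p_reference against a scalar; a toy reproduction and one plausible fix (illustrative values, not the actual picaso code):

```python
import numpy as np

plevel = np.logspace(-6, 2, 5)     # toy pressure grid
p_reference = np.full(2, 10.0)     # an array where a scalar was expected

# 'if p_reference >= max(plevel)' raises the ValueError above because
# a multi-element boolean array has no single truth value:
try:
    bool(p_reference >= max(plevel))
    ambiguous = False
except ValueError:
    ambiguous = True

# casting to a scalar restores a well-defined comparison (one fix):
p_ref = float(p_reference[0])
at_bottom = p_ref >= max(plevel)
```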

Smaller opacities.db option?

Small feature request:

It would be nice to run some of the tutorials in the docs on Google Colab as interactive demos. I figured out how to automate the environment setup, but the downloading of the ~6 GB opacity file makes it impractical (downloads on Colab are very slow). Maybe it would be useful to have a lower resolution version of the database so that users don't need to install picaso on their own machines for simpler, pedagogical purposes.

Thanks!

Add 'flux unit' parameter to inputs['star']

self.inputs['star']['wno'] = wno_planet

It would be nice to have an entry in the inputs object tracking the unit of flux that picaso is using. Phoenix models are in erg/cm^2/s/A, but picaso converts them to erg/cm^2/s/cm in the star function. Adding this entry would make the unit explicit for users.
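A minimal sketch of the requested bookkeeping; the 'flux_unit' key name is an assumption, and wno_planet is a placeholder array standing in for the quoted attribute:

```python
import numpy as np

wno_planet = np.linspace(1e3, 3e4, 10)   # placeholder wavenumber grid

inputs = {'star': {}}
inputs['star']['wno'] = wno_planet        # existing entry quoted above
# proposed addition: record the internal flux unit explicitly
inputs['star']['flux_unit'] = 'erg / cm^2 / s / cm'
```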

Spectrum from brown dwarf thermal emission tutorial is strange

I've been running the brown dwarf thermal emission tutorial directly from the provided Jupyter Notebook to get used to working with PICASO. However, each time I produce a spectrum that's completely different from the one shown in the documentation. In fact, my spectrum is several orders of magnitude off. Additionally, the separate line for the Sonora comparison doesn't even appear on the graph. I've attached a screenshot to show the odd spectrum.

[Attached screenshot: Screen Shot 2022-11-24 at 6 45 55 PM]

find_nearest takes first value if there are duplicates

If the cumulative opacities in photo_attenuation are 0 until a certain level, find_nearest picks the first matching index when it should be taking the last.

Suggest changing find_nearest to use np.where, i.e. numpy.where(x == x.min()), and take element [-1].
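A sketch of the suggested change (function name hypothetical):

```python
import numpy as np

def find_nearest_last(array, value):
    """Like find_nearest, but return the LAST index when several
    elements are equally close to `value` (e.g. a run of zeros)."""
    diff = np.abs(np.asarray(array, dtype=float) - value)
    return np.where(diff == diff.min())[0][-1]
```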

sqlite query for single ptid is throwing an error

It seems that when there is only one unique P-T index in the opacity database, the index value gets formatted as a one-element tuple, causing an error in the query.

Should the ind_pt query also have a special case similar to the molecule query, like:

unique_ptind = np.unique(ind_pt)

if len(unique_ptind) == 1:
    query_ptid = """ptid= '{}' """.format(str(unique_ptind[0]))
else:
    query_ptid = 'ptid in ' + str(tuple(unique_ptind))

cur.execute("""SELECT molecule,ptid,opacity
            FROM molecular
            {}
            AND {}""".format(query_mol, query_ptid))
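The root cause is how Python stringifies a one-element tuple, which produces SQL that sqlite rejects:

```python
# str() of a one-element tuple keeps the trailing comma, so the query
# becomes "ptid in (5,)", which is not valid SQL
assert str(tuple([5])) == '(5,)'
assert str(tuple([5, 6])) == '(5, 6)'
# the proposed special case formats a single index as an equality instead
assert """ptid= '{}' """.format(str(5)) == "ptid= '5' "
```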

Add opacity confusion to FAQs

FAQ:
"I'm confused about the opacity files. Version 1.0 (opacity.db) is the low res version that covers the "useful" wavelengths (0.3 to 14 micron) while version 2.0 has two files that cover the 0.6-6 micron at higher resolution and 4.8-15 micron. What should I download? And once I download it, where do I put them? If it's just the one file that seems straightforward in terms of where to put it but what happens when there are two files (or all 3) in the same folder?"

Answer:
What do I download?

The low-sampling files covering a large wavelength range, on Zenodo as V1, are great for quick calculations that don't necessarily need to be accurate. For example: proposals, example models, any testing, retrievals on fake data.

However, when comparing to real data, it's important to use higher sampling; this tutorial shows users the estimated sampling errors. The higher sampling files are located here under V2.

So depending on your use case, the answer might be: download both!

Where do I put all the files?

PICASO uses the function justdoit.opannection to grab the opacity file located in the reference directory opacities. In the installation instructions you will notice there is a step to place the zenodo file here. Just for completeness, internally, we specify the name of this file here.

The general recommendation is to keep one "default" file in your reference/opacities folder so that you do not need to worry about always specifying a file when running the code. Then assign one easy-to-locate place for the rest of the files. In order to access these, you will need to point to the file path using the filename_db keyword in opannection.

Phase Curve zero point not explicitly defined

Pointed out by Tiffany Kataria

The phase curve code does not explicitly define a zero point. Instead, it sets the zero point at secondary eclipse, which is not intuitive given the standards in the phase curve literature.

Need to:

  1. add an explicit zero_point option to atmosphere_4d and clouds_4d
  2. change the notebooks to reflect this change

Separately, the atmosphere and cloud routines need to start off with a deepcopy so that users' notebook variables are not reset.
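The deepcopy point can be demonstrated with a minimal sketch (function name and dict layout hypothetical):

```python
from copy import deepcopy

def atmosphere_4d_safe(user_profile):
    """Copy the user's input up front so in-place edits inside the
    routine never mutate variables in the caller's notebook."""
    profile = deepcopy(user_profile)
    profile['temperature'] = [t + 10 for t in profile['temperature']]
    return profile

original = {'temperature': [100, 200]}
shifted = atmosphere_4d_safe(original)
assert original['temperature'] == [100, 200]   # caller's variable untouched
assert shifted['temperature'] == [110, 210]
```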

Mismatch in keywords and used variables.

In jdi.star(): the keyword requested is w_units and f_units. The variable used is w_unit and f_unit.

picaso/picaso/justdoit.py

Lines 608 to 610 in 0737df3

def star(self, opannection, temp=None, metal=None, logg=None, radius=None, radius_unit=None,
         semi_major=None, semi_major_unit=None,
         database='ck04models', filename=None, w_units=None, f_units=None):

picaso/picaso/justdoit.py

Lines 675 to 687 in 0737df3

if w_unit == 'um':
    WAVEUNITS = 'um'
elif w_unit == 'nm':
    WAVEUNITS = 'nm'
elif w_unit == 'cm':
    WAVEUNITS = 'cm'
elif w_unit == 'Angs':
    WAVEUNITS = 'angstrom'
elif w_unit == 'Hz':
    WAVEUNITS = 'Hz'
else:
    raise Exception('Stellar units are not correct. Pick um, nm, cm, hz, or Angs')

picaso/picaso/justdoit.py

Lines 689 to 697 in 0737df3

if f_unit == 'Jy':
    FLUXUNITS = 'jy'
elif f_unit == 'FLAM':
    FLUXUNITS = 'FLAM'
elif f_unit == 'erg/cm2/s/Hz':
    flux = flux*1e23
    FLUXUNITS = 'jy'
else:
    raise Exception('Stellar units are not correct. Pick FLAM or Jy or erg/cm2/s/Hz')

Strange surface emission with rocky planets

I've been using PICASO to create spectra for Earth-like planets, and have discovered some strange results for thermal emission spectra (the transmission spectra are totally fine). I was finding that although the surface temperature of my Earth-Sun model is 288 K, emission from the planet was being calculated to be much higher than even a 500 K blackbody at some wavelengths. The code I used (saved as a text file so I could upload it) is here:
PICASO_emissions_Earth-Sun_py.txt
And the atmospheric profile text file that the code needs to run is here:
atmos_output_Sun-Earth.txt

I found that the high emission that went over blackbody emission for 288 K went away when I set b1 values for the surface (b1[-1]) equal to 0 in get_thermal_1d. The problem also went away when I let b1 be calculated as usual, but then commented out the contributions from b1[-1] in both b_surface and flux_plus in that same function.

A senior researcher in my group, João Mendonça, ended up writing his own simpler version of get_thermal_1d that was basically what they did in appendix A of Lacis & Oinas (1991)(available here: https://pubs.giss.nasa.gov/docs/1991/1991_Lacis_la01100t.pdf). It's a lot less complicated than the scattering stuff in Toon et al (1989).

In this plot compare_PICASOs.pdf you can see a comparison of spectra created by the original PICASO, PICASO with the surface layer b1 values set to 0 (b1[-1] = 0), PICASO with the b1[-1] contributions from b_surface and flux_plus set to 0, and then João's version of the code. I also have another version of that plot compare_PICASOs_noorig.pdf without the original PICASO spectrum so you can see that the other 3 solutions I used gave back essentially the same emission spectrum (the actual spectra differ very slightly numerically, but you can't even see it in the plots).

It looks like I can use João's version of get_thermal_1d within PICASO to fix this problem, but I'm still confused as to why the original PICASO had this issue with unrealistically high emission values at some wavelengths and why zeroing out the b1 addition to surface flux fixes it.

Thanks for your help!
Thea

Add input/output capabilities for xarray

With the new model data uniformity standards, we need to add capabilities for 1D inputs to be in xarray. This includes

  • atmosphere 1d input
  • cloud 1d input
  • all spectra output (e.g. in addition to as_dict, we should add as_xarray to comply with output standards)
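A sketch of what an as_xarray-style output could look like (the function name and attribute choices are assumptions, not the picaso API):

```python
import numpy as np
import xarray as xr

def spectrum_as_xarray(wno, fpfs):
    """Wrap a wavenumber grid and a spectrum in an xarray Dataset,
    carrying units explicitly as attributes."""
    return xr.Dataset(
        data_vars={'fpfs': (['wavenumber'], fpfs, {'units': 'dimensionless'})},
        coords={'wavenumber': (['wavenumber'], wno, {'units': 'cm^-1'})},
    )
```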

Add interpolation feature to sonora()

Add a feature to interpolate to an exact Teff and g for the function sonora.

Currently if a user runs case.sonora(teff) for a certain logg, it will find the nearest neighbor.

We should be able to interpolate so that a user can retrieve a PT profile at an exact logg, and teff point.

The new function should have a method keyword which can be either nearest_neighbor or interp.
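A pure-numpy sketch of what the interp option could do (function name hypothetical): bilinearly interpolate a stacked grid of profiles to an exact (Teff, logg) point.

```python
import numpy as np

def interp_pt_profile(teff, logg, grid_teffs, grid_loggs, grid_temps):
    """Bilinear interpolation of a (n_teff, n_logg, n_layer) array of
    temperature profiles to an exact (teff, logg) point."""
    grid_teffs = np.asarray(grid_teffs, dtype=float)
    grid_loggs = np.asarray(grid_loggs, dtype=float)
    i = np.clip(np.searchsorted(grid_teffs, teff) - 1, 0, len(grid_teffs) - 2)
    j = np.clip(np.searchsorted(grid_loggs, logg) - 1, 0, len(grid_loggs) - 2)
    # fractional position inside the bounding grid cell
    ft = (teff - grid_teffs[i]) / (grid_teffs[i + 1] - grid_teffs[i])
    fg = (logg - grid_loggs[j]) / (grid_loggs[j + 1] - grid_loggs[j])
    return ((1 - ft) * (1 - fg) * grid_temps[i, j]
            + ft * (1 - fg) * grid_temps[i + 1, j]
            + (1 - ft) * fg * grid_temps[i, j + 1]
            + ft * fg * grid_temps[i + 1, j + 1])
```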

Negative metallicity in visscher_grid

The grid contains files for a negative metallicity (Fe/H = -0.3, labeled as m0.3), but the function that calls the grid only considers positive metallicities. If these grids are reliable, it would be nice to modify the function to use them. You could modify the function with these two lines:

fehs = np.array([-0.3,0.0,0.3,0.5,0.7,1.0,1.5,1.7,2.0])
filename = (os.path.join(__refdata__,'chemistry','visscher_grid', f'2015_06_1060grid_feh_{str_fe}_co_{str_co}.txt')).replace('_-','m')

Adding surfaces at different pressure levels

I was wondering if it is possible to specify a fixed pressure grid and assign the surface of the planet to a particular pressure level in the grid like it is done for clouds?

Check for profiles doesn't catch both issues

In the check for the sonora profiles, this AND needs to be an OR. If people unzipped all the files, the check is not triggered and the necessary exception is never thrown.

if ((len(flist)<300) & ('t400g3160nc_m0.0.cmp.gz' not in flist)):

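With the suggested AND→OR applied, the check would read (a sketch, shown here with Python's `or`):

```python
def profiles_missing(flist):
    """Flag the sonora profile folder as incomplete if EITHER symptom
    is present; the current code requires both, so it misses the case
    where users unzipped every file (count is fine, .gz name is gone)."""
    return (len(flist) < 300) or ('t400g3160nc_m0.0.cmp.gz' not in flist)
```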

Add brightness temperature function to justplotit

Incorporate @sagnickm's brightness temperature function, but with picaso output units (erg/cm2/s/cm and cm):

def brightnessT(lam, flux):
    """
    lam : wavelength in microns
    flux : W/m^2/micron
    """
    a = 1.43877735e-2        # hc/k in m K
    hc2 = 2*5.95521476e-17   # 2*h*c^2 in m^4 kg / s^3
    # flux is W/m^2/micron, so multiply by 1e6 to make it W/m^2/m
    flux = flux*1e6
    # convert wavelength from microns to m
    lam = lam*1e-6

    return (a/lam)/np.log(1+(hc2/flux/lam**5))
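A cgs version matched to picaso's output units (flux in erg/cm2/s/cm, wavelength in cm) only needs the two constants re-expressed; this is a sketch inverting the Planck function the same way as above:

```python
import numpy as np

def brightness_temperature_cgs(lam_cm, flux_cgs):
    """
    lam_cm   : wavelength in cm
    flux_cgs : flux in erg/cm^2/s/cm (same per-steradian convention as above)
    """
    a = 1.43877735           # hc/k in cm K
    hc2 = 2 * 5.95521476e-6  # 2*h*c^2 in erg cm^2 / s
    return (a / lam_cm) / np.log(1 + hc2 / (flux_cgs * lam_cm**5))
```

A round trip against the Planck function recovers the input temperature exactly, which is a useful self-test for the constants.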

Fpfs_reflected and fpfs_thermal

if (not isinstance(returns['fpfs_reflected'],str)):

The above line of code attempts to catch the case where fpfs_reflected is a string. It is, in fact, a string inside a list, so the check passes and an uninformative exception is thrown later instead.

Suggestion: don't populate the fpfs quantities without semi_major, or throw a usable exception.
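The mismatch is easy to reproduce (the message text below is hypothetical):

```python
# fpfs_reflected comes back as a string wrapped in a list, so the
# isinstance(..., str) check passes and the failure goes uncaught
returns = {'fpfs_reflected': ['Error: no semi_major axis supplied']}
assert not isinstance(returns['fpfs_reflected'], str)  # check silently passes

# a check that also catches the list-wrapped string (sketch)
val = returns['fpfs_reflected']
is_error = isinstance(val, str) or (
    isinstance(val, list) and any(isinstance(v, str) for v in val))
assert is_error
```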

input phase angle can be anything

Yes, I know the docs say it wants radians, but sometimes there aren't enough mods in your equations and you end up calling case.phase_angle(400, num_gangle=8, num_tangle=8) and wondering why the output looks funny.
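A guard like this (a sketch, not the current API) would catch degrees-vs-radians mistakes at the door:

```python
import numpy as np

def validate_phase_angle(phase):
    """Reject phase angles outside [0, 2*pi); a value like 400 almost
    certainly means degrees were passed where radians were expected."""
    if not (0.0 <= phase < 2.0 * np.pi):
        raise ValueError(
            'phase_angle expects radians in [0, 2*pi); got {}'.format(phase))
    return phase
```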
