lmurmann / multi_illumination
License: MIT License
The paper "A Dataset of Multi-Illumination Images in the Wild" describes the HDR process as:
After merging exposures, we normalize the brightness of the HDR image by matching the intensity of the diffuse gray sphere. The gray sphere also serves as a reference point for white balance.
What exact calculation is performed on the HDR pixel values to "match the intensity of the diffuse gray sphere"? I also found negative values in the HDR crops of the chrome and gray probes. How could those negative intensities happen?
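For reference, one plausible reading of that normalization step can be sketched as follows. The target gray level and the sphere mask here are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Assumed target reflectance for the diffuse gray sphere (hypothetical value).
TARGET_GRAY = 0.5

def normalize_by_gray_sphere(hdr, sphere_mask):
    """Scale an HDR image so the gray sphere's mean intensity equals TARGET_GRAY.

    hdr:         float array (H, W, 3), merged HDR radiance
    sphere_mask: bool array (H, W), True on the diffuse gray sphere pixels
    """
    mean_intensity = hdr[sphere_mask].mean()
    return hdr * (TARGET_GRAY / mean_intensity)

# Toy example: a 4x4 image whose "sphere" pixels average 2.0
img = np.full((4, 4, 3), 2.0)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
out = normalize_by_gray_sphere(img, mask)
print(out[1, 1, 0])  # 0.5
```

As for the negative values: in typical raw pipelines, black-level subtraction and per-channel white-balance scaling can push dark or noisy pixels below zero, though only the authors can confirm that this is what happens here.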
Hi Lukas,
First of all, thanks for releasing such a great dataset! It's clear to me that a lot of effort here was made to make it easy to use.
Just posting an issue here to let you know that the size of the material maps and the image sizes are off by 1 when using MIP=5 (e.g. the fetched material map is 124x187 when the image is 125x187). Reproduced using the following snippet, which throws a corresponding shape mismatch error:
import multilum as ml

scenes = ml.test_scenes()
S = ml.query_images(scenes, mip=5, hdr=False)
M = ml.query_materials(scenes, mip=5)
For now I've resorted to downsampling the higher-resolution material map with nearest-neighbor interpolation as a temporary stopgap. If this is incorrect, please let me know!
import numpy as np
from PIL import Image

M2 = ml.query_materials(scenes, mip=2)
M5 = []
for m in M2:
    # Note: PIL's resize size argument is (width, height), so (187, 125)
    # yields a 125x187 array that matches the mip=5 image shape.
    m5 = np.array(Image.fromarray(m).resize((187, 125), Image.NEAREST))
    M5.append(m5.reshape(1, 125, 187))
M5 = np.concatenate(M5, 0)
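For what it's worth, one plausible mechanism for the off-by-one (an assumption on my part, not verified against the repo) is a one-pixel difference in the base resolutions, which survives repeated floor-halving when MIP levels are built:

```python
def mip_size(n, level):
    # Assumed scheme: each MIP level halves the previous size with floor division.
    for _ in range(level):
        n //= 2
    return n

# A single-pixel difference at full resolution propagates down the pyramid:
print(mip_size(4000, 5))  # 125
print(mip_size(3999, 5))  # 124
```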
~Eric
@lmurmann Hello, I want to work on an illumination research topic. Could you let me know when you are going to release the model? Thank you very much!
Hi Lukas,
Thank you for creating this interesting dataset.
I was able to find the material map numbers from #4, but I was unable to find information about the mapping from these material annotations to the corresponding RGB color palette in the material segmentation maps. Could you please provide this information?
Thanks,
Mahesh
@lmurmann Hello, thanks for releasing the dataset and the code to evaluate the models.
Could you please provide some additional details about the training of the relighting model? As I understand it, for training you use both the input and the target images as HDR in the log domain, i.e., you preprocess these images before feeding them into the training pipeline. Furthermore, you use a loss function akin to
L(I, Î) = || \nabla I - \nabla Î ||_1,
where \nabla is the spatial gradient. Am I missing anything?
Thank you!
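For concreteness, that gradient L1 loss can be sketched in NumPy as below. The forward-difference scheme and the sum reduction are my assumptions; the authors' actual implementation may differ:

```python
import numpy as np

def grad_l1_loss(I, I_hat):
    """|| \nabla I - \nabla I_hat ||_1 using forward differences along x and y."""
    # Spatial gradients of the target and predicted images
    dIx, dIy = np.diff(I, axis=1), np.diff(I, axis=0)
    dJx, dJy = np.diff(I_hat, axis=1), np.diff(I_hat, axis=0)
    # Sum of absolute gradient differences over both directions
    return np.abs(dIx - dJx).sum() + np.abs(dIy - dJy).sum()

# Identical images have zero gradient loss
a = np.random.rand(8, 8)
print(grad_l1_loss(a, a))  # 0.0
```

If the inputs are HDR images mapped to the log domain first (e.g. `I = np.log(hdr + eps)`), the loss above would be applied to those log-domain images.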
Hi @lmurmann,
Thanks for providing the code for such nice work. I would like to train on a new dataset based on your model, but I do not see any training code in the repository. It would be really helpful if you could provide some information about how to train on new data.
Thank you and best regards,
I couldn't find this in the repository. Can you share the exact mapping from the numerical classes to the actual annotations for the material maps?