License: Apache License 2.0
Could you share the benchmark result figures for the lightweight model? Many thanks.
When will the lightweight pretrained models be released? Many thanks!!
Replacing Softmax with ReLU is of strong interest to me because of the sparsity it provides in the global attention mechanism. Could the authors please share the visualisation code for Figure 6 in the paper, for our learning?
When I want to train with a single GPU, I can't find "options/DLGSANet/train_ClassicSR_Large_90C6G4B_DLGSANet_SRx4_scratch_img_size_48_lr5e_4.yml".
What should I do?
Thank you for the nice work. I can't find the code for DLGSANet. Can you tell me which file it is in?
Hi,
I am wondering if you could share the arch/model?
Alternatively, I'm trying to implement it myself and have run into some problems. If you can help me, that would be great.
If you could share an ONNX model file (saved, for example, with `onnx.save(onnx.shape_inference.infer_shapes(onnx_model), model_file)`), it would help me inspect the architecture in Netron (netron.app).
In any case, I'm still looking forward to the official architecture's release.
Thanks!
Hello,
I can see there are files for inference with BasicSR, MSRNet, ESRGAN, etc., but I don't see which file to use for inference with DLGSANet. Could you please tell us?
Thanks a lot.
Sorry to bother you, but in which path will the trained model be stored? There is no experiments folder.
Good morning,
First of all, thank you for sharing this great work! I'd like to use it for a talking-head project at the company I'm currently working for.
I tried to run the notebook for BasicSR inference but got the following error:
```
/usr/local/lib/python3.10/dist-packages/torchvision/transforms/functional_tensor.py:5: UserWarning: The torchvision.transforms.functional_tensor module is deprecated in 0.15 and will be removed in 0.17. Please don't rely on it. You probably just need to use APIs in torchvision.transforms.functional or in torchvision.transforms.v2.functional.
  warnings.warn(
Please install facexlib: pip install facexlib
Traceback (most recent call last):
  File "/content/BasicSR/inference/inference_dfdnet.py", line 114, in <module>
    net = DFDNet(64, dict_path=args.dict_path).to(device)
  File "/content/BasicSR/basicsr/archs/dfdnet_arch.py", line 77, in __init__
    self.dict = torch.load(dict_path)
  File "/usr/local/lib/python3.10/dist-packages/torch/serialization.py", line 815, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/usr/local/lib/python3.10/dist-packages/torch/serialization.py", line 1033, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, '<'.
```
I also wanted to ask a question: why do you have scripts for DFDNet inference? And where exactly is the script to test your own model, DLGSANet?
Thank you very much.
Best regards,
Geoffrey.
Have you tested the PSNR and SSIM values without using TLC? How much performance gain can TLC bring?
Hi authors,
Thank you very much for your marvellous work and for open-sourcing it! I am wondering where I can get the config files for the tiny/small models. Thank you~
The test data is missing; there is no data folder. Can you please help me locate it?
I found that when modifying the arch file of DLGSANet, the model can still be trained even after dlgsanet_arch.py is removed from basicsr/archs. Is this due to the BasicSR framework's registration mechanism? Why does this happen? Anyone who sees this, please help!
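One plausible explanation: BasicSR registers architectures by class name in a registry at import time, so the name can still resolve if any other imported copy of the code (e.g. a pip-installed basicsr in site-packages) has registered it. A minimal sketch of such a registry (illustrative names, not the exact BasicSR internals):

```python
# Minimal sketch of a BasicSR-style registry.
class Registry:
    def __init__(self, name):
        self.name = name
        self._obj_map = {}

    def register(self):
        # Decorator that stores the class under its own name.
        def deco(cls):
            self._obj_map[cls.__name__] = cls
            return cls
        return deco

    def get(self, name):
        return self._obj_map[name]

ARCH_REGISTRY = Registry("arch")

@ARCH_REGISTRY.register()
class DLGSANet:  # placeholder; the real class lives in dlgsanet_arch.py
    pass

# Lookup is purely by string name, so it succeeds as long as *some*
# module registering 'DLGSANet' was imported -- even after the local
# dlgsanet_arch.py file is deleted.
net_cls = ARCH_REGISTRY.get("DLGSANet")
```

So if training still works after deleting the local file, it is worth checking whether an installed copy of the package is being imported instead of the working tree.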
Thank you for the nice work. When will you release the code?
Hi, @NeonLeexiang,
First off, I'd like to thank you for your remarkable work. Your approach of using multi-head sliding windows as an alternative to shifted windows and sparse attention has truly been an inspiration.
However, I've noticed a potential oversight in the speed test.
In the speed test code [Link], it seems there is no code that switches the model into test mode. This is vital, especially since your model leverages TLC, which performs local ensembles of Restormer's attention results, incurring a significant number of loops and additional computational overhead.
I've benchmarked your model on my server (RTX 3090) in both training and testing modes. Here are the results:
Scale ×4, SwinIR vs. DLGSANet, train vs. test mode:
SwinIR (Train Mode):
avg = 275.6369
SwinIR (Test Mode):
avg = 259.4968
DLGSANet (Train Mode):
avg = 186.1960
DLGSANet (Test Mode):
avg = 291.5465
The train-mode results almost match the numbers reported in your paper.
It might be beneficial to re-evaluate this.
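For reference, a minimal sketch of the mode switch used when benchmarking (the model and input below are placeholders, not the actual DLGSANet/SwinIR instances):

```python
import time
import torch

# Placeholder model; substitute the network under test.
model = torch.nn.Conv2d(3, 3, 3, padding=1)
x = torch.randn(1, 3, 256, 256)

def bench(model, x, iters=10):
    model.eval()  # test mode: affects any train/eval-dependent branches,
                  # e.g. TLC-style local-ensemble attention at inference
    with torch.no_grad():
        for _ in range(3):   # warm-up runs
            model(x)
        start = time.perf_counter()
        for _ in range(iters):
            model(x)
    return (time.perf_counter() - start) / iters * 1000.0  # ms per forward

avg_ms = bench(model, x)
```

Without the `model.eval()` call, any `self.training`-conditional code path runs in its training configuration, which can distort timing comparisons.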
Please let me know if there are any discrepancies or if further clarifications are needed.
Thank you for your time and dedication to this project.
Dear author,
I have a question about the PSNR and SSIM values of SwinIR. In your paper, the PSNR values of SwinIR on all datasets (like Set5, Set14, ...) differ from those in the original SwinIR paper, while the PSNR values of the common models (like RCAN, SAN, HAN) are the same in both papers for all datasets.
Hello, thank you for sharing!
May I ask what hardware you trained on, and how long training took in total?
Hi! Thanks for your work!
I am trying to test the inference time and peak memory of the light DLGSANet on my device. Could you please release the options for the light version?