
eigencontours's People

Contributors

dnjs3594


eigencontours's Issues

Question on F measure computation

Hi,

It seems that the F measure is computed using batched_f_measure, provided by davisinteractive:

f_measure = batched_f_measure(mask[np.newaxis],
                              mask_ap[np.newaxis],
                              average_over_objects=True,
                              nb_objects=None,
                              bound_th=0.008)
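For context, the official DAVIS evaluation code (which davisinteractive follows, as far as I can tell) converts a fractional bound_th into a pixel tolerance proportional to the image diagonal. A minimal sketch of that rule, under that assumption:

```python
import numpy as np

def boundary_tolerance_px(mask_shape, bound_th=0.008):
    # Fractional thresholds are scaled by the image diagonal and rounded up,
    # following the DAVIS boundary-F convention (assumption: davisinteractive
    # applies the same rule as the official DAVIS evaluation code).
    if bound_th >= 1:
        return int(bound_th)
    return int(np.ceil(bound_th * np.linalg.norm(mask_shape)))

# For a 640x640 mask the tolerance works out to ~8 px, so boundary points
# can still be matched even when the two contours differ visibly.
print(boundary_tolerance_px((640, 640)))  # -> 8
```

With such a generous pixel tolerance on coarse masks, high F scores for very rough approximations would not be surprising.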

But when I run the code, my F-measure results are surprisingly high. For M = 8, the average F score of my experiments on COCO 2017 is around 0.84, much higher than the roughly 0.4 reported in the paper (see Fig. 8(c)).

In particular, the metric is very high even with M = 1. As a specific example, the following images (left: the approximate mask with cfg.dim=1; right: the ground-truth mask) have a computed batched_f_measure of 1.0. I think something is wrong, since the two masks are so different.

[image: approximate mask (cfg.dim=1) vs. ground-truth mask]

More related questions:

  1. I take the SVD of the full contour matrix computed over the COCO 2017 train set, whose size is 360 x 557905, and during the convert step I apply it to the COCO 2017 val set. Although the paper does not describe how the train/val splits are used, I assume this is the correct way. Can you please confirm?
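For reference, this is the pipeline I am using, as a minimal NumPy sketch (function names are illustrative, not the repository's API):

```python
import numpy as np

def fit_eigencontours(R_train):
    """R_train: (360, N) matrix of star-convex contours, one radial
    descriptor per column. Returns the left singular vectors U."""
    U, S, Vt = np.linalg.svd(R_train, full_matrices=False)
    return U  # columns of U are the eigencontours

def approximate(r, U, M):
    """Project a single 360-d contour r onto the first M eigencontours
    and reconstruct it from the resulting M coefficients."""
    U_M = U[:, :M]
    return U_M @ (U_M.T @ r)

# Illustrative use with random data standing in for COCO contours:
rng = np.random.default_rng(0)
R_train = rng.random((360, 1000))   # train split: fit the basis here
U = fit_eigencontours(R_train)
r_val = rng.random(360)             # val split: only projected, never fit
r_hat = approximate(r_val, U, M=8)
```

The key point is that the basis U is fit on train only, and val contours are merely projected onto it.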

  2. To compute the F measure, the paper says that "bipartite matching is performed between the boundary points of a ground-truth contour and its approximated version" and that the "F score is defined as the harmonic mean of the precision (P) and the recall (R) of the matching results". In the code, this is done by calling batched_f_measure from davisinteractive on the ground-truth and approximate masks. Is this equivalent to the procedure described in the paper?
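For concreteness, here is my understanding of the dilation-based matching used by the DAVIS evaluation code (a sketch, under the assumption that batched_f_measure follows this convention; it approximates bipartite matching rather than computing it exactly, since one boundary pixel may match several on the other side):

```python
import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

def boundary_f(pred, gt, tol_px=8):
    """DAVIS-style boundary F measure: boundary pixels are matched by
    dilating each boundary by tol_px, instead of an explicit one-to-one
    bipartite matching."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    b_pred = pred ^ binary_erosion(pred)   # one-pixel boundary rings
    b_gt = gt ^ binary_erosion(gt)
    if b_pred.sum() == 0 and b_gt.sum() == 0:
        return 1.0
    kernel = np.ones((2 * tol_px + 1, 2 * tol_px + 1), dtype=bool)
    # Precision: fraction of predicted boundary pixels near the GT boundary.
    precision = (b_pred & binary_dilation(b_gt, kernel)).sum() / max(b_pred.sum(), 1)
    # Recall: fraction of GT boundary pixels near the predicted boundary.
    recall = (b_gt & binary_dilation(b_pred, kernel)).sum() / max(b_gt.sum(), 1)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Two identical masks match perfectly:
m = np.zeros((50, 50), dtype=bool)
m[10:20, 10:20] = True
print(boundary_f(m, m))  # -> 1.0
```

If this is what the library computes, the dilation step is where it departs from the strict bipartite matching described in the paper.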

Looking forward to your reply. Thank you very much!

When will the rest of the code be released?

Very insightful work! I am wondering when the code for the KINS and SBD datasets will be released, as it has been for COCO.

Also, are there any plans to release code for instance segmentation by embedding eigencontours in YOLOv3?

Thank you so much!

Eigencontours only computed over 40k subsamples?

Hi, I find your work very interesting! I have a quick question. It seems that the eigencontours for COCO are computed over only 40,000 subsampled instances, based on the following line in the code:

idx = torch.linspace(0, l-1, 40000).type(torch.int64).cuda()

However, the paper says that "the proposed eigencontours are determined for all instances in all categories in a training dataset". Since 40k < 110k (the size of COCO), which is not "all instances", I wonder which is correct?
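For reference, the quoted line picks 40,000 evenly spaced columns of the contour matrix before the SVD; an equivalent NumPy sketch (illustrative only, not the repository's code):

```python
import numpy as np

def subsample_columns(R, n=40000):
    """Pick n evenly spaced columns from the (360, l) contour matrix R,
    mirroring torch.linspace(0, l - 1, n).type(torch.int64)."""
    l = R.shape[1]
    idx = np.linspace(0, l - 1, min(n, l)).astype(np.int64)
    return R[:, idx]

# Small demo: 5 evenly spaced columns out of 10.
R = np.arange(360 * 10).reshape(360, 10).astype(float)
print(subsample_columns(R, 5).shape)  # (360, 5)
```

So the SVD sees a uniform subsample of the training contours, not every instance.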

Thanks!

About visualizing U

def visualization_U(self):
    self.U = load_pickle("data_pickle/U")
    for k in range(self.U.shape[1]):
        if k == 6:
            break
        temp = np.full((640, 640, 3), fill_value=255, dtype=np.uint8)
        self.visualize.show['candidates'] = np.copy(temp)
        U = self.U[:, k:k + 1] * 3000
        dc = torch.full((360, 1), fill_value=180).type(torch.float32).cuda()
        xy = self.r_coord_xy * U
        xy_dc = self.r_coord_xy * dc
        polygon_pts = self.cen + xy
        polygon_pts_dc = self.cen + xy_dc
        allow_pts = torch.cat((self.cen, polygon_pts), dim=1)
        self.visualize.draw_polyline_cv(data=polygon_pts.cpu().numpy(), name='candidates',
                                        ref_name='candidates', color=(0, 0, 255), s=15)
        dir_name = 'display_U_red/'
        file_name = 'U_' + str(k + 1) + '.jpg'
        self.visualize.display_saveimg(dir_name=dir_name,
                                       file_name=file_name,
                                       list=['candidates'])

In line 107: U = self.U[:, k:k + 1] * 3000
What is the reason for multiplying by 3000 here? (I drew the image as centre rays on my own dataset based on this code, and I am wondering what could be causing the error.)
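For what it's worth, one plausible explanation (an assumption on my part, not confirmed by the authors): the columns of U returned by an SVD are unit-norm, so each of the 360 entries is on the order of 1/sqrt(360) ≈ 0.05, and as raw radial distances the rays would collapse to a few pixels around the centroid. Multiplying by 3000 simply scales them into a visible range on the 640x640 canvas. A quick check with random data:

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.random((360, 1000))            # stand-in for a contour matrix
U, _, _ = np.linalg.svd(R, full_matrices=False)

# Each column of U has unit norm, so typical entry magnitudes
# are around 1/sqrt(360) ~ 0.05 before any scaling.
print(np.linalg.norm(U[:, 0]))
print(np.abs(U[:, 0]).mean())
print(np.abs(U[:, 0] * 3000).max())    # scaled into a pixel-visible range
```

If your dataset's canvas size or contour dimensionality differs, a fixed factor of 3000 may over- or under-shoot, which could explain the drawing error.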
