
sctenifoldnet's People

Contributors

dependabot[bot], dosorio, guanxunli, jamesjcai


sctenifoldnet's Issues

null hypothesis with significant DE

Hello, thank you for developing and maintaining the tool!
I tried to simulate a situation like your outputH0 example with my own dataset (using the same input dataset as both X and Y). I am a bit puzzled by the result, as it returns a set of significant DE features (111 genes out of a dataset of 6,000 genes x 3,000 cells).
Do you have any idea why that is the case? I would have expected no differences, as you showed in the tutorial.
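For reference on the scale involved (my own back-of-the-envelope check in base R, not the package's code): if p-values were uniform under the null, an unadjusted 0.05 cutoff alone would flag roughly 5% of 6,000 genes by chance, so part of the question is whether the 111 hits survive multiple-testing adjustment.

```r
# Simulated p-values for 6000 null genes (stand-in values, not real data)
set.seed(1)
pvals <- runif(6000)

sum(pvals < 0.05)                    # roughly 300 genes pass unadjusted by chance
sum(p.adjust(pvals, "fdr") < 0.05)   # after FDR adjustment, essentially none
```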

diagonal of adjacency matrix is not zero any more after tensor decomposition

Dear dev,

In the initial outputs of network construction, the diagonal of the square adjacency matrix is 0, which makes sense.

However, after tensor decomposition, the diagonal of the returned consensus matrix is no longer zero. It is difficult for me to interpret the values on the diagonal, as they are produced by the CP reconstruction. These values seem to be used in the later manifoldAlignment step without being set back to 0 or otherwise preprocessed.

A non-zero value on the diagonal would normally suggest self-regulation. What do you think of the regulatory network that results from the tensor decomposition?

See an example taken from the package.

nCells <- 2000
nGenes <- 100
set.seed(1)
X <- rnbinom(n = nGenes * nCells, size = 20, prob = 0.98)
X <- round(X)  # rnbinom already returns integer counts; round() is a no-op here
X <- matrix(X, ncol = nCells)
rownames(X) <- c(paste0('ng', 1:90), paste0('mt-', 1:10))

# Computing 3 single-cell gene regulatory networks each one from a subsample of 500 cells
mnOutput <- makeNetworks(X = X,
                         nNet = 3, 
                         nCells = 500, 
                         nComp = 3, 
                         scaleScores = TRUE, 
                         symmetric = FALSE, 
                         q = 0.95
                         )

# Computing a K = 3 CANDECOMP/PARAFAC (CP) Tensor Decomposition
tdOutput <- tensorDecomposition(mnOutput, K = 3, maxError = 1e5, maxIter = 1e3)

# Diagonals are all zero for every matrix in mnOutput
diag(as.matrix(mnOutput[[1]]))
  ng1   ng2   ng3   ng4   ng5   ng6   ng7   ng8   ng9  ng10  ng11  ng12  ng13 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 ng14  ng15  ng16  ng17  ng18  ng19  ng20  ng21  ng22  ng23  ng24  ng25  ng26 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 ng27  ng28  ng29  ng30  ng31  ng32  ng33  ng34  ng35  ng36  ng37  ng38  ng39 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 ng40  ng41  ng42  ng43  ng44  ng45  ng46  ng47  ng48  ng49  ng50  ng51  ng52 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 ng53  ng54  ng55  ng56  ng57  ng58  ng59  ng60  ng61  ng62  ng63  ng64  ng65 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 ng66  ng67  ng68  ng69  ng70  ng71  ng72  ng73  ng74  ng75  ng76  ng77  ng78 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 ng79  ng80  ng81  ng82  ng83  ng84  ng85  ng86  ng87  ng88  ng89  ng90  mt-1 
    0     0     0     0     0     0     0     0     0     0     0     0     0 
 mt-2  mt-3  mt-4  mt-5  mt-6  mt-7  mt-8  mt-9 mt-10 
    0     0     0     0     0     0     0     0     0 

#- now the values are non-zero.
diag(as.matrix(tdOutput[[1]]))
  ng1   ng2   ng3   ng4   ng5   ng6   ng7   ng8   ng9  ng10  ng11  ng12  ng13 
  0.2   0.0   0.0   0.3   0.0   0.0   0.2   0.3   0.0   0.0   0.0   0.0   0.1 
 ng14  ng15  ng16  ng17  ng18  ng19  ng20  ng21  ng22  ng23  ng24  ng25  ng26 
  0.0   0.0   0.0   0.0   0.1   0.0   0.1   0.0   0.1   0.0   0.0   0.0   0.0 
 ng27  ng28  ng29  ng30  ng31  ng32  ng33  ng34  ng35  ng36  ng37  ng38  ng39 
  0.0   0.0   0.0   0.1   0.0   0.1   0.2   0.0   0.0   0.0   0.0   0.0   0.4 
 ng40  ng41  ng42  ng43  ng44  ng45  ng46  ng47  ng48  ng49  ng50  ng51  ng52 
  0.0   0.0   0.1   0.3   0.0   0.3   0.2   0.1   0.0   0.0   0.0   0.2   0.3 
 ng53  ng54  ng55  ng56  ng57  ng58  ng59  ng60  ng61  ng62  ng63  ng64  ng65 
  0.4   0.0   0.3   0.0   0.0   0.0   0.0   0.0   0.5   0.0   0.0   0.0   0.0 
 ng66  ng67  ng68  ng69  ng70  ng71  ng72  ng73  ng74  ng75  ng76  ng77  ng78 
  0.0   0.0   0.0   0.0   0.0   0.0   0.2   0.0   0.0   0.3   0.0   0.0   0.0 
 ng79  ng80  ng81  ng82  ng83  ng84  ng85  ng86  ng87  ng88  ng89  ng90  mt-1 
  0.0   0.0   0.0   0.0   0.0   0.0   0.4   0.1   0.3   0.2   0.0   0.0   1.0 
 mt-2  mt-3  mt-4  mt-5  mt-6  mt-7  mt-8  mt-9 mt-10 
  0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.4   0.3 
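
One possible workaround (my own suggestion, not something the package does): zero out the diagonal of the consensus matrix before it goes into manifoldAlignment, so the spurious self-loops are dropped. A minimal base-R sketch on a stand-in matrix:

```r
# Stand-in for as.matrix(tdOutput[[1]]); in practice the real consensus
# matrix from tensorDecomposition() would be used here.
set.seed(1)
tX <- matrix(round(runif(25), 1), nrow = 5, ncol = 5)

diag(tX) <- 0        # drop the CP-induced self-regulation terms
all(diag(tX) == 0)   # TRUE
```

If the sparse representation is needed afterwards, `Matrix::Matrix(tX, sparse = TRUE)` should restore it.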

Why is a 4-D array used in cp?

Dear scTenifoldNet dev,

Thanks very much for the interesting method. This is the first time I have learned about low-rank tensor approximation, and I would like to try it in my work.

As I am new to tensors, I was confused by the 4-D array used in the rTensor::cp step. Since N * N * T (N is the number of genes, T is the number of subsamples) is a 3-D array, why is a 4-D array used in the implementation (example code taken from CRAN 1.0.0)?

Here is the code from the script tensorDecomposition.R:

#- initialize a 4-D array whose 3rd dimension has size 1
  tensorX <- array(data = 0, dim = c(nGenes,nGenes,1,nNet))
  if(!is.null(yList)){
    tensorY <- array(data = 0, dim = c(nGenes,nGenes,1,nNet))
  }
  

#- Each network's matrix fills one slice along the last dimension.
  for(i in seq_len(nNet)){
    tempX <- matrix(0, nGenes, nGenes)
    rownames(tempX) <- colnames(tempX) <- sGenes
    temp <- as.matrix(xList[[i]])
    tGenes <- sGenes[sGenes %in% rownames(temp)]
    tempX[tGenes,tGenes] <- temp[tGenes,tGenes]
    tensorX[,,,i] <- tempX
    
    if(!is.null(yList)){
      tempY <- matrix(0, nGenes, nGenes)
      rownames(tempY) <- colnames(tempY) <- sGenes
      temp <- as.matrix(yList[[i]])
      tGenes <- sGenes[sGenes %in% rownames(temp)]
      tempY[tGenes,tGenes] <- temp[tGenes,tGenes]
      tensorY[,,,i] <- tempY
    }
    
  }

  tensorX <- rTensor::as.tensor(tensorX)
  set.seed(1)
  tensorX <- rTensor::cp(tnsr = tensorX, num_components = K, max_iter = maxIter, tol = maxError)
  tX <- tensorX$est@data[,,,1]
  for(i in seq_len(nNet)[-1]){
    tX <- tX +  tensorX$est@data[,,,i]
  }
  tX <- tX/nNet
  tX <- tX/max(abs(tX))
  tX <- round(tX,1)
  tX <- as(tX, 'dgCMatrix')
  rownames(tX) <- colnames(tX) <- sGenes

Actually, I have tried tensorDecomposition with a 3-D array, i.e., simply removing the 3rd dimension; I have also tried replacing the "1" in the 3rd dimension with other values. In the former scenario I got results different from your code's, and in the latter case there was an error:

Error in as(tX, "dgCMatrix") :
no method or default for coercing “array” to “dgCMatrix”

Thanks in advance.
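
For what it's worth, a base-R illustration (my own, not the package's code) of the slicing behaviour that seems to explain the error: with the singleton 3rd dimension, a slice along the last dimension drops to a plain N x N matrix, but with the 3rd dimension enlarged the same slice stays a 3-D array, which is what the `as(..., 'dgCMatrix')` coercion rejects.

```r
N <- 5
a4 <- array(0, dim = c(N, N, 1, 3))  # the shape the package uses
is.matrix(a4[, , , 1])               # TRUE: length-1 dims are dropped, leaving N x N
dim(a4[, , , 1])                     # c(5, 5)

b4 <- array(0, dim = c(N, N, 2, 3))  # 3rd dimension enlarged to 2
is.matrix(b4[, , , 1])               # FALSE: the slice is still a 3-D array
dim(b4[, , , 1])                     # c(5, 5, 2)
```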

Pseudotime based subsampling

Hi,

I tried scTenifoldNet and it gives interesting results. Is there already an R implementation of the pseudotime-guided subsampling you described in your preprint?

All the best,
Kevin
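
In case it is useful, here is a minimal sketch of what I have in mind (my own code, assuming a per-cell pseudotime vector is available; `pseudotimeSubsample` is a hypothetical helper name, not a package function): order cells along the trajectory and draw each subsample from a contiguous window rather than uniformly at random.

```r
pseudotimeSubsample <- function(pseudotime, nNet = 10, nCells = 500) {
  ord <- order(pseudotime)  # cell indices sorted along the trajectory
  # Evenly spaced window starts covering the full trajectory
  starts <- round(seq(1, length(ord) - nCells + 1, length.out = nNet))
  lapply(starts, function(s) ord[s:(s + nCells - 1)])
}

# Example: 2000 cells with synthetic pseudotime values
set.seed(1)
pt <- runif(2000)
idxList <- pseudotimeSubsample(pt, nNet = 3, nCells = 500)
lengths(idxList)  # each subsample contains 500 cell indices
```

Each element of `idxList` could then be used to subset the expression matrix before network construction, in place of the random subsampling in makeNetworks.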
