evolocity.tl.velocity_embedding

evolocity.tl.velocity_embedding(data, basis=None, vkey='velocity', scale=1, self_transitions=True, use_negative_cosines=True, direct_pca_projection=None, retain_scale=False, autoscale=True, all_comps=True, T=None, copy=False)

Projects the velocities into any embedding.

Given the normalized differences of the embedding positions, \(\tilde \delta_{ij} = \frac{x_j-x_i}{\left\lVert x_j-x_i \right\rVert}\), the projections are obtained as expected displacements with respect to the transition matrix \(\tilde \pi_{ij}\) as

\[\tilde \nu_i = E_{\tilde \pi_{i\cdot}} [\tilde \delta_{i \cdot}] = \sum_{j \neq i} \left( \tilde \pi_{ij} - \frac1n \right) \tilde \delta_{ij}.\]
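
For intuition, the sketch below computes this expected displacement densely with NumPy. It is illustrative only: the function and variable names (project_velocities, T, X_emb) are hypothetical, and the actual implementation operates on the sparse velocity graph and respects the options listed under Parameters.

```python
import numpy as np

def project_velocities(T, X_emb):
    """Expected displacements of each point under a dense transition matrix T."""
    n = X_emb.shape[0]
    V_emb = np.zeros_like(X_emb, dtype=float)
    for i in range(n):
        delta = X_emb - X_emb[i]              # x_j - x_i for every j
        norms = np.linalg.norm(delta, axis=1)
        norms[i] = 1.0                        # avoid division by zero at j = i
        delta_tilde = delta / norms[:, None]  # normalized differences
        weights = T[i] - 1.0 / n              # pi_ij - 1/n
        weights[i] = 0.0                      # the sum runs over j != i
        V_emb[i] = weights @ delta_tilde      # expected displacement nu_i
    return V_emb

# Toy example: 50 points in 2D with a random row-stochastic transition matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
P = rng.random((50, 50))
P /= P.sum(axis=1, keepdims=True)
V = project_velocities(P, X)                  # one 2D arrow per point
```
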
Parameters
data : AnnData

Annotated data matrix.

basis : str (default: ‘tsne’)

Which embedding to use.

vkey : str (default: ‘velocity’)

Name of velocity estimates to be used.

scale : int (default: 1)

Scale parameter of the Gaussian kernel for the transition matrix.

self_transitions : bool (default: True)

Whether to allow self-transitions, based on the confidence of transitioning to neighboring nodes.

use_negative_cosines : bool (default: True)

Whether to project node-to-node transitions with negative cosine similarities in the negative/opposite direction.

direct_pca_projection : bool (default: None)

Whether to directly project the velocities into PCA space, thus skipping the velocity graph.

retain_scale : bool (default: False)

Whether to retain the scale from the high-dimensional space in the embedding.

autoscale : bool (default: True)

Whether to scale the embedded velocities by a scalar multiplier so that the arrows in the embedding are properly scaled.

all_comps : bool (default: True)

Whether to compute the velocities on all embedding components.

T : csr_matrix (default: None)

Allows the user to directly pass a transition matrix.

copy : bool (default: False)

Return a copy instead of writing to adata.

Returns

Returns or updates adata with the attributes:

velocity_basis : .obsm

Coordinates of the velocity projection on the embedding.
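
A minimal usage sketch, assuming an AnnData object adata that already carries sequence features and a precomputed velocity graph (e.g., built with evolocity.tl.velocity_graph); the Scanpy preprocessing calls are standard but may differ in a real workflow.

```python
import scanpy as sc
import evolocity as evo

# `adata` is assumed to be an AnnData object with sequence embeddings and a
# velocity graph already computed, e.g. via evo.tl.velocity_graph(adata).

sc.pp.neighbors(adata)  # k-NN graph used by UMAP
sc.tl.umap(adata)       # stores coordinates in adata.obsm['X_umap']

# Project the velocities onto the UMAP embedding; the projected coordinates
# are written to adata.obsm for basis='umap'.
evo.tl.velocity_embedding(adata, basis='umap')
```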