Two salient limitations have long hindered the relevance of optimal transport (OT) methods to machine learning. First, the computational cost of standard sample-based solvers (when used on batches of samples) is prohibitive. Second, the mass conservation constraint makes OT solvers too rigid in practice: because they must match \textit{all} points from both measures, their output can be heavily influenced by outliers. A flurry of recent works has addressed these computational and modeling limitations, but has resulted in two separate strains of methods. On the computational side, the outlook was much improved by entropic regularization, and more recent linear-time \textit{low-rank} solvers hold the promise of scaling OT up further. On the modeling side, the rigidity of mass conservation has been eased for entropy-regularized OT through unbalanced variants that penalize couplings whose marginals deviate from those specified by the source and target distributions. The goal of this paper is to merge these two strains, low-rank and unbalanced, to deliver solvers that are both scalable and versatile. We propose custom algorithms to implement these extensions for the linear OT problem and its fused Gromov-Wasserstein generalization, and demonstrate their practical relevance on challenging spatial transcriptomics matching problems. These algorithms are implemented in the ott-jax toolbox.
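To make the intended usage concrete, the sketch below shows how an unbalanced, low-rank linear OT solve might be set up with ott-jax. This is a hypothetical illustration rather than the paper's own code: the module paths (\texttt{ott.problems.linear.linear\_problem}, \texttt{ott.solvers.linear.sinkhorn\_lr}), the \texttt{rank} argument, and the use of \texttt{tau\_a}/\texttt{tau\_b} to relax the marginal constraints reflect one recent layout of the toolbox and may differ from the interface used for the solvers introduced here.

\begin{verbatim}
# Hypothetical usage sketch; assumes a recent ott-jax layout in which
# LRSinkhorn accepts a LinearProblem with relaxed marginals (tau_a, tau_b < 1).
import jax
from ott.geometry import pointcloud
from ott.problems.linear import linear_problem
from ott.solvers.linear import sinkhorn_lr

# Two small point clouds standing in for source / target samples.
key_x, key_y = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(key_x, (128, 2))
y = jax.random.normal(key_y, (256, 2)) + 1.0

geom = pointcloud.PointCloud(x, y)

# tau_a, tau_b < 1 penalize, rather than enforce, marginal agreement
# (the unbalanced relaxation discussed above).
prob = linear_problem.LinearProblem(geom, tau_a=0.9, tau_b=0.9)

# Low-rank solver: couplings are restricted to rank r,
# yielding linear-time iterations in the number of samples.
solver = sinkhorn_lr.LRSinkhorn(rank=8)
out = solver(prob)

# The coupling is kept in factored form (out.q, out.r, out.g),
# never materialized as a dense n x m matrix unless requested.
print(out.q.shape, out.r.shape, out.g.shape)
\end{verbatim}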