Osman Asif Malik

Research Scientist
Encube Technologies

I work as a research scientist at Encube Technologies. Prior to this, I was an Alvarez Postdoctoral Fellow at Lawrence Berkeley National Laboratory, where I was a member of the Scalable Solvers Group. I received my PhD in Applied Mathematics from the University of Colorado Boulder, where I was advised by Stephen Becker. During my PhD, I had the opportunity to do internships at IBM Research in Yorktown Heights, NY, and at Fujitsu Research of America (formerly known as Fujitsu Laboratories of America) in Sunnyvale, CA.

To get in touch: Please send an email to my old university email address (see above) or reach out on LinkedIn.

My research interests include:

  • Machine learning
  • Randomized algorithms
  • Numerical linear algebra
  • Tensor decomposition
  • Optimization
  • Quantum computing

Publications

Preprint Papers
  • Y. Yaniv, O. A. Malik, P. Ghysels, X. S. Li. Construction of hierarchically semi-separable matrix representation using adaptive Johnson-Lindenstrauss sketching. arXiv:2302.01977, 2023.
  • O. A. Malik, V. Bharadwaj, R. Murray. Sampling-based decomposition algorithms for arbitrary tensor networks. arXiv:2210.03828, 2022.
  • O. A. Malik, Y. Xu, N. Cheng, S. Becker, A. Doostan, A. Narayan. Fast algorithms for monotone lower subsets of Kronecker least squares problems. arXiv:2209.05662, 2022.
Journal and Conference Papers
  • V. Bharadwaj, B. T. Rakhshan, O. A. Malik, G. Rabusseau. Efficient leverage score sampling for tensor train decomposition. To appear in Advances in Neural Information Processing Systems (NeurIPS), 2024. arXiv:2406.02749.
  • V. Bharadwaj, O. A. Malik, R. Murray, A. Buluç, J. Demmel. Distributed-memory randomized algorithms for sparse tensor CP decomposition. ACM Symposium on Parallelism in Algorithms and Architectures (SPAA), pp. 155-168, 2024.
  • N. Cheng, O. A. Malik, Y. Xu, S. Becker, A. Doostan, A. Narayan. Subsampling of parametric models with bifidelity boosting. SIAM/ASA Journal on Uncertainty Quantification 12, issue 2, pp. 213-241, 2024.
  • N. Cheng, O. A. Malik, S. De, S. Becker, A. Doostan. Bi-fidelity variational auto-encoder for uncertainty quantification. Computer Methods in Applied Mechanics and Engineering 421, 2024.
  • V. Bharadwaj, O. A. Malik, R. Murray, L. Grigori, A. Buluç, J. Demmel. Fast exact leverage score sampling from Khatri-Rao products with applications to tensor decomposition. Advances in Neural Information Processing Systems (NeurIPS), 2023.
  • R. Border, O. A. Malik. rBahadur: efficient simulation of structured high-dimensional genotype data with applications to assortative mating. BMC Bioinformatics 24, 314, 2023.
  • O. A. Malik. More efficient sampling for tensor decomposition with worst-case guarantees. International Conference on Machine Learning (ICML), PMLR 162, pp. 14887-14917, 2022.
  • O. A. Malik, H. Ushijima-Mwesigwa, A. Roy, A. Mandal, I. Ghosh. Binary matrix factorization on special purpose hardware. PLOS ONE 16(12): e0261250, 2021.
  • O. A. Malik, S. Becker. A sampling-based method for tensor ring decomposition. International Conference on Machine Learning (ICML), PMLR 139, pp. 7400-7411, 2021.
  • O. A. Malik, S. Ubaru, L. Horesh, M. E. Kilmer, H. Avron. Dynamic graph convolutional networks using the tensor M-product. SIAM International Conference on Data Mining (SDM), pp. 729-737, 2021.
  • O. A. Malik, S. Becker. Randomization of approximate bilinear computation for matrix multiplication. International Journal of Computer Mathematics: Computer Systems Theory 6, issue 1, pp. 54-93, 2021.
  • O. A. Malik, S. Becker. Fast randomized matrix and tensor interpolative decomposition using CountSketch. Advances in Computational Mathematics 46, article number 76, 2020.
  • O. A. Malik, S. Becker. Guarantees for the Kronecker fast Johnson–Lindenstrauss transform using a coherence and sampling argument. Linear Algebra and its Applications 602, pp. 120–137, 2020.
  • O. A. Malik, S. Becker. Low-rank Tucker decomposition of large tensors using TensorSketch. Advances in Neural Information Processing Systems (NeurIPS), pp. 10096-10106, 2018.
Workshop Papers
  • O. A. Malik, V. V. Narumanchi, S. Becker, T. W. Murray. Superresolution photoacoustic tomography using random speckle illumination and second order moments. IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA), pp. 141-145, 2021.
  • O. A. Malik, S. Ubaru, L. Horesh, M. E. Kilmer, H. Avron. Tensor graph neural networks for learning on time varying graphs. NeurIPS Workshop on Graph Representation Learning, 2019.
Technical Reports
  • R. Murray, J. Demmel, M. W. Mahoney, N. B. Erichson, M. Melnichenko, O. A. Malik, L. Grigori, P. Luszczek, M. Derezinski, M. E. Lopes, T. Liang, H. Luo, J. Dongarra. Randomized numerical linear algebra: A perspective on the field with an eye to software. Technical Report No. UCB/EECS-2023-19, EECS Department, University of California, Berkeley, 2023.
Patents
  • O. A. Malik, H. Ushijima, A. Mandal, I. Ghosh, A. Roy. Data clustering. US Patent Number: 11,537,637. Date of Patent: 27 December 2022.
  • L. Horesh, O. A. Malik, S. Ubaru, M. E. Kilmer, H. Avron. Tensor-based predictions from analysis of time-varying graphs. US Patent Number: 11,386,507. Date of Patent: 12 July 2022.