Tensor Sketch

Abstract

Sketching is a family of dimensionality reduction approaches that succinctly approximate the original data using random projections or sampling. In contrast to general random subsampling, sketching matrices are drawn from structured families of random maps, which reduces computational and storage complexity with provable theoretical guarantees. Sketching has found successful applications in regression, low-rank approximation, network compression, and beyond. In this chapter, we introduce the basic ideas, methods, and applications of sketching. We start with count sketch, but mainly focus on its extensions to higher-order tensors, including tensor sketch and higher-order count sketch. Finally, the effectiveness of tensor sketching methods is verified in several multidimensional data processing tasks, such as tensor decomposition, Kronecker product regression, and tensorized neural network approximation.
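
The chapter builds these methods up from the count sketch primitive. As a concrete illustration (a minimal NumPy sketch, not code from the chapter), the following implements count sketch and the FFT-based tensor sketch of a rank-1 matrix in the style of Pham and Pagh [30]; all names (count_sketch, h1, s1, etc.) and the fully random hash maps are illustrative assumptions (limited independence suffices in theory).

```python
import numpy as np

rng = np.random.default_rng(0)

def count_sketch(x, h, s, m):
    """Count sketch of x: slot h[i] accumulates s[i] * x[i]."""
    y = np.zeros(m)
    np.add.at(y, h, s * x)       # scatter-add, handles hash collisions
    return y

n, m = 64, 16                    # original dimension, sketch dimension
x, y = rng.standard_normal(n), rng.standard_normal(n)

# Independent hash and sign maps for each mode.
h1, h2 = rng.integers(0, m, n), rng.integers(0, m, n)
s1, s2 = rng.choice([-1.0, 1.0], n), rng.choice([-1.0, 1.0], n)

# Tensor sketch of the rank-1 matrix x y^T via circular convolution:
# TS(x ∘ y) = IFFT( FFT(CS_1(x)) * FFT(CS_2(y)) ).
ts = np.fft.ifft(np.fft.fft(count_sketch(x, h1, s1, m))
                 * np.fft.fft(count_sketch(y, h2, s2, m))).real

# The same sketch computed directly: count sketch of vec(x y^T) with the
# combined hash H(i,j) = (h1[i] + h2[j]) mod m and sign S(i,j) = s1[i]*s2[j].
H = (h1[:, None] + h2[None, :]) % m
S = s1[:, None] * s2[None, :]
direct = np.zeros(m)
np.add.at(direct, H.ravel(), (S * np.outer(x, y)).ravel())

assert np.allclose(ts, direct)   # FFT shortcut matches the definition
```

The FFT route costs O(n + m log m) per mode instead of the O(n^2) needed to form and sketch the outer product explicitly, which is what makes sketching of outer and Kronecker products practical in the decomposition and regression tasks listed above.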

Notes

  1. In this chapter, [N] denotes the set \(\left \{0, 1, \cdots , N-1\right \}\) for any \(N\in \mathbb {Z}^+\).

  2. For simplicity, we focus on symmetric tensors here, i.e., \(\mathcal {T}_{i, j, k}\) remains the same under all permutations of i, j, k. An extension to the asymmetric case can be found in [4].

  3. https://github.com/andrewssobral/lrslibrary.

  4. https://github.com/andrewssobral/lrslibrary.

  5. https://github.com/xwcao/LowRankTRN.

References

  1. Ahle, T.D., Kapralov, M., Knudsen, J.B., Pagh, R., Velingker, A., Woodruff, D.P., Zandieh, A.: Oblivious sketching of high-degree polynomial kernels. In: Proceedings of the Fourteenth Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 141–160. SIAM, Philadelphia (2020)

  2. Ailon, N., Chazelle, B.: Approximate nearest neighbors and the fast Johnson–Lindenstrauss transform. In: Proceedings of the Thirty-Eighth Annual ACM Symposium on Theory of Computing, pp. 557–563 (2006)

  3. Anandkumar, A., Ge, R., Hsu, D., Kakade, S., Telgarsky, M.: Tensor decompositions for learning latent variable models. J. Mach. Learn. Res. 15, 2773–2832 (2014)

  4. Anandkumar, A., Ge, R., Janzamin, M.: Guaranteed non-orthogonal tensor decomposition via alternating rank-1 updates (2014, preprint). arXiv:1402.5180

  5. Avron, H., Nguyen, H., Woodruff, D.: Subspace embeddings for the polynomial kernel. In: Ghahramani, Z., Welling, M., Cortes, C., Lawrence, N., Weinberger, K.Q. (eds.) Advances in Neural Information Processing Systems, vol. 27. Curran Associates, Red Hook (2014). https://proceedings.neurips.cc/paper/2014/file/b571ecea16a9824023ee1af16897a582-Paper.pdf

  6. Avron, H., Nguyễn, H.L., Woodruff, D.P.: Subspace embeddings for the polynomial kernel. In: Proceedings of the 27th International Conference on Neural Information Processing Systems - Volume 2, NIPS'14, pp. 2258–2266. MIT Press, Cambridge (2014)

  7. Bhojanapalli, S., Sanghavi, S.: A new sampling technique for tensors (2015, preprint). arXiv:1502.05023

  8. Bingham, E., Mannila, H.: Random projection in dimensionality reduction: applications to image and text data. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 245–250 (2001)

  9. Bringmann, K., Panagiotou, K.: Efficient sampling methods for discrete distributions. In: International Colloquium on Automata, Languages, and Programming, pp. 133–144. Springer, Berlin (2012)

  10. Charikar, M., Chen, K., Farach-Colton, M.: Finding frequent items in data streams. Autom. Lang. Program. 2380, 693–703 (2002)

  11. Clarkson, K.L., Woodruff, D.P.: Low-rank approximation and regression in input sparsity time. J. ACM 63(6), 1–45 (2017)

  12. Demaine, E.D., López-Ortiz, A., Munro, J.I.: Frequency estimation of internet packet streams with limited space. In: European Symposium on Algorithms, pp. 348–360. Springer, Berlin (2002)

  13. Diao, H., Jayaram, R., Song, Z., Sun, W., Woodruff, D.: Optimal sketching for Kronecker product regression and low-rank approximation. In: Advances in Neural Information Processing Systems, pp. 4737–4748 (2019)

  14. Diao, H., Song, Z., Sun, W., Woodruff, D.P.: Sketching for Kronecker product regression and P-splines. In: Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS 2018) (2018)

  15. Donoho, D.L.: Compressed sensing. IEEE Trans. Inf. Theory 52(4), 1289–1306 (2006)

  16. Eldar, Y.C., Kutyniok, G.: Compressed Sensing: Theory and Applications. Cambridge University Press, Cambridge (2012)

  17. Gama, F., Marques, A.G., Mateos, G., Ribeiro, A.: Rethinking sketching as sampling: a graph signal processing approach. Signal Process. 169, 107404 (2020)

  18. Han, I., Avron, H., Shin, J.: Polynomial tensor sketch for element-wise function of low-rank matrix. In: International Conference on Machine Learning, pp. 3984–3993. PMLR, Westminster (2020)

  19. Hsu, C.Y., Indyk, P., Katabi, D., Vakilian, A.: Learning-based frequency estimation algorithms. In: International Conference on Learning Representations (2019)

  20. Indyk, P., Vakilian, A., Yuan, Y.: Learning-based low-rank approximations. In: Advances in Neural Information Processing Systems, vol. 32. Curran Associates, Red Hook (2019). https://proceedings.neurips.cc/paper/2019/file/1625abb8e458a79765c62009235e9d5b-Paper.pdf

  21. Kasiviswanathan, S.P., Narodytska, N., Jin, H.: Network approximation using tensor sketching. In: Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, IJCAI-18, pp. 2319–2325. International Joint Conferences on Artificial Intelligence Organization (2018). https://doi.org/10.24963/ijcai.2018/321

  22. Kossaifi, J., Lipton, Z.C., Kolbeinsson, A., Khanna, A., Furlanello, T., Anandkumar, A.: Tensor regression networks. J. Mach. Learn. Res. 21, 1–21 (2020)

  23. LeCun, Y., Bottou, L., Bengio, Y., Haffner, P.: Gradient-based learning applied to document recognition. Proc. IEEE 86(11), 2278–2324 (1998)

  24. Liu, S., Liu, T., Vakilian, A., Wan, Y., Woodruff, D.P.: Extending and improving learned CountSketch (2020, preprint). arXiv:2007.09890

  25. Ma, J., Zhang, Q., Ho, J.C., Xiong, L.: Spatio-temporal tensor sketching via adaptive sampling (2020, preprint). arXiv:2006.11943

  26. Malik, O.A., Becker, S.: Low-rank Tucker decomposition of large tensors using TensorSketch. Adv. Neural Inf. Process. Syst. 31, 10096–10106 (2018)

  27. Malik, O.A., Becker, S.: Guarantees for the Kronecker fast Johnson–Lindenstrauss transform using a coherence and sampling argument. Linear Algebra Appl. 602, 120–137 (2020)

  28. Nelson, J., Nguyễn, H.L.: OSNAP: faster numerical linear algebra algorithms via sparser subspace embeddings. In: 2013 IEEE 54th Annual Symposium on Foundations of Computer Science, pp. 117–126. IEEE, Piscataway (2013)

  29. Pagh, R.: Compressed matrix multiplication. ACM Trans. Comput. Theory 5(3), 1–17 (2013)

  30. Pham, N., Pagh, R.: Fast and scalable polynomial kernels via explicit feature maps. In: Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD'13, pp. 239–247. ACM, New York (2013)

  31. Kasiviswanathan, S.P., Narodytska, N., Jin, H.: Deep neural network approximation using tensor sketching (2017, preprint). arXiv:1710.07850

  32. Sarlos, T.: Improved approximation algorithms for large matrices via random projections. In: 2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06), pp. 143–152. IEEE, Piscataway (2006)

  33. Sarlós, T., Benczúr, A.A., Csalogány, K., Fogaras, D., Rácz, B.: To randomize or not to randomize: space optimal summaries for hyperlink analysis. In: Proceedings of the 15th International Conference on World Wide Web, pp. 297–306 (2006)

  34. Shi, Y.: Efficient tensor operations via compression and parallel computation. Ph.D. Thesis, UC Irvine (2019)

  35. Shi, Y., Anandkumar, A.: Higher-order count sketch: dimensionality reduction that retains efficient tensor operations. In: Data Compression Conference, DCC 2020, Snowbird, March 24–27, 2020, p. 394. IEEE, Piscataway (2020). https://doi.org/10.1109/DCC47342.2020.00045

  36. Sun, Y., Guo, Y., Luo, C., Tropp, J., Udell, M.: Low-rank Tucker approximation of a tensor from streaming data. SIAM J. Math. Data Sci. 2(4), 1123–1150 (2020)

  37. Vempala, S.S.: The Random Projection Method, vol. 65. American Mathematical Society, Providence (2005)

  38. Wang, Y., Tung, H.Y., Smola, A., Anandkumar, A.: Fast and guaranteed tensor decomposition via sketching. UC Irvine 1, 991–999 (2015). https://escholarship.org/uc/item/6zt3b0g3

  39. Wang, Y., Tung, H.Y., Smola, A.J., Anandkumar, A.: Fast and guaranteed tensor decomposition via sketching. In: Advances in Neural Information Processing Systems, pp. 991–999 (2015)

  40. Weinberger, K.Q., Saul, L.K.: Distance metric learning for large margin nearest neighbor classification. J. Mach. Learn. Res. 10, 207–244 (2009)

  41. Woodruff, D.P.: Sketching as a tool for numerical linear algebra. Found. Trends Theor. Comput. Sci. 10(1–2), 1–157 (2014)

  42. Xia, D., Yuan, M.: Effective tensor sketching via sparsification. IEEE Trans. Inf. Theory 67(2), 1356–1369 (2021)

  43. Xiao, H., Rasul, K., Vollgraf, R.: Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms (2017, preprint). arXiv:1708.07747

  44. Yang, B., Zamzam, A., Sidiropoulos, N.D.: ParaSketch: parallel tensor factorization via sketching. In: Proceedings of the 2018 SIAM International Conference on Data Mining, pp. 396–404. SIAM, Philadelphia (2018)

  45. Yang, B., Zamzam, A.S., Sidiropoulos, N.D.: Large scale tensor factorization via parallel sketches. IEEE Trans. Knowl. Data Eng. (2020). https://doi.org/10.1109/TKDE.2020.2982144

Copyright information

© 2022 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Liu, Y., Liu, J., Long, Z., Zhu, C. (2022). Tensor Sketch. In: Tensor Computation for Data Analysis. Springer, Cham. https://doi.org/10.1007/978-3-030-74386-4_13

  • DOI: https://doi.org/10.1007/978-3-030-74386-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-74385-7

  • Online ISBN: 978-3-030-74386-4

  • eBook Packages: Engineering, Engineering (R0)
