mrfox321 t1_j8esd6k wrote on February 13, 2023 at 7:54 PM Reply to [R] What are some papers that describe TikTok's algorithm? by Thin-Shirt6688 These are their features: https://www.cs.princeton.edu/courses/archive/spring21/cos598D/icde_2021_camera_ready.pdf The paper also references older neural network architectures used in late stages of the recsys stack. Permalink 7
mrfox321 t1_is7pudf wrote on October 13, 2022 at 10:16 PM Reply to [R] Wq can be omitted in single head attention by wangyi_fudan Sure, but using W_q allows for low-rank representations of W := W_k @ W_q^T Permalink 5
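The point above can be checked numerically: in single-head attention the score matrix only ever sees the product W_k @ W_q^T, so the two projections can be merged into one matrix W whose rank is at most the head dimension. A minimal sketch (dimensions and variable names are illustrative, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_head, n = 16, 4, 8  # hypothetical sizes: d_head < d_model

X = rng.normal(size=(n, d_model))        # token representations
W_q = rng.normal(size=(d_model, d_head)) # query projection
W_k = rng.normal(size=(d_model, d_head)) # key projection

# Factored form used in practice: K Q^T = (X W_k)(X W_q)^T
scores_factored = (X @ W_k) @ (X @ W_q).T

# Merged form: W := W_k @ W_q^T, so scores = X W X^T
W = W_k @ W_q.T
scores_merged = X @ W @ X.T

# Identical scores either way
assert np.allclose(scores_factored, scores_merged)

# The merged matrix is low-rank: rank(W) <= d_head
assert np.linalg.matrix_rank(W) <= d_head
```

This illustrates the trade-off in the reply: omitting W_q is mathematically possible (a single d_model x d_model matrix suffices), but the two-factor parameterization constrains W to rank d_head, which is what keeps the parameter count low.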