

All posts (430)
[Paper Review] Interventional Speech Noise Injection for ASR Generalizable Spoken Language Understanding
Recently, pre-trained language models (PLMs) have been increasingly adopted in spoken language understanding (SLU). However, automatic speech recognition (ASR) systems frequently produce inaccurate transcriptions, leading to noisy inputs for SLU models, wh.. (arxiv.org)
0. Abstract: 1. ASR errors are propagated to SL..
[Paper Review] Investigating Decoder-only Large Language Models for Speech-to-text Translation
Large language models (LLMs), known for their exceptional reasoning capabilities, generalizability, and fluency across diverse domains, present a promising avenue for enhancing speech-related tasks. In this paper, we focus on integrating decoder-only LLMs.. (arxiv.org)
0. Abstract: integrate decoder-only LLMs to the task of sp..
[Paper Review] Recent Advances in Speech Language Models: A Survey
Large Language Models (LLMs) have recently garnered significant attention, primarily for their capabilities in text-based interactions. However, natural human interaction often relies on speech, necessitating a shift towards voice-based models. A straightf.. (arxiv.org)
0. Abstract: natural human interaction often relies on speech, necessitating a shift..
[Paper Review] Feature Unlearning for Pre-trained GANs and VAEs
We tackle the problem of feature unlearning from a pre-trained image generative model: GANs and VAEs. Unlike a common unlearning task where an unlearning target is a subset of the training set, we aim to unlearn a specific feature, such as hairstyle, from f.. (arxiv.org)
0. Abstract: Feature unlearning simply means making a model exclude the production of s..
[Math] Linear Independence, Span, Basis and Rank
1. Linear Independence: Suppose we have vectors x1, ..., xk. If we can express one of them in terms of the others, the set is linearly dependent. More precisely, these vectors are linearly dependent if and only if (at least) one of them is a linear combination of the others. In particular, if one vector is a multiple of another vector, then the set is linearly dependent. Here is a practical way of checking whether vectors x1, ..., xk are linearly independen..
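The practical check the excerpt alludes to is usually a rank test: stack the vectors as columns and compare the matrix rank to the number of vectors. A minimal sketch with made-up example vectors (not taken from the post):

```python
import numpy as np

# Three example vectors; x3 = 2*x1 + x2, so this set is linearly dependent.
x1 = np.array([1.0, 0.0, 2.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = np.array([2.0, 1.0, 5.0])

# Stack the vectors as columns; they are linearly independent
# iff the rank of the resulting matrix equals the number of vectors.
A = np.column_stack([x1, x2, x3])
rank = int(np.linalg.matrix_rank(A))
independent = rank == A.shape[1]
```

Here the rank comes out as 2 for 3 vectors, confirming the dependence.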
[Paper Review] Zipformer: A faster and better encoder for automatic speech recognition Zipformer: A faster and better encoder for automatic speech recognitionThe Conformer has become the most popular encoder model for automatic speech recognition (ASR). It adds convolution modules to a transformer to learn both local and global dependencies. In this work we describe a faster, more memory-efficient, and better-parxiv.org0. AbstractThe Conformer has become the most popular encoder m..
[Math] Groups, Vector Spaces and Vector Subspaces
My personal opinion: in linear algebra, finding the solutions x of Ax = 0 is very important, and the set of such solutions x forms a vector subspace of the vector space R^n. Before defining the solution space, we will look into what groups, vector spaces, and vector subspaces are. Simply put, a group guarantees that the result of the operation still lives in the same group (has more speci..
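The claim that the solutions of Ax = 0 form a subspace can be checked numerically: sums and scalar multiples of solutions must again be solutions. A small sketch with an illustrative matrix (not from the post):

```python
import numpy as np

# Example matrix whose null space is span{(1, 1, 1)}.
A = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

u = np.array([1.0, 1.0, 1.0])  # a solution of Ax = 0
v = 2.0 * u                    # another solution of Ax = 0

# Subspace closure: the sum and any scalar multiple of solutions
# must themselves satisfy Ax = 0.
closed_under_addition = bool(np.allclose(A @ (u + v), 0.0))
closed_under_scaling = bool(np.allclose(A @ (3.5 * u), 0.0))
```

Both flags come out true, which is exactly the closure property a subspace requires (together with containing the zero vector).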
[Math] Finding Solutions of a System of Linear Equations
How do we get solutions x of a system of linear equations Ax = b? If we have a particular solution x_p such that Ax_p = b and a homogeneous solution x_h such that Ax_h = 0, then for any scalar t, A(x_p + t*x_h) = b, so x = x_p + t*x_h is also a solution. Thus, adding any multiple of a homogeneous solution to a particular solution still satisfies the original equation. Notice that neither the general nor the particular solut..
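The x = x_p + t*x_h decomposition above can be verified numerically. A minimal sketch, assuming a made-up underdetermined system (2 equations, 3 unknowns, full row rank, so a one-dimensional null space exists):

```python
import numpy as np

# Hypothetical system Ax = b with more unknowns than equations.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 2.0])

# Particular solution x_p: least squares is exact here since A has full row rank.
x_p, *_ = np.linalg.lstsq(A, b, rcond=None)

# Homogeneous solution x_h: the last right singular vector spans null(A).
_, _, Vt = np.linalg.svd(A)
x_h = Vt[-1]

# Every x_p + t*x_h satisfies Ax = b, for any choice of scalar t.
all_solve = all(np.allclose(A @ (x_p + t * x_h), b) for t in (0.0, 1.0, -2.5))
```

Sweeping several values of t confirms that the whole line x_p + t*x_h solves the system, matching the derivation in the excerpt.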
