Chapter x: Sparse Coding
Outline: introduction to sparse coding and sparse sensing; optimization algorithms - MP, BP, K-SVD; summary.

Better Basis
Are there better bases for compression or representation? Wavelets are one classical answer. How about data-dependent bases? How about learning them from the data?

Sparsity in the brain
The first stage of visual processing in the brain, area V1, performs edge detection (Olshausen & Field, Nature, 1996). (Figure: schematic simple cell vs. actual simple cell; the receptive fields are well described by "Gabor functions".)

Why sparsity?
For an underdetermined system b = A x with A full rank and N >> d, infinitely many solutions x reproduce b. Which do you want? Choose the x with the fewest nonzero components.

Why sparsity?
The more concise the representation, the better. In some domains there naturally exists a sparse latent vector that controls the data we observe (e.g., MRI, music). In some domains, samples from the same class share the sparsity property. And the domain (the dictionary) can be learned.
A k-sparse domain means that each signal b can be constructed as b = A x with a vector x that has at most k nonzero elements.

Sparse Sensing vs. Sparse Coding
Assume a k-sparse domain: b = A x with ||x||_0 <= k. Sparse coding works with b directly and looks for the code x. Sparse sensing first takes a small set of measurements y = Phi b and then recovers b from y. Note: the number of measurements p (the length of y) is based on the sparsity of the data (on k), not on the ambient dimension.

Sparse Sensing vs. Sparse Coding
Sparse sensing (compressed sensing): it costs much time or money to acquire b, so acquire y first and then recover b from it.
Sparse coding (sparse representation): we believe the sparsity property really exists in the data; otherwise a sparse representation means nothing. The code x is used as a feature of b, and x can be used to store b efficiently and to reconstruct it.

Solution
There is no algorithm other than exhaustive search to solve min_x ||x||_0 s.t. A x = b exactly. However, in some situations (e.g., a special form of A), the solution of the l1 minimization min_x ||x||_1 s.t. A x = b approaches that of the l0 minimization.
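To make the two setups concrete, here is a minimal NumPy sketch of the model just described: a k-sparse vector x, a signal b = A x, and compressed measurements y = Phi b. The sizes and the Gaussian choices for A and Phi are illustrative assumptions of this sketch, not something fixed by the slides.

import numpy as np

rng = np.random.default_rng(0)
d, N, k, p = 64, 256, 5, 32              # illustrative sizes: signal dim, #atoms, sparsity, #measurements

A = rng.standard_normal((d, N))          # overcomplete dictionary (assumed Gaussian here)
A /= np.linalg.norm(A, axis=0)           # normalized columns (atoms)

x = np.zeros(N)                          # k-sparse latent vector: few non-zeros ...
support = rng.choice(N, size=k, replace=False)   # ... in random locations ...
x[support] = rng.standard_normal(k)              # ... with random values

b = A @ x                                # sparse coding view: the signal we observe directly

Phi = rng.standard_normal((p, d))        # measurement matrix; p is tied to the sparsity k, not to d
y = Phi @ b                              # sparse sensing view: cheap, compressed measurements of b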
Sparse coding: the model
A fixed dictionary D of size N x K (with K > N) multiplied by a sparse, random vector α produces the signal: x = D α. Every column in D is a prototype signal (an atom). The vector α is generated randomly, with few non-zeros in random locations and with random values.

Sparse coding
Simple: every signal is built as a linear combination of a few atoms from the dictionary D.
Effective: recent works adopt this model and successfully deploy it in applications.
Empirically established: neurological studies show similarity between this model and early vision processes, Olshausen & Field (96).

Sparse coding: the problem
Given x, find the α that generated it in D. We need to solve an under-determined linear system of equations x = D α. Among all (infinitely many) possible solutions we want the sparsest one, and we measure sparsity with the L0 norm ||α||_0, the number of non-zero entries.

Measure of Sparsity
||α||_p^p = sum_i |α_i|^p. As p -> 0 this sum tends to a count of the non-zeros in the vector.
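As a quick numeric illustration of this limit, the short sketch below evaluates sum_i |α_i|^p for a small hand-picked vector and decreasing p; the sum approaches the number of non-zero entries. The vector itself is arbitrary, chosen only for the illustration.

import numpy as np

alpha = np.array([0.0, -1.3, 0.0, 0.02, 7.0, 0.0])
for p in (2.0, 1.0, 0.5, 0.1, 0.01):
    # the l_p "norm" to the p-th power; as p -> 0 each non-zero entry contributes ~1
    print(p, np.sum(np.abs(alpha) ** p))
print("non-zeros:", np.count_nonzero(alpha))   # the L0 count the limit converges to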
Three major questions
Finding the sparsest α is NP-hard: are there practical ways to compute an estimate α? ? Is α? = α ? And how do we know D ?

Sparse sensing in the same language
Suppose we observe y = H x + v, a degraded and noisy version of x (H is a degradation operator such as a blur, v is noise). How do we recover x? Since x = D α, how about finding the α that generated y? The same three questions arise: how can we compute α? ? Is α? = α ? What D should we use?

Sparse coding and sensing, put formally
Our dream for now: find the sparsest solution.
Sparse coding:  min_α ||α||_0  s.t.  D α = x
Sparse sensing: min_α ||α||_0  s.t.  ||H D α - y||_2 <= ε
Why should we necessarily get α? = α ? It might happen that eventually ||α?||_0 < ||α||_0. Are there reasonable ways to find α? ?
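To see why the l0 problem above is treated as intractable, the sketch below solves it by brute force: it tries every support of size 1, 2, ..., max_k and fits the coefficients by least squares. The function name and tolerance are illustrative; the combinatorial loop is only feasible for tiny dictionaries, which is exactly the point.

import numpy as np
from itertools import combinations

def l0_exhaustive(D, x, max_k, tol=1e-8):
    """Brute-force search for the sparsest alpha with D @ alpha ~= x.
    Tries every support of size 1..max_k; the cost grows combinatorially in K."""
    K = D.shape[1]
    for k in range(1, max_k + 1):
        for S in combinations(range(K), k):
            cols = list(S)
            coef, *_ = np.linalg.lstsq(D[:, cols], x, rcond=None)
            if np.linalg.norm(D[:, cols] @ coef - x) <= tol:
                alpha = np.zeros(K)
                alpha[cols] = coef
                return alpha            # sparsest representation found
    return None                         # no representation with at most max_k atoms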
Chapter x: Sparse Coding and Sensing - Algorithms
Outline: introduction to sparse coding and sparse sensing; algorithms - MP, BP, K-SVD; summary.

Sparse Algorithms: Matching Pursuit (MP)
MP is a greedy algorithm that finds one atom at a time. Step 1: find the single atom that best matches the signal. Next steps: given the previously found atoms, find the next one that best fits the residual. The Orthogonal MP (OMP) is an improved version that re-evaluates the coefficients of all selected atoms after each round. Mallat & Zhang (1993). A sketch of OMP follows.
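The sketch below is a minimal OMP implementation along the lines just described: pick the atom most correlated with the current residual, then re-fit every selected coefficient by least squares. It assumes D has normalized columns and is an illustration rather than a reference implementation.

import numpy as np

def omp(D, x, L):
    """Orthogonal Matching Pursuit: greedily select up to L atoms of D to represent x.
    Assumes the columns of D are normalized."""
    K = D.shape[1]
    residual = x.copy()
    support = []
    for _ in range(L):
        j = int(np.argmax(np.abs(D.T @ residual)))   # atom best matching the residual
        if j not in support:
            support.append(j)
        # re-evaluate all coefficients on the current support (the "orthogonal" step)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha = np.zeros(K)
    alpha[support] = coef
    return alpha

With the names of the earlier sketch, alpha_hat = omp(A, b, k) tries to recover the k-sparse code of b.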
Sparse Algorithms: Basis Pursuit (BP)
Instead of solving  min_α ||α||_0  s.t.  D α = x,  solve  min_α ||α||_1  s.t.  D α = x.
The newly defined problem is convex (a linear program), and very efficient solvers can be deployed:
- interior point methods, Chen, Donoho & Saunders (95);
- iterated shrinkage, Figueiredo & Nowak (03), Daubechies, Defrise & De Mol (04), Elad (05), Elad, Matalon & Zibulevsky (06).
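Interior-point solvers are too involved for a short example, but the iterated-shrinkage idea fits in a few lines. The sketch below is a plain ISTA iteration for the closely related penalized form min_α 0.5 ||D α - x||_2^2 + λ ||α||_1 rather than the equality-constrained BP problem; the step size, iteration count and λ are illustrative assumptions.

import numpy as np

def ista(D, x, lam, n_iter=500):
    """Iterated soft-thresholding for min_a 0.5*||D a - x||_2^2 + lam*||a||_1."""
    step = 1.0 / (np.linalg.norm(D, 2) ** 2)   # 1 / Lipschitz constant of the gradient
    alpha = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ alpha - x)           # gradient of the quadratic data term
        z = alpha - step * grad
        # shrinkage: soft-threshold every coefficient toward zero
        alpha = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    return alpha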
Sparse Algorithms: Approximation Quality
How effective are MP and BP? MP and BP are different in general, and it is hard to say which is better. Given a signal x with a representation x = D α, if ||α||_0 < (1 + 1/μ(D)) / 2, where μ(D) is the mutual coherence of the dictionary, then BP and MP are guaranteed to find it: Donoho & Elad (02), Gribonval & Nielsen (03), Tropp (03), Temlyakov (03). These results correspond to the worst case; average-performance results are available too and show much better bounds: Donoho (04), Candès et al. (04), Tanner et al. (05), Tropp et al. (06).
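The guarantee above is stated through the mutual coherence μ(D), the largest absolute inner product between two distinct normalized atoms. A small sketch, assuming D is given as a NumPy array:

import numpy as np

def mutual_coherence(D):
    """mu(D) = max over i != j of |d_i^T d_j|, with the columns of D normalized."""
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)          # matrix of absolute inner products between atoms
    np.fill_diagonal(G, 0.0)       # ignore the diagonal (i == j)
    return G.max()

# worst-case recovery condition for MP/BP:  ||alpha||_0 < (1 + 1/mutual_coherence(D)) / 2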
Sparse Algorithms: What about the dictionary?
The quest for the origin of signals: given P examples X = [x_1, ..., x_P] and a fixed dictionary size N x K, how would we find D?

The Objective Function
min over D, A of ||X - D A||_F^2  s.t.  ||α_j||_0 <= L for every column α_j of A.
The examples are linear combinations of atoms from D, and each example has a sparse representation with no more than L atoms (N, K and L are assumed known, and D has normalized columns).

K-SVD, Aharon, Elad & Bruckstein (04)
Initialize D, then iterate two stages:
1. Sparse coding: use MP or BP to compute the representations A for the current dictionary D.
2. Dictionary update: update D column by column via an SVD computation.
K-SVD: the sparse coding stage
For the jth example we solve  min_{α_j} ||x_j - D α_j||_2^2  s.t.  ||α_j||_0 <= L  - ordinary sparse coding.

K-SVD: the dictionary update stage
For the kth atom we solve  min_{d_k, a_k} ||E_k - d_k a_k^T||_F^2,  where a_k^T is the kth row of A and E_k = X - sum_{i != k} d_i a_i^T is the residual with atom k removed. Only some of the examples use column d_k; when updating a_k, recompute only the coefficients corresponding to those examples. The best rank-1 fit is solved with an SVD. The loop then returns to the sparse coding stage (use MP or BP, then the dictionary update), and the two stages are repeated. A sketch of one K-SVD sweep follows.
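Below is a compact sketch of one K-SVD sweep following the two stages above. The sparse coding stage is delegated to a pursuit routine such as the omp sketch earlier; the handling of unused atoms, the missing initialization and the absence of a stopping rule are simplifications of this sketch, not part of the published algorithm.

import numpy as np

def ksvd_iteration(D, X, L, sparse_code):
    """One K-SVD sweep.
    D: N x K dictionary with normalized columns, X: N x P training examples,
    L: max atoms per example, sparse_code: a pursuit routine like omp(D, x, L) above."""
    K = D.shape[1]
    P = X.shape[1]
    # Stage 1: sparse coding of every example with the current dictionary (MP/BP/OMP)
    A = np.column_stack([sparse_code(D, X[:, j], L) for j in range(P)])
    # Stage 2: update each atom d_k together with its row of coefficients a_k^T
    for k in range(K):
        users = np.nonzero(A[k, :])[0]     # only some of the examples use column d_k
        if users.size == 0:
            continue                       # simplification: leave unused atoms untouched
        # residual of those examples once atom k's own contribution is removed
        E_k = X[:, users] - D @ A[:, users] + np.outer(D[:, k], A[k, users])
        U, s, Vt = np.linalg.svd(E_k, full_matrices=False)
        D[:, k] = U[:, 0]                  # best rank-1 fit: the new (unit-norm) atom ...
        A[k, users] = s[0] * Vt[0, :]      # ... and the updated coefficients of its users
    return D, A

A full run would alternate ksvd_iteration with a convergence check; initializing D (for example, from randomly chosen, normalized training examples) is left outside the sketch.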