====== Transfer Learning Theory Reading Group ======
**Location**:

**Time**: Saturday 2pm (every other week)

In this bi-weekly reading group, we will read classic domain adaptation theory papers discussed in the following textbook:
[[https://

Through the readings, we hope to get a basic understanding of how and why domain adaptation algorithms fundamentally work.

===== Reading Schedule =====
==== Background ====

  * Ben-David, Shai, John Blitzer, Koby Crammer, Alex Kulesza, Fernando Pereira, and Jennifer Wortman Vaughan. "A theory of learning from different domains."
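The central result of this paper is a target-error bound; a sketch of it from memory (notation may differ slightly from ADAT's) is:

```latex
% Target risk bounded by source risk, the H-Delta-H divergence between the
% source/target marginals, and the risk of the ideal joint hypothesis
% \lambda = \min_{h' \in \mathcal{H}} [\epsilon_S(h') + \epsilon_T(h')].
\epsilon_T(h) \;\le\; \epsilon_S(h)
  \;+\; \tfrac{1}{2}\, d_{\mathcal{H}\Delta\mathcal{H}}(\mathcal{D}_S, \mathcal{D}_T)
  \;+\; \lambda
```

Intuitively, adaptation can succeed when the two marginals are close as measured by the hypothesis class, and some single hypothesis does well on both domains.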
=== Week 3: Discrepancy distance I ===

  * ADAT chapter 3.5.1-3.5.2
  * Mansour, Yishay, Mehryar Mohri, and Afshin Rostamizadeh.

=== Week 5: Discrepancy distance II ===

  * ADAT chapter 3.5.3 //a discrepancy distance based generalization bound for regression problems.//
  * Cortes, Corinna, and Mehryar Mohri.
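The quantity these two weeks revolve around is the discrepancy distance of Mansour, Mohri, and Rostamizadeh; a sketch of its definition (notation reconstructed from memory, so details may differ from the book's) is:

```latex
% Discrepancy distance between distributions P and Q, relative to a
% loss L and a hypothesis class H: the largest disagreement, over pairs
% of hypotheses, between their expected losses under P and under Q.
\mathrm{disc}_L(P, Q) \;=\; \max_{h, h' \in \mathcal{H}}
  \left|\, \mathbb{E}_{x \sim P}\!\big[L(h(x), h'(x))\big]
       \;-\; \mathbb{E}_{x \sim Q}\!\big[L(h(x), h'(x))\big] \,\right|
```

Unlike the HΔH-divergence, this definition works for general loss functions, which is what makes the regression bounds in chapter 3.5.3 possible.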
==== Impossibility theorems for domain adaptation ====

=== Week 7: Impossibility theorems ===

  * ADAT Chapter 4.1-4.2
  * Ben-David, Shai, Tyler Lu, Teresa Luu, and Dávid Pál.
=== Week 9: Hardness results ===

  * ADAT Chapter 4.3-4.4
  * Ben-David, Shai, and Ruth Urner. "On the hardness of domain adaptation and the utility of unlabeled target samples."

==== Integral probability generalization bound ====

=== Week 11: Wasserstein distance ===

  * ADAT Chapter 5.1-5.3
  * Redko, Ievgen, Amaury Habrard, and Marc Sebban.
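The Redko et al. line of work replaces the divergence term in the classic bound with a Wasserstein distance; very roughly (this is an informal sketch, omitting the Lipschitz-type constants and assumptions that the precise statement requires):

```latex
% Sketch: under suitable Lipschitz assumptions on the loss/hypotheses,
% the target risk is controlled by the source risk, the Wasserstein-1
% distance between the marginals, and the ideal joint risk \lambda.
\epsilon_T(h) \;\lesssim\; \epsilon_S(h) \;+\; W_1(\mu_S, \mu_T) \;+\; \lambda
```

The appeal is that $W_1$ can be estimated from finite samples and connects the theory to optimal-transport-based adaptation algorithms.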
===== Other candidate papers =====

  * Baxter, Jonathan. "A model of inductive bias learning."
  * ERM-based Multi-source Transfer Learning //(recent work by Xinyi on the sample complexity of multi-source transfer learning)//
meeting_2021spring/tlt.1614090187.txt.gz · Last modified: 2021/02/23 09:23 by yang