💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion of data unlearning in deep generative models. We will explore recent breakthroughs and challenges, focusing on how cutting-edge generative models handle unlearning tasks and where improvements can be made.

✅ This Week's Presentation:

🔹 Title: Data Unlearning in Diffusion Models


🔸 Presenter: Aryan Komaei

🌀 Abstract:
Diffusion models have been shown to memorize and reproduce training data, raising legal and ethical concerns regarding data privacy and copyright compliance. Retraining these models from scratch to remove specific data is computationally costly, while existing unlearning methods often rely on strong assumptions or exhibit instability. To address these limitations, we introduce a new family of loss functions called Subtracted Importance Sampled Scores (SISS). SISS leverages importance sampling to provide the first method for data unlearning in diffusion models with theoretical guarantees.
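To give a flavor of the idea before the talk: the abstract describes subtracting an importance-weighted forget-data term from a standard denoising objective. Below is a minimal, illustrative NumPy sketch under an assumed DDPM-style Gaussian forward process. It is not the authors' implementation; `eps_model`, the 50/50 noising mixture, and the subtraction weight `lam` are assumptions made for illustration.

```python
import numpy as np

def siss_sketch_loss(eps_model, x_keep, x_forget, alpha_bar, lam=0.5, rng=None):
    """Illustrative SISS-style loss sketch (NOT the paper's implementation).

    Noises a point drawn from a 50/50 mixture of keep/forget samples,
    reweights each denoising term by an importance ratio against the
    mixture, and subtracts the forget-data term so gradient descent
    removes its influence. eps_model maps a noised batch to predicted noise.
    """
    rng = rng or np.random.default_rng(0)
    s_ab, s_1mab = np.sqrt(alpha_bar), np.sqrt(1.0 - alpha_bar)

    # Draw the noised point x_t from the mixture of the two data sources.
    use_keep = rng.random((x_keep.shape[0], 1)) < 0.5
    x0_mix = np.where(use_keep, x_keep, x_forget)
    eps = rng.standard_normal(x0_mix.shape)
    x_t = s_ab * x0_mix + s_1mab * eps

    # log q(x_t | x0) under the Gaussian forward process, per sample.
    def log_q(x0):
        d = x_t - s_ab * x0
        return -(d ** 2).sum(axis=1) / (2.0 * (1.0 - alpha_bar))

    lq_k, lq_f = log_q(x_keep), log_q(x_forget)
    lq_mix = np.logaddexp(lq_k, lq_f) - np.log(2.0)
    # Importance weights q_keep/q_mix and q_forget/q_mix (each bounded by 2).
    w_k, w_f = np.exp(lq_k - lq_mix), np.exp(lq_f - lq_mix)

    # Noise targets consistent with each hypothesis for the clean sample x0.
    eps_k = (x_t - s_ab * x_keep) / s_1mab
    eps_f = (x_t - s_ab * x_forget) / s_1mab

    pred = eps_model(x_t)
    mse = lambda a, b: ((a - b) ** 2).sum(axis=1)
    # Subtracted objective: keep-data term minus lam * forget-data term.
    return float((w_k * mse(pred, eps_k) - lam * w_f * mse(pred, eps_f)).mean())

# Example usage on toy data with a linear stand-in for the noise predictor:
rng = np.random.default_rng(1)
W = 0.1 * rng.standard_normal((4, 4))
loss = siss_sketch_loss(lambda x: x @ W,
                        rng.standard_normal((8, 4)),
                        rng.standard_normal((8, 4)),
                        alpha_bar=0.7)
```

The subtraction is the unstable part in naive schemes; the importance weights are what let a single mixture sample estimate both terms, which is the property the abstract's theoretical guarantees concern.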

Session Details:
- 📅 Date: Tuesday
- 🕒 Time: 4:45 - 5:45 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️



tg-me.com/RIMLLab/196


BY RIML Lab



