💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion of data unlearning in cutting-edge deep generative models. We will explore recent breakthroughs and open challenges, focusing on how these models handle unlearning tasks and where improvements can be made.

✅ This Week's Presentation:

🔹 Title: Data Unlearning in Diffusion Models


🔸 Presenter: Aryan Komaei

🌀 Abstract:
Diffusion models have been shown to memorize and reproduce training data, raising legal and ethical concerns regarding data privacy and copyright compliance. Retraining these models from scratch to remove specific data is computationally costly, and existing unlearning methods often rely on strong assumptions or suffer from instability. To address these limitations, we introduce a new family of loss functions called Subtracted Importance Sampled Scores (SISS). SISS leverages importance sampling to provide the first method for data unlearning in diffusion models with theoretical guarantees.
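For readers new to the idea, the subtract-and-reweight recipe behind losses like SISS can be illustrated with a toy estimator. The sketch below is an assumption-laden illustration, not the paper's exact SISS loss: the noise schedule, mixture weight `lam`, and dummy model are all invented for the example. It noises a point drawn from a mixture of the keep and forget distributions, importance-weights the resulting denoising error, and subtracts the forget term so gradient descent unlearns it.

```python
import numpy as np

rng = np.random.default_rng(0)

def noised(x, t, eps):
    # DDPM-style forward process: x_t = sqrt(abar)*x + sqrt(1-abar)*eps
    abar = np.exp(-t)  # toy noise schedule (assumption, not from the paper)
    return np.sqrt(abar) * x + np.sqrt(1.0 - abar) * eps

def siss_sketch_loss(eps_model, x_keep, x_forget, t, lam=0.5):
    """Toy subtracted, importance-sampled denoising loss.

    Draw the noised sample x_t from a mixture of the keep/forget noising
    distributions, reweight each squared-error term by its importance
    ratio, and subtract the forget term so that minimizing the loss
    raises the error on the data to be unlearned.
    """
    eps = rng.standard_normal(x_keep.shape)
    if rng.random() < lam:  # noise the keep sample with probability lam
        x_t, w_keep, w_forget = noised(x_keep, t, eps), 1.0 / lam, 0.0
    else:                   # otherwise noise the forget sample
        x_t, w_keep, w_forget = noised(x_forget, t, eps), 0.0, 1.0 / (1.0 - lam)
    err = np.mean((eps_model(x_t, t) - eps) ** 2)
    # Descend on the keep term, ascend on the (subtracted) forget term
    return w_keep * err - w_forget * err
```

In the actual SISS estimator the weights come from the likelihood ratio of the two noising distributions at the sampled `x_t`; the fixed `1/lam` weights here are a simplification to keep the example short.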

Session Details:
- 📅 Date: Tuesday
- 🕒 Time: 4:45 - 5:45 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️





BY RIML Lab



