💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion on data unlearning in deep generative models. We will explore recent breakthroughs and open challenges, focusing on how cutting-edge generative models handle unlearning tasks and where improvements can be made.

✅ This Week's Presentation:

🔹 Title: Data Unlearning in Diffusion Models

🔸 Presenter: Aryan Komaei

🌀 Abstract:
Diffusion models have been shown to memorize and reproduce training data, raising legal and ethical concerns around data privacy and copyright compliance. Retraining these models from scratch to remove specific data is computationally costly, and existing unlearning methods often rely on strong assumptions or exhibit instability. To address these limitations, we introduce a new family of loss functions called Subtracted Importance Sampled Scores (SISS), which leverages importance sampling to provide the first data-unlearning method for diffusion models with theoretical guarantees.
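The abstract only names the idea, so here is a rough NumPy sketch of a subtracted, importance-weighted denoising loss in the spirit of SISS. This is an illustrative toy, not the paper's actual objective: the function names, the mixture weight `lam`, and the single-sample estimator are all simplifying assumptions.

```python
import numpy as np

def gaussian_logpdf(x, mean, var):
    """Log-density of an isotropic Gaussian N(mean, var * I), summed over dims."""
    return -0.5 * np.sum((x - mean) ** 2 / var + np.log(2 * np.pi * var), axis=-1)

def subtracted_is_loss(model_eps, x_keep, x_forget, alpha_bar, lam=0.5, rng=None):
    """Toy single-sample estimator of a subtracted, importance-weighted
    denoising loss (illustrative only, not the exact SISS objective)."""
    rng = np.random.default_rng(rng)
    sqrt_ab, var = np.sqrt(alpha_bar), 1.0 - alpha_bar
    eps = rng.standard_normal(x_keep.shape)
    # Sample the noisy point x_t from a mixture of the two forward kernels.
    base = x_keep if rng.random() < lam else x_forget
    x_t = sqrt_ab * base + np.sqrt(var) * eps
    # Importance weights: each forward-kernel density over the mixture density.
    log_pk = gaussian_logpdf(x_t, sqrt_ab * x_keep, var)
    log_pf = gaussian_logpdf(x_t, sqrt_ab * x_forget, var)
    log_mix = np.logaddexp(np.log(lam) + log_pk, np.log1p(-lam) + log_pf)
    w_keep, w_forget = np.exp(log_pk - log_mix), np.exp(log_pf - log_mix)
    # Noise targets implied by x_t for each data point.
    eps_keep = (x_t - sqrt_ab * x_keep) / np.sqrt(var)
    eps_forget = (x_t - sqrt_ab * x_forget) / np.sqrt(var)
    pred = model_eps(x_t)
    # Descend the keep-data loss while subtracting (ascending) the forget-data loss.
    return (w_keep * np.sum((pred - eps_keep) ** 2)
            - w_forget * np.sum((pred - eps_forget) ** 2))
```

Because x_t is drawn from a mixture over keep and forget data and then reweighted, both terms are estimated from the same sample, which is the intuition behind using importance sampling to keep the subtracted objective stable.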

Session Details:
- 📅 Date: Tuesday
- 🕒 Time: 4:45 - 5:45 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️



tg-me.com/RIMLLab/196


BY RIML Lab





