πŸ’  Compositional Learning Journal Club

Join us this week for an in-depth discussion on data unlearning in cutting-edge deep generative models. We will explore recent breakthroughs and open challenges, focusing on how these models handle unlearning tasks and where improvements can be made.

βœ… This Week's Presentation:

πŸ”Ή Title: Data Unlearning in Diffusion Models


πŸ”Έ Presenter: Aryan Komaei

πŸŒ€ Abstract:
Diffusion models have been shown to memorize and reproduce training data, raising legal and ethical concerns regarding data privacy and copyright compliance. Retraining these models from scratch to remove specific data is computationally costly, and existing unlearning methods often rely on strong assumptions or suffer from instability. To address these limitations, we introduce a new family of loss functions called Subtracted Importance Sampled Scores (SISS), which leverages importance sampling to provide the first data unlearning method for diffusion models with theoretical guarantees.
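
For context ahead of the session, below is a minimal sketch of the data unlearning setup in a DDPM-style model: a denoising loss on the data to keep minus a weighted denoising term on the data to forget. This naive subtracted objective only illustrates the problem setting the abstract describes; it is not the SISS loss itself, whose importance-sampled formulation is defined in the paper. All names here (TinyEpsModel, naive_unlearning_loss, lam) are hypothetical stand-ins.

import torch
import torch.nn as nn

class TinyEpsModel(nn.Module):
    """Toy noise-prediction network standing in for a real diffusion U-Net."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 128), nn.SiLU(), nn.Linear(128, dim))

    def forward(self, x, t):
        t_feat = t.float().view(-1, 1) / 1000.0  # crude timestep conditioning
        return self.net(torch.cat([x, t_feat], dim=1))

def noised(x0, t, alphas_cumprod):
    """DDPM forward process: x_t = sqrt(a_bar)*x_0 + sqrt(1 - a_bar)*eps."""
    a_bar = alphas_cumprod[t].view(-1, 1)
    eps = torch.randn_like(x0)
    return a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * eps, eps

def naive_unlearning_loss(model, x_keep, x_forget, alphas_cumprod, lam=0.5):
    """Denoising loss on kept data minus a weighted denoising loss on the data
    to forget. SISS replaces this kind of unstable subtraction with an
    importance-sampled objective; this version only sketches the setup."""
    T = alphas_cumprod.shape[0]
    t_k = torch.randint(0, T, (x_keep.shape[0],))
    t_f = torch.randint(0, T, (x_forget.shape[0],))
    xt_k, eps_k = noised(x_keep, t_k, alphas_cumprod)
    xt_f, eps_f = noised(x_forget, t_f, alphas_cumprod)
    loss_keep = ((model(xt_k, t_k) - eps_k) ** 2).mean()
    loss_forget = ((model(xt_f, t_f) - eps_f) ** 2).mean()
    return loss_keep - lam * loss_forget

# Example usage on random data:
T = 1000
alphas_cumprod = torch.cumprod(1 - torch.linspace(1e-4, 0.02, T), dim=0)
model = TinyEpsModel(dim=8)
loss = naive_unlearning_loss(model, torch.randn(32, 8), torch.randn(4, 8), alphas_cumprod)
loss.backward()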

Session Details:
- πŸ“… Date: Tuesday
- πŸ•’ Time: 4:45 - 5:45 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️




