💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion on data unlearning in cutting-edge deep generative models. We will explore recent breakthroughs and challenges, focusing on how these models handle unlearning tasks and where improvements can be made.

This Week's Presentation:

🔹 Title: Data Unlearning in Diffusion Models


🔸 Presenter: Aryan Komaei

🌀 Abstract:
Diffusion models have been shown to memorize and reproduce training data, raising legal and ethical concerns regarding data privacy and copyright compliance. Retraining these models from scratch to remove specific data is computationally costly, and existing unlearning methods often rely on strong assumptions or exhibit instability. To address these limitations, we introduce a new family of loss functions called Subtracted Importance Sampled Scores (SISS). SISS leverages importance sampling to provide the first method for data unlearning in diffusion models with theoretical guarantees.
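To give a feel for the idea ahead of the talk, here is a minimal, hypothetical sketch of a subtracted importance-sampled loss under a DDPM-style noising process. All names (`siss_loss`, `mix`, `lam`) and the exact weighting scheme are illustrative assumptions for discussion, not the paper's implementation: the noisy input is sampled from a mixture of the keep- and forget-data noised distributions, and each term is reweighted by an importance ratio before the forget term is subtracted.

```python
import torch

def gaussian_logpdf(x_t, mean, var):
    # log N(x_t; mean, var * I), summed over feature dims, one value per sample
    return (-0.5 * ((x_t - mean) ** 2 / var
                    + torch.log(2 * torch.pi * var))).flatten(1).sum(1)

def siss_loss(eps_model, x_keep, x_forget, abar_t, lam=0.5, mix=0.5):
    """Illustrative subtracted importance-sampled loss (assumed form).

    x_keep / x_forget: batches of clean samples to retain / unlearn.
    abar_t: cumulative noise schedule term, so x_t = sqrt(abar) x0 + sqrt(1-abar) eps.
    """
    var = 1.0 - abar_t
    eps = torch.randn_like(x_keep)

    # Draw x_t from a mixture of the two noised distributions.
    coin = (torch.rand(x_keep.shape[0]) < mix).float()
    coin = coin.view(-1, *([1] * (x_keep.dim() - 1)))
    x0_mix = coin * x_keep + (1 - coin) * x_forget
    x_t = abar_t.sqrt() * x0_mix + var.sqrt() * eps

    # Importance weights: target density over mixture density, per sample.
    mean_k = abar_t.sqrt() * x_keep
    mean_f = abar_t.sqrt() * x_forget
    log_qk = gaussian_logpdf(x_t, mean_k, var)
    log_qf = gaussian_logpdf(x_t, mean_f, var)
    log_mix = torch.logaddexp(torch.log(torch.tensor(mix)) + log_qk,
                              torch.log(torch.tensor(1.0 - mix)) + log_qf)
    w_k = (log_qk - log_mix).exp()
    w_f = (log_qf - log_mix).exp()

    # Noise targets implied by each clean batch for this shared x_t.
    eps_k = (x_t - mean_k) / var.sqrt()
    eps_f = (x_t - mean_f) / var.sqrt()
    pred = eps_model(x_t)
    err_k = ((pred - eps_k) ** 2).flatten(1).sum(1)
    err_f = ((pred - eps_f) ** 2).flatten(1).sum(1)

    # Keep term minus the subtracted forget term, each importance-weighted.
    return (w_k * err_k - lam * w_f * err_f).mean()
```

The subtraction pushes the model's predicted score away from the forget data while the importance weights let both terms share one noisy sample; how SISS actually constructs and analyzes this trade-off is exactly what the session will cover.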

Session Details:
- 📅 Date: Tuesday
- 🕒 Time: 4:45 - 5:45 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban

We look forward to your participation! ✌️



tg-me.com/RIMLLab/196

By RIML Lab





