Hello,

I am currently working on my thesis, which focuses on debunking fake news through humor. My objective is to fine-tune a transformer-based model for this purpose. I have a question:

If I first fine-tune the model to generate humor (using a prompt like "tell me a joke" with a joke as the expected response), and then fine-tune it again (using a prompt like "explain why this news is fake" with the expected explanation as the response), will the final model be able to respond effectively to a prompt like "explain why this is fake in a funny manner"?

Or should I fine-tune the model directly on the prompt "explain why this is fake in a funny way" with the expected humorous explanation as the response? (A rough sketch of the kind of training pairs I mean is below.)
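To make the two options concrete, this is roughly the kind of supervised data I have in mind; the field names and wording are just placeholders for illustration:

```python
# Option 2: fine-tune once on the combined prompt (placeholder examples).
combined_examples = [
    {
        "prompt": "Explain why this news is fake in a funny way: <news article text>",
        "response": "<humorous explanation of why the article is fake>",
    },
]

# Option 1: two separate fine-tuning stages, one dataset per skill.
humor_examples = [
    {"prompt": "Tell me a joke", "response": "<joke>"},
]
debunking_examples = [
    {
        "prompt": "Explain why this news is fake: <news article text>",
        "response": "<factual explanation>",
    },
]
```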

Has anyone come across a "problem" like this, and if so, what do you think is the best approach?

Thank you for the help!

top 2 comments
[–] gunshoes@alien.top 1 points 10 months ago

It would more than likely lose the ability to "make funny." The problem is called "catastrophic forgetting," and it comes up when pretrained models are fine-tuned on downstream tasks. There is some literature showing that the original pretraining leaves a residual bias (e.g., English BERT models fine-tuned on multilingual data retain English syntax patterns), but more often than not the model loses the ability to perform its original task.

[–] RegisteredJustToSay@alien.top 1 points 10 months ago

There are some good recent papers on how to tackle this. My favourite on the topic is probably this one: Robust fine-tuning of zero-shot models - arXiv https://arxiv.org/pdf/2109.01903

But tl;dr: taking a weighted average of the fine-tuned weights and the original weights, with a manually chosen mixing coefficient, tends to greatly mitigate this problem. I was surprised the paper didn't get more attention when it came out, but oh well.
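A minimal sketch of that weight-interpolation idea in PyTorch, assuming two architecturally identical models (the original and the fine-tuned one) and a hand-picked alpha; the function and variable names are just placeholders:

```python
import copy
import torch

def interpolate_weights(base_model, finetuned_model, alpha=0.5):
    """Return a model whose weights are alpha * finetuned + (1 - alpha) * base."""
    merged = copy.deepcopy(base_model)
    base_sd = base_model.state_dict()
    ft_sd = finetuned_model.state_dict()
    merged_sd = {}
    for name, base_param in base_sd.items():
        ft_param = ft_sd[name]
        if torch.is_floating_point(base_param):
            # Linear interpolation in weight space.
            merged_sd[name] = alpha * ft_param + (1.0 - alpha) * base_param
        else:
            # Non-float buffers (e.g. integer position ids) are copied from the base model.
            merged_sd[name] = base_param
    merged.load_state_dict(merged_sd)
    return merged

# Usage (hypothetical): sweep alpha on a validation set and pick the value
# that best balances the new task against the original capabilities.
# merged = interpolate_weights(base_model, finetuned_model, alpha=0.3)
```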