LLM Hacking: Quantifying the Hidden Risks of Using LLMs for Text Annotation